
In Less Than a Year, So Much New: Launching Version 12.1 of Wolfram Language & Mathematica


We’re pleased that, despite the coronavirus pandemic and its impact on so many people and businesses, we’re still able to launch today as planned… (Thanks to our dedicated team and the fact that remote working has been part of our company for decades…)

The Biggest .1 Release Ever

It’s always an interesting time. We’re getting ready to wrap up a .1 version—to release the latest fruits of our research and development efforts. “Is it going to be a big release?”, I wonder. Of course, I know we’ve done a lot of work since we released Version 12.0 last April. All those design reviews (many livestreamed). All those new things we’ve built and figured out.

But then we start actually making the list for the new version. And—OMG—it goes on and on. Different teams are delivering on this or that project that started X years ago. A new function is being added for this. There’s some new innovation about that. Etc.

We started this journey a third of a century ago when we began the development of Version 1.0. And after all these years, it’s amazing how the energy of each new release seems to be ever greater.

And as we went on making the list for Version 12.1 we wondered, “Will it actually be our biggest .1 release ever?”. We finally got the answer: “Yes! And by a lot”.

Counting functions isn’t always the best measure, but it’s an indication. And in Version 12.1 there are a total of 182 completely new functions—as well as updates and enhancements to many hundreds more.


Look at It Now: HiDPI

Back in 1988, when we released Version 1.0, a typical computer display was maybe 640 pixels across (oh, and it was a CRT). And I was recently using some notebooks of mine from the 1990s (yes, they still work, which is spectacular!), and I was amazed at what a small window size they were made for. But as I write this today, I’m looking at two 3000-pixel displays. And 4K displays aren’t uncommon. So one of the things we’ve done for Version 12.1 is to add systemwide support for the new world of very-high-resolution displays.

One might think that would be easy, and would just “come with the operating system”. But actually it’s taken two years of hard work to deliver full HiDPI support. Well over a thousand carefully designed icons and other assets went from being bitmaps to being work-at-any-size algorithmic graphics. Everything about rasterization (not just for Rasterize, but for 3D graphics textures, etc. etc.) had to be redone. Sizes of things—and their interactions with the tower of kludges that operating systems have introduced over the years—had to be respecified and rethought.

But now it’s done. And we’re ready for displays of any resolution:

HiDPI icons

By the way, talking of displays, another “infrastructure” enhancement in Version 12.1 is moving to Metal and Direct3D 11 for 3D graphics rendering on macOS and Windows. Right now these just make 3D graphics modestly faster. But they also lay the groundwork for full multithreaded rendering, as well as VR, AR and more.

The Beginning of Video Computation

We’ve been working towards it for nearly 15 years… but finally it’s here: computation with video! We introduced images into the language in 2008; audio in 2016. But now in Version 12.1 we for the first time have computation with video. There’ll be lots more coming in future releases, but there’s already quite a bit in 12.1.

So… just like Image and Audio, which symbolically represent images and audio, we now have Video.

This asks for five frames from a video:

VideoFrameList[
 Video["ExampleData/Caminandes.mp4", Appearance -> Automatic, 
  AudioOutputDevice -> Automatic, SoundVolume -> Automatic], 5]

This asks to make a time series of the mean color of every frame:

VideoTimeSeries[Mean, 
 Video["ExampleData/Caminandes.mp4", Appearance -> Automatic, 
  AudioOutputDevice -> Automatic, SoundVolume -> Automatic]]

And then one can just plot the time series:

DateListPlot[%, 
 PlotStyle -> {RGBColor[1, 0, 0], RGBColor[0, 1, 0], RGBColor[
   0, 0, 1]}]

Video is a complicated area, with lots of different encodings optimized for different purposes. In Version 12.1 we’re supporting more than 250 of them, for import, export and transcoding. You can refer to a video on the web as well:

Video["http://exampledata.wolfram.com/cars.avi"]

And the big thing is that video is now getting integrated into everything. So, for example, you can immediately use image processing or audio processing or machine learning functions on video. Here’s an example plotting the location of cars in the video above:

v = Video["http://exampledata.wolfram.com/cars.avi"]
ts = VideoTimeSeries[Point[ImagePosition[#, Entity["Word", "car"]]] &,
   v]
HighlightImage[
 VideoExtractFrames[v, Quantity[5, "Seconds"]], {PointSize[Medium], 
  Values[ts]}]

Let’s say you’ve got a Manipulate, or an animation (say from ListAnimate). Well, now you can just immediately make a video of it:

Video[CloudGet["https://wolfr.am/L9r00rk5"]]

You can add an audio track, then export the whole thing directly to a file, the cloud, etc.
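
As a minimal sketch of the export part (assuming v is a Video object like the one above, and treating the file and cloud object names as purely illustrative):

(* export the video to a local MP4 file; the file name is illustrative *)
Export["summary.mp4", v]
(* deploy the same video to a named cloud object *)
CloudDeploy[v, "my-video-summary"]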

So is this new video capability really industrial strength? I’ve been recording hundreds of hours of video in connection with a new project I’m working on. So I decided to try our new capabilities on it. It’s spectacular! I could take a 4-hour video, and immediately extract a bunch of sample frames from it, and then—yes, in a few hours of CPU time—“summarize the whole video”, using SpeechRecognize to do speech-to-text on everything that was said and then generating a word cloud:

Working session
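
The underlying workflow is roughly like this sketch, shown here on a short built-in audio sample rather than a 4-hour recording (and with the details of my actual pipeline omitted):

(* speech-to-text on the audio, then a word cloud of the words, with stopwords removed *)
transcript = SpeechRecognize[ExampleData[{"Audio", "MaleVoice"}]];
WordCloud[DeleteStopwords[transcript]]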

Speaking of audio, there’s new stuff in Version 12.1 there too. We’ve redone the GUI for in-notebook Audio objects. And we’ve introduced SpeechInterpreter, which is the spoken analog of the Interpreter function, here taking an audio object and returning what airline name was said in it:

SpeechInterpreter["Airline"][CloudGet["https://wolfr.am/L9r410jA"]]

In Version 12.0 we introduced the important function TextCases for extracting from text hundreds of kinds of entities and “text content types” (which as of 12.1 now have their own documentation pages). In 12.1 we’re also introducing SpeechCases, which does the same kind of thing for audio speech.
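
As a rough sketch of SpeechCases (using the same built-in audio sample; what actually gets found depends on what is said, and the "Country" content type here is just an illustrative choice):

(* look for mentions of a particular content type in spoken audio *)
SpeechCases[ExampleData[{"Audio", "MaleVoice"}], "Country"]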

A Computer Science Story: DataStructure

One of our major long-term projects is the creation of a full compiler for the Wolfram Language, targeting native machine code. Version 12.0 was the first exposure of this project. In Version 12.1 there’s now a spinoff from the project—which is actually a very important project in its own right: the new DataStructure function.

We’ve curated many kinds of things in the past: chemicals, equations, movies, foods, import-export formats, units, APIs, etc. And in each case we’ve made the things seamlessly computable as part of the Wolfram Language. Well, now we’re adding another category: data structures.

Think about all those data structures that get mentioned in textbooks, papers, libraries, etc. Our goal is to have all of them seamlessly usable directly in the Wolfram Language, and accessible in compiled code, etc. Of course it’s huge that we already have a universal “data structure”: the symbolic expressions in the Wolfram Language. And internal to the Wolfram Language we’ve always used all sorts of data structures, optimized for different purposes, and automatically selected by our algorithms and meta-algorithms.

But now with DataStructure there’s something new. If you have a particular kind of data structure you want to use, you can just ask for it by name, and use it.

Here’s how you create a linked list data structure:

ds = CreateDataStructure["LinkedList"]

Append a million random integers to the linked list (it takes 380 ms on my machine):

Do[ds["Append", RandomInteger[]], 10^6]

Now there’s immediate visualization of the structure:

ds["Visualization"]

Here’s the numerical mean of all the values:

Mean[N[Normal[ds]]]

Like so much of what we do DataStructure is set up to span from small scale and pedagogical to large scale and full industrial strength. Teaching a course about data structures? Now you can immediately use the Wolfram Language, storing everything you do in notebooks, automatically visualizing your data structures, etc. Building large-scale production code? Now you can have complete control over optimizing the data structures you’re using.

How does DataStructure work? Well, it’s all written in the Wolfram Language, and compiled using the compiler (which itself is also written in the Wolfram Language).

In Version 12.1 we’ve got most of the basic data structures covered, with various kinds of lists, arrays, sets, stacks, queues, hash tables, trees, and more. And here’s an important point: each one is documented with the running time for its various operations (“O(n)”, “O(n log(n))”, etc.), and the code ensures that that’s correct.
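
For instance, here's a minimal sketch using a hash set (assuming the "HashSet" data structure with "Insert" and "MemberQ" operations, which one would expect to run in constant time):

(* create a hash set, insert some values, then test membership *)
hs = CreateDataStructure["HashSet"];
Do[hs["Insert", i^2], {i, 1000}];
hs["MemberQ", 49]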

It’s pretty neat to see classic algorithms written directly for DataStructure.

Create a binary tree data structure (and visualize it):

(ds = CreateDataStructure["BinaryTree", 
    3 -> {1 -> {0, Null}, Null}])["Visualization"]

Here’s a function for rebalancing the tree:

RightRotate[y_] :=
 Module[{x, tmp},
  x = y["Left"]; tmp = x["Right"]; x["SetRight", y]; 
  y["SetLeft", tmp]; x]

Now do it, and visualize the result:

RightRotate[ds]["Visualization"]

The Asymptotic Superfunction

You’ve got a symbolic math expression and you want to figure out its rough value. If it’s a number you just use N to get a numerical approximation. But how do you get a symbolic approximation?

Ever since Version 1.0—and, in the history of math, ever since the 1600s—there’s been the idea of power series: find an essentially polynomial-like approximation to a function, as Series does. But not every mathematical expression can be reasonably approximated that way. It’s difficult math, but it’s very useful if one can make it work. We started introducing “asymptotic approximation” functions for specific cases (like integrals) in Version 11.3, but now in 12.1 we’re introducing the asymptotic superfunction Asymptotic.
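
For comparison, here's the kind of power-series approximation that Series has always produced:

(* a classic power series, good near the expansion point x = 0 *)
Series[Exp[x] Sin[x], {x, 0, 5}]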

Consider this inverse Laplace transform:

InverseLaplaceTransform[1/(s Sqrt[s^3 + 1]), s, t]

There’s no exact symbolic solution for it. But there is an asymptotic approximation when t is close to 0:

Asymptotic[InverseLaplaceTransform[1/(s Sqrt[s^3 + 1]), s, t], t -> 0]

Sometimes it’s convenient to not even try to evaluate something exactly—but just to leave it inactive until you give it to Asymptotic:

Asymptotic[
 DSolveValue[Sin[x]^2 y''[x] + x  y[x] == 0, y[x], x], {x, 0, 5}]

Asymptotic deals with functions of continuous variables. In Version 12.1 there’s also DiscreteAsymptotic. Here we’re asking for the asymptotic behavior of the Prime function:

DiscreteAsymptotic[Prime[n], n -> Infinity]

Or the factorial:

DiscreteAsymptotic[n!, n -> Infinity]

We can ask for more terms if we want:

DiscreteAsymptotic[n!, n -> Infinity, SeriesTermGoal -> 5]

Sometimes even quite simple functions can lead to quite exotic asymptotic approximations:

DiscreteAsymptotic[BellB[n], n -> Infinity]

More Math, As Always

Math is big, and math is important. And for the Wolfram Language (which also means for Mathematica) we’re always pushing the frontiers of what’s computable in math.

One long-term story has to do with special functions. Back in Version 1.0 we already had 70 special functions. We covered univariate hypergeometric functions—adding the general pFq case in Version 3.0. Over the years we’ve gradually added a few other kinds of hypergeometric functions (as well as 250 other new kinds of special functions). Typical hypergeometric functions are solutions to differential equations with three regular singular points. But in Version 12.1 we’ve generalized that. And now we have Heun functions, that solve equations with four regular singular points. That might not sound like a big deal, but actually they’re quite a mathematical jungle—for example with 192 known special cases. And they’re very much in vogue now, because they show up in the mathematics of black holes, quantum mechanics and conformal field theory. And, yes, Heun functions have a lot of arguments:

Series[HeunG[a, q, \[Alpha], \[Beta], \[Gamma], \[Delta], z], {z, 0, 
  3}]

By the way, when we “support a special function” these days, there’s a lot we do. It’s not just a question of evaluating the function to arbitrary precision anywhere in the complex plane (though that’s often hard enough). We also need to be able to compute asymptotic approximations, simplifications, singularities, etc. And we have to make sure the function can get produced in the results of functions like Integrate, DSolve and Sum.

One of our consistent goals in dealing with superfunctions like DSolve is to make them “handbook complete”. To be sure that the algorithms we have—that are built to handle arbitrary cases—successfully cover as much as possible of the cases that appear anywhere in the literature, or in any handbook. Realistically, over the years, we’ve done very well on this. But in Version 12.1 we’ve made a new, big push, particularly for DSolve.

Here’s an example (oh, and, yes, it happens to need Heun functions):

DSolveValue[(d + c x + b x^2) y[x] + a x y'[x] + (-1 + x^2) y''[x] == 
  0, y[x], x]

There’s a famous book from the 1940s that’s usually just called Kamke, and that’s a huge collection of solutions to differential equations, some extremely exotic. Well, we’ll soon be able to do 100% of the (concrete) equations in this book (we’re still testing the last few…).

In Version 12.0 we introduced functions like ComplexPlot and ComplexPlot3D for plotting complex functions of complex variables. In Version 12.1 we now also have complex contour plotting. Here we’re getting two sets of contours—from the Abs and the Arg of a complex function:

ComplexContourPlot[
 AbsArg[(z^2 - I)/(z^3 + I)], {z, -3 - 3 I, 3 + 3 I}, Contours -> 30]

Also new in 12.1 is ComplexRegionPlot, which effectively solves equations and inequalities in the complex plane. Like here’s the (very much branch-cut-informed) solution to an equation whose analog would be trivial over the reals:

ComplexRegionPlot[Sqrt[z^(2 + 2 I)] == z^(1 + I), {z, 10}]

In a very different area of mathematics, another new function in Version 12.1 is CategoricalDistribution. We introduced the idea of symbolic representations of statistical distributions back in Version 6—with things like NormalDistribution and PoissonDistribution—and the idea has been extremely successful. But so far all our distributions have been distributions over numbers. In 12.1 we have our first distribution where the possible outcomes don’t need to be numbers.

Here’s a distribution where there are outcomes x, y, z with the specified probabilities:

dist = CategoricalDistribution[{x, y, z}, {.1, .2, .7}]

Given this distribution, one can do things like generate random variates:

RandomVariate[dist, 10]

Here’s a 3D categorical distribution:

dist = CategoricalDistribution[{{"A", "B", "C"}, {"D", "E"}, {"X", 
    "Y"}}, {{{2, 4}, {2, 1}}, {{2, 2}, {3, 2}}, {{4, 3}, {1, 3}}}]

Now we can work out the PDF of the distribution, asking in this case what the probability to get A, D, Y is:

PDF[dist, {"A", "D", "Y"}]

By the way, if you want to “see the distribution” you can either click the + on the summary box, or explicitly use Information:

Information[dist, "ProbabilityTable"]

There are lots of uses of CategoricalDistribution, for example in machine learning. Here we’re creating a classifier:

cf = Classify[{1, 2, 3, 4} -> {a, a, b, b}]

If we just give it input 2.3, the classifier will give its best guess for the corresponding output:

cf[2.3]

But in 12.1 we can also ask for the distribution—and the result is a CategoricalDistribution:

cf[2.3, "Distribution"]
Information[%, "ProbabilityTable"]

The Leading Edge of Optimization

In Version 12.0 we introduced industrial-scale convex optimization. We covered most of the usual problem classes (like linear, semidefinite, quadratic and conic). But there was one straggler: geometric optimization. And now we’re adding that for 12.1:

GeometricOptimization[\[Pi] r (r + Sqrt[h^2 + r^2]), {1 <=  \[Pi]/
    3 h r^2 }, {h, r}]
GeometricOptimization[
 1/(h w d), {h <= 2 w, d <= 2 w, h*w + h*d <= 50, 2 w*d <= 20}, {h, w,
   d}]

You can solve all sorts of practical problems with GeometricOptimization—with thousands of variables if need be. As one example, consider laying out rectangles of certain sizes with a certain partial ordering in x and y. To specify the problem, you give a bunch of inequalities:

With[{c1 = 0.25, c2 = 0.618}, 
  ineqs = {{c1 + w[1] + x[1] <= x[2], c1 + w[1] + x[1] <= x[3], 
     c1 + w[1] + x[1] <= x[4], c1 + w[1] + x[1] <= x[5], 
     c1 + w[1] + x[1] <= x[6], c1 + w[1] + x[1] <= x[7], 
     c1 + w[2] + x[2] <= x[3], c1 + w[4] + x[4] <= x[5], 
     c1 + w[2] + x[2] <= x[3], c1 + w[2] + x[2] <= x[5], 
     c1 + w[2] + x[2] <= x[7], c1 + w[4] + x[4] <= x[3], 
     c1 + w[4] + x[4] <= x[5], c1 + w[4] + x[4] <= x[7], 
     c1 + w[6] + x[6] <= x[5], c1 + w[8] + x[8] <= x[4], 
     c1 + w[9] + x[9] <= x[4], c1 + w[10] + x[10] <= x[4], 
     c1 + w[10] + x[10] <= x[6], c1 + w[6] + x[6] <= x[7], 
     c1 + w[8] + x[8] <= x[9], c1 + w[8] + x[8] <= x[10], x[1] >= 0, 
     x[8] >= 0, w[3] + x[3] <= \[ScriptW], w[5] + x[5] <= \[ScriptW], 
     w[7] + x[7] <= \[ScriptW]}, {c1 + h[1] + y[1] <= y[6], 
     c1 + h[1] + y[1] <= y[7], c1 + h[1] + y[1] <= y[8], 
     c1 + h[1] + y[1] <= y[9], c1 + h[1] + y[1] <= y[10], 
     c1 + h[2] + y[2] <= y[4], c1 + h[2] + y[2] <= y[9], 
     c1 + h[4] + y[4] <= y[6], c1 + h[3] + y[3] <= y[5], 
     c1 + h[5] + y[5] <= y[7], c1 + h[9] + y[9] <= y[6], 
     c1 + h[9] + y[9] <= y[10], y[1] >= 0, y[2] >= 0, y[3] >= 0, 
     h[6] + y[6] <= \[ScriptH], h[7] + y[7] <= \[ScriptH], 
     h[8] + y[8] <= \[ScriptH], 
     h[10] + y[10] <= \[ScriptH]}, {c2 <= h[1]/w[1] <= (1 + c2), 
     c2 <= h[2]/w[2] <= (1 + c2), c2 <= h[3]/w[3] <= (1 + c2), 
     c2 <= h[4]/w[4] <= (1 + c2), c2 <= h[5]/w[5] <= (1 + c2), 
     c2 <= h[6]/w[6] <= (1 + c2), c2 <= h[7]/w[7] <= (1 + c2), 
     c2 <= h[8]/w[8] <= (1 + c2), c2 <= h[9]/w[9] <= (1 + c2), 
     c2 <= h[10]/w[10] <= (1 + c2)}, {h[1] w[1] == 1, h[2] w[2] == 2, 
     h[3] w[3] == 3, h[4] w[4] == 4, h[5] w[5] == 5, h[6] w[6] == 6, 
     h[7] w[7] == 7, h[8] w[8] == 8, h[9] w[9] == 9, 
     h[10] w[10] == 10}}];

It then takes only about a second to generate an optimal solution:

GeometricOptimization
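
The call itself isn't reproduced in text form here, but it is presumably along these lines: minimize the area of the bounding rectangle subject to the inequalities defined above. (The exact objective and variable list in this sketch are my assumptions, not a verbatim reproduction.)

(* a sketch, assumed form: minimize bounding area subject to the constraints in ineqs *)
GeometricOptimization[\[ScriptW] \[ScriptH], Flatten[ineqs],
 Join[{\[ScriptW], \[ScriptH]},
  Flatten[Table[{w[i], h[i], x[i], y[i]}, {i, 10}]]]]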

In optimization, there are usually two broad types: continuous and discrete. Our convex optimization functions in 12.0 handled the case of continuous variables. But a major new feature—and innovation—in 12.1 is the addition of support for discrete (i.e. integer) variables, and for mixed discrete and continuous variables.

Here’s a very simple example:

QuadraticOptimization[
 2 x^2 + 20 y^2 + 6 x y + 5 x, -x + y >= 2, {x \[Element] Integers, 
  y \[Element] Reals}]

If x wasn’t constrained to be an integer, the result would be different:

QuadraticOptimization[
 2 x^2 + 20 y^2 + 6 x y + 5 x, -x + y >= 2, {x, y}]

But—as with our other optimization capabilities—this can be scaled up, though the combinatorial optimization that’s involved is fundamentally more computationally difficult (and for example it’s often NP-complete). And actually the only reason we can do large-scale problems of this kind at all is that we’ve implemented a novel iteration-based technique that successfully unlocks mixed convex optimization.

Cracking the Vector-Plotting Problem

I’ve been trying to make good vector plots for about 40 years, but in the past it just never worked. If the vectors got too short, you couldn’t see their direction—and if you made them longer they crashed into each other. But particularly after our success in Version 12.0 in cracking the ComplexPlot problem (which had also been languishing for a long time) we decided for Version 12.1 to try to solve the vector-plotting problem once and for all. And, I’m happy to say, we seem to have been able to do that.

So now, you can just ask VectorPlot (and all sorts of related functions) to make a vector plot, and you’ll automatically get something that’s a good representation of your vector field:

VectorPlot[{2 x^2 - y^2, 3 x y}, {x, -5, 5}, {y, -5, 5}]
VectorPlot[{2 x^2 - y^2, 3 x y}, {x, -5, 5}, {y, -5, 5}, 
 VectorPoints -> 30]

What’s the trick? It’s basically about placing vectors on a hexagonal grid so they’re packed better, and are visually more uniform. (You can also make other choices if you want to.) And then it’s about using appropriately scaled color to represent vector magnitudes.

There are all sorts of other challenges too. Like being able to draw vectors in a region:

VectorPlot[{2 x^2 - y^2, 3 x y}, {x, y} \[Element] Disk[{0, 0}, 3]]

And putting together our complex-number-plotting capabilities with our new vector plotting, we also in 12.1 have ComplexVectorPlot:

ComplexVectorPlot[z^ Log[z], {z, 6}, PlotLegends -> Automatic]

Cross-Hatching & All That

Before there were gray scales, there were things like cross-hatching. Look at a book from a century ago (or less), and you’ll see all sorts of diagrams elegantly drawn with things like cross-hatching. Well, now we can do that too.

Graphics[Style[RegularPolygon[5], HatchFilling[]]]
Plot[{Sin[x], Cos[x]}, {x, 0, 10}, Filling -> Axis, 
 FillingStyle -> HatchFilling[]]

Of course, everything is computable:

Graphics[Table[
  Style[Disk[RandomReal[10, 2]], 
   HatchFilling[RandomReal[{0, 2 \[Pi]}]]], 50]]

We also have an important generalization of cross-hatching: PatternFilling. Here are examples with named patterns:

Graphics[Style[Disk[], PatternFilling["DiamondBox"]]]
Graphics[Style[Disk[], PatternFilling[{"Checkerboard", Red, Black}]]]

You can use any image as a pattern too:

GeoGraphics[
 Style[Polygon[Entity["Country", "UnitedStates"]], 
  PatternFilling[CloudGet["https://wolfr.am/L9r9AL5O"], 270]]]

Version 12.1 also has what one can think of as 3D generalizations of these kinds of textures:

Graphics3D[Style[Icosahedron[], HatchShading[]]]

It looks pretty good even in black and white:

Graphics3D[Style[Icosahedron[], HatchShading[]], Lighting -> "Accent"]

There’s stipple shading too:

Plot3D[Exp[-(x^2 + y^2)], {x, -2, 2}, {y, -2, 2}, 
 PlotStyle -> {White, StippleShading[]}, Mesh -> None, 
 Lighting -> "Accent"]

The Beginning of Computational Topology

In the past few versions, we’ve introduced deeper and deeper coverage of computational geometry. In coming versions, we’re going to be covering more and more of computational topology too. Things like EulerCharacteristic and PolyhedronGenus were already in Version 12.0. In Version 12.1 we’re introducing several powerful functions for dealing with the topology of simplicial complexes, of the kind that are for example used in representing meshes.

This makes a connectivity graph for the dimension-0 components of a dodecahedron, i.e. its corners:

MeshConnectivityGraph[Dodecahedron[], 0]

Here’s the corresponding result for the connectivity of lines to lines in the dodecahedron:

MeshConnectivityGraph[Dodecahedron[], 1]

And here’s the connectivity of corners to faces:

MeshConnectivityGraph[Dodecahedron[], {0, 2}]

It’s a very general function. Here are the graphs for different dimensional cells of a Menger sponge:

Table[MeshConnectivityGraph[MengerMesh[2, 3], d], {d, 0, 3}]

Given a mesh, it’s often useful to do what amounts to a topological search. For example, here’s a random Voronoi mesh:

vm = VoronoiMesh[RandomReal[1, {200, 2}]]

Here are the 10 closest mesh cells to position {.5, .5} (the 2 before each index indicates that these are dimension-2 cells):

NearestMeshCells[vm, {.5, .5}, 10]

Now highlight these cells:

HighlightMesh[vm, %]

Tabular Data: Computing & Editing

Dataset has been a big success in the six years since it was first introduced in Version 10.0. Version 12.1 has the beginning of a major project to upgrade and extend the functionality of Dataset.

The first thing is something you might not notice, because now it “just works”. When you see a Dataset in a notebook, you’re just seeing its displayed form. But often there is lots of additional data that you’d get to by scrolling or drilling down. In Version 12.1 Dataset automatically stores that additional data directly in the notebook (at least up to a size specified by $NotebookInlineStorageLimit) so when you open the notebook again later, the Dataset is all there, and all ready to compute with.

In Version 12.1 a lot has been done with the formatting of Dataset. Something basic is that you can now say how many rows and columns to display by default:

Dataset[CloudGet["https://wolfr.am/L9o1Pb7V"], MaxItems -> {4, 3}]

In Version 12.1 there are many options that allow detailed programmatic control over the appearance of a displayed Dataset. Here’s a simple example:

Dataset[CloudGet["https://wolfr.am/L9o1Pb7V"],
 MaxItems -> 5,
 HeaderBackground -> LightGreen,
 Background -> {{{LightBlue, LightOrange}}},
 ItemDisplayFunction -> {"sex" -> (If[# === 
        "female", \[Venus], \[Mars]] &)}
 ]

A major new feature is “right-click” interactivity (which works on rows, columns and individual items):

Hide column

Sort

Dataset is a powerful construct for displaying and computing with tabular data of any depth. But sometimes you just want to enter or edit simple 2D tabular data—and the user interface requirements for this are rather different from those for Dataset. So in Version 12.1 we’re introducing a new experimental function called TableView, which is a user interface for entering—and viewing—purely two-dimensional tabular data:

TableView[{{5, 6}, {7, 3}}]

Like a typical spreadsheet, TableView has fixed-width columns that you can manually adjust. It can efficiently handle large-scale data (think millions of items). The items can (by default) be either numbers or strings.

When you’ve finished editing a TableView, you can just ask for Normal and you’ll get lists of data out. (You can also feed it directly into a Dataset.) Like in a typical spreadsheet, TableView lets you put data wherever you want; if there’s a “hole”, it’ll show up as Null in lists you generate.

TableView is actually a dynamic control. So, for example, with TableView[Dynamic[x]] you can edit a TableView, and have its payload automatically be the value of some variable x. (And, yes, all of this is done efficiently, with minimal updates being made to the expression representing the value of x.)
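
Here's a minimal sketch of both of those points (the variable name x is just for illustration):

(* edits made in the view are reflected in the value of x *)
x = {{1, 2}, {3, 4}};
TableView[Dynamic[x]]
(* pull plain lists back out of a TableView *)
Normal[TableView[{{5, 6}, {7, 3}}]]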

GANs, BERT, GPT-2, ONNX, …: The Latest in Machine Learning

Machine learning is all the rage these days. Of course, we were involved with it even a very long time ago. We introduced the first versions of our flagship highly automated machine-learning functions Classify and Predict back in 2014, and we introduced our first explicitly neural-net-based function—ImageIdentify—in early 2015.

And in the years since then we’ve built a very strong system for machine learning in general, and for neural nets in particular. Several things about it stand out. First, we’ve emphasized high automation—using machine learning to automate machine learning wherever possible, so that even non-experts can immediately make use of leading-edge capabilities. The second big thing is that we’ve been curating neural nets, just like we curate so many other things. So that means that we have pretrained classifiers and predictors and feature extractors that you can immediately and seamlessly use. And the other big thing is that our whole neural net system is symbolic—in the sense that neural nets are specified as computable, symbolic constructs that can be programmatically manipulated, visualized, etc.

In Version 12.1 we’ve continued our leading-edge development in machine learning. There are 25 new types of neural nets in our Wolfram Neural Net Repository, including ones like BERT and GPT-2. And the way things are set up, it’s immediate to use any of these nets. (Also, in Version 12.1 there’s support for the new ONNX neural net specification standard, which makes it easier to import the very latest neural nets that are being published in almost any format.)

This gets the symbolic representation of GPT-2 from our repository:

gpt2 = NetModel["GPT-2 Transformer Trained on WebText Data", 
  "Task" -> "LanguageModeling"]

If you want to see what’s inside, just click—and keep clicking to drill down into more and more details:

See what’s inside

Now you can immediately use GPT-2, for example progressively generating a random piece of text one token at a time:

Nest[StringJoin[#, 
   gpt2[#, "RandomSample"]] &, "Stephen Wolfram is", 20]

Hmmmm. I wonder what that was trained on….

By the way, people sometimes talk about machine learning and neural nets as being in opposition to traditional programming language code. And in a way, that’s correct. A neural net just learns from real-world examples or experience, whereas a traditional programming language is about giving a precise abstract specification of what in detail a computer should do. We’re in a wonderful position with the Wolfram Language, because what we have is something that already spans these worlds: we have a full-scale computational language that takes advantage of all those precise computation capabilities, yet can meaningfully represent and compute about things in the real world.

So it’s been very satisfying in the past few years to see how modern machine learning can be integrated into the Wolfram Language. We’ve been particularly interested in new superfunctions—like Predict, Classify, AnomalyDetection, LearnDistribution and SynthesizeMissingValues—that do “symbolically specified” operations, but do them using neural nets and modern machine learning.

In Version 12.1 we’re continuing in this direction, and moving towards superfunctions that use more elaborate neural net workflows, like GANs. In particular, Version 12.1 introduces the symbolic NetGANOperator, as well as the new option TrainingUpdateSchedule. And it turns out these are the only things we had to change to allow our general NetTrain function to work with GANs.

A typical GAN setup is quite complicated (and that’s why we’re working on devising superfunctions that conveniently deliver applications of GANs). But here’s an example of a GAN in action in Version 12.1:

NetGANModel

The Calculus of Annotations

How do you add metadata annotations to something you’re computing with? For Version 12.1 we’ve begun rolling out a general framework for making annotations—and then computing with and from them.

Let’s talk first about the example of graphs. You can have both annotations that you can immediately “see in the graph” (like vertex colors), and ones that you can’t (like edge weights).

Here’s an example where we’re explicitly constructing a graph with annotations:

Graph[{Annotation[1 -> 2, EdgeStyle -> Red], 
  Annotation[2 -> 1, EdgeStyle -> Blue]}]

Here we’re annotating the vertices:

Graph[{Annotation[x, VertexStyle -> Red], 
  Annotation[y, VertexStyle -> Blue]}, {x -> y, y -> x, y -> y}, 
 VertexSize -> .2]

AnnotationValue lets you query values of annotations:

AnnotationValue[{%, 
  x}, VertexStyle]

Something important about AnnotationValue is that you can assign it. Set g to be the graph:

g = CloudGet["https://wolfr.am/L9rgvixl"];

Now do an assignment to an annotation value:

AnnotationValue[{g, x}, VertexStyle] = Green

Now the graph has changed:

g

You can always delete the annotation if you want:

AnnotationDelete[{g, x}, VertexStyle]

If you don’t want to permanently modify a graph, you can just use Annotate to produce a new graph with annotations added (3 and 5 are names of vertices):

Annotate[{CloudGet["https://wolfr.am/L9rpqPJ0"], {3, 5}}, 
 VertexSize -> .3]

Some annotations are important for actual computations on graphs. An example is edge weighting. This puts edge-weight annotations into a graph—though by default they don’t get displayed:

Graph[Catenate[
  Table[Annotation[i -> j, EdgeWeight -> GCD[i, j]], {i, 5}, {j, 5}]]]

This displays the edge weights:

Graph[%, EdgeLabels -> "EdgeWeight"]

And this actually does a computation that includes the weights:

WeightedAdjacencyMatrix[
  CloudGet["https://wolfr.am/L9rtJdd9"]] // MatrixForm

You can use your own custom annotations too:

Graph[{Annotation[x, "age" -> 10], 
  Annotation[y, "age" -> 20]}, {x -> y, y -> x, y -> y}]

This retrieves the value of the annotation:

AnnotationValue[{CloudGet["https://wolfr.am/L9rx8Mxe"], x}, "age"]

Annotations are ultimately stored in an AnnotationRules option:

Options[CloudGet["https://wolfr.am/L9rx8Mxe"], AnnotationRules]

You can always give all annotations as a setting for this option.
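
For example, here's a sketch of constructing a graph with its annotations supplied up front through the option (the particular styles are just illustrative):

(* all annotations given directly as a setting for AnnotationRules *)
Graph[{x -> y, y -> x, y -> y},
 AnnotationRules -> {x -> {VertexStyle -> Red}, y -> {VertexStyle -> Blue}}]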

A major complexity with annotations is when in a computation they should be preserved—or combined—and when they should be dropped. We always try to keep annotations whenever it makes sense:

TransitiveReductionGraph[CloudGet["https://wolfr.am/L9rBdIZt"]]

Annotations are something quite general that applies not only to graphs, but to an increasing number of other constructs too. But in Version 12.1 we’ve added something else that’s specific to graphs, and that handles a complicated case there. It has to do with multigraphs, i.e. graphs with multiple edges between the same vertices. Take the graph:

Graph[{1 -> 2, 1 -> 2}]

How do you distinguish these two edges? It’s not a question of annotation; you actually want these edges to be distinct, just like the vertices in the graph are distinct. Well, in Version 12.1, you can give names (or “tags”) to edges, just like you give names to vertices:

EdgeTaggedGraph[{1 -> 2, 1 -> 2} -> {a, b}]

In the edge list for this graph the edges are shown “tagged”:

EdgeList[%]

The tags are part of the edge specification:

InputForm[%]

But back to annotations. Another kind of structure that can be annotated just like graphs is a mesh. This is saying to annotate dimension-0 boundary cells with a style:

Annotate[{MengerMesh[2], {0, "Boundary"}}, MeshCellStyle -> Red]

A completely different kind of structure that can also use annotations is audio. This annotates an Audio object with information about where there’s voice activity in the audio:

AudioAnnotate[ExampleData[{"Audio", "MaleVoice"}], "Voiced"]

This retrieves the value of the annotation:

AnnotationValue[%, "Voiced"] // TimelinePlot

We’ll be rolling out annotations in lots of other things too. One that’s coming is images. But in preparation for that, in Version 12.1 we’ve added some new capabilities to HighlightImage.

Use machine learning to find what’s in the picture:

ImageBoundingBoxes[CloudGet["https://wolfr.am/L9qz1zu4"]]

Now HighlightImage can use the annotation information:

HighlightImage[CloudGet["https://wolfr.am/L9qz1zu4"], %]

Language Innovations & Extensions

Nothing has been a big success:

{a, b, Nothing, c, Nothing}

Before Nothing, you always had to poke at a list from the outside to get elements in it deleted. But Nothing is a symbolic way of specifying deletion that “works from the inside”.

Pretty much as soon as we’d invented Nothing, we realized we also wanted another piece of functionality: a symbolic object that would somehow automatically disgorge its contents into a list. People had been using idioms like Sequence@@… to do this. But Sequence is a slippery construct, and this idiom is fragile and ugly.
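
Here's the kind of fragile idiom being referred to; it relies on Rule holding the Sequence until the replacement actually happens:

(* the old way: splice by constructing an explicit Sequence *)
{a, b, x, c, d} /. x -> Sequence @@ {p, q, r}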

The functionality of our auto-inserter was easy to define. But what were we going to call it? For several years this very useful piece of functionality languished for want of a name. It came up several times in our livestreamed design reviews. Every time we would discuss it for a while—and often our viewers would offer good suggestions. But we were never happy with the name.

Finally, though, we decided we had to solve the problem. It was a painful naming process, culminating in a 90-minute livestream whose net effect was a change in one letter in the name. But in the end, we’re pretty happy with the name: Splice. Splice is a splice, like for film, or DNA—and it’s something that gets inserted. So now, as of Version 12.1 we have it:

{a, b, Splice[{x, y, z}], c, d}

Of course, the more common case is something like:

{a, b, x, c, d} /. x -> Splice[{p, q, r}]

There are a lot of strange (and potentially buggy) Flatten operations that are going to be avoided by Splice.

One of the things we’re always trying to do in developing Wolfram Language is to identify important “lumps of computation” that we can conveniently encapsulate into functions (and where we can give those functions good names!). In Version 12.1 there’s a family of new functions that handle computations around subsets of elements in lists:

SubsetCases[{a, b, a, b, a, c}, {x_, y_, x_}, Overlaps -> True]

I must have written special cases of these functions a zillion times. But now we’ve got general functions that anyone can just use. These functions come up in lots of places. And actually we first implemented general versions of them in connection with semantic-query-type computations.

But on the theory that any sufficiently well-designed function eventually gets a very wide range of uses, I can report that I’ve recently found a most unexpected but spectacular use for SubsetReplace in the context of fundamental physics. But much more on that in a little while…
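
To show the basic idea of SubsetReplace itself (independent of any physics), here's a minimal sketch:

(* replace subsets of elements matching the pattern, wherever they occur in the list *)
SubsetReplace[{a, b, c, a, b}, {a, b} -> x]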

Talking about physics brings me to something else in 12.1: new functions for handling time. DateInterval now provides a symbolic representation for an interval of time. And there’s an interesting algebra of ordering that needs to be defined for it. Which includes the need for the symbols InfinitePast and InfiniteFuture:

Today < InfiniteFuture
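
And a DateInterval is constructed from its two endpoints; here's a minimal sketch (the particular dates are arbitrary):

(* a symbolic representation of the whole of the year 2020 *)
DateInterval[{DateObject[{2020, 1, 1}], DateObject[{2020, 12, 31}]}]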

Functional Programming Adverbs & More

We’re always working to make the Wolfram Language easier and more elegant to use, and Version 12.1 contains the latest in an idea we’ve been developing for symbolic functional programming. If you think of a built-in function as a verb, what we’re adding are adverbs: constructs that modify the operation of the verb.

A first example is OperatorApplied. Here’s the basic version of what it does:

OperatorApplied[f][x][y]

Why is this useful? Many functions have “operator forms”. For example, instead of

Select[{1, 2, 3, 4}, PrimeQ]

you can say

Select[PrimeQ][{1, 2, 3, 4}]

and that means you can just “pick up” the modified function and do things with it:

Map[Select[PrimeQ], {{6, 7, 8}, {11, 12, 13, 14}}]

or (using the operator form of Map):

Map[Select[PrimeQ]][{{6, 7, 8}, {11, 12, 13, 14}}]

OK, so what does OperatorApplied do? Basically it lets you create an operator form of any function.

Let’s say you have a function f that—like Select—usually takes two arguments. Well, then

OperatorApplied[f][y]

is a function that takes a single argument, and forms f[x,y] from it:

OperatorApplied[f][y][x]

OperatorApplied allows for some elegant programming, and often lets one avoid having to insert pure functions with # and & etc.

At first, OperatorApplied may seem like a very abstract “higher-order” construct. But it quickly becomes natural, and is particularly convenient when, for example, one has to provide a function for something—say as a setting for an option, as the first argument to Outer, and so on.

By default, OperatorApplied[f][y] creates an operator form to be applied to an expression which will become the first argument of f. There’s a generalized form in which one specifies exactly how arguments should be knitted together, as in:

OperatorApplied[f, 4 -> {3, 2, 1, 4}][x][y][z][u][v]

CurryApplied is in a sense a “purer” variant of OperatorApplied, in which one specifies up front the number of arguments to expect, and then (unless specified otherwise) these arguments are always used in the order they appear. So, for example, this makes a function that expects two arguments:

CurryApplied[f, 2][x][y]
CurryApplied[f, 2][x][y][z][u][v]

Needless to say—given that it’s a purer construct—CurryApplied is itself curryable: it has an operator form in which one just gives the number of arguments to expect:

CurryApplied[2][f][x][y][z][u][v]

In Version 12.1, there’s another convenient adverb that we’ve introduced: ReverseApplied. As its name suggests, it specifies that a function should be applied in a reverse way:

ReverseApplied[f][x, y, z]

This is particularly convenient when you’re doing things like specifying sorting functions:

Sort[{5, 6, 1, 7, 3, 7, 3}, ReverseApplied[NumericalOrder]]

All of this symbolic functional programming emphasizes the importance of thinking about symbolic expressions structurally. And one new function to help with this is ExpressionGraph, which turns the tree structure (think TreeForm) of an expression into an actual graph that can be manipulated:

ExpressionGraph[{{a, b}, {c, d, e}}]
ExpressionGraph[{{a, b}, {c, d, e}}, VertexLabels -> Automatic]

While we’re talking about the niceties of programming, one additional feature of Version 12.1 is TimeRemaining, which, as the name suggests, tells you how much time you have left in a computation before a time constraint hits you. So, for example, here TimeConstrained said the computation should be allocated 5 seconds. But after the Pause used about 1 second, there was a little less than 4 seconds remaining:

TimeConstrained[Pause[1]; TimeRemaining[], 5]

If you’re writing sophisticated code, it’s very useful to be able to find out how much “temporal headroom” you have, to see for example whether it’s worth trying a different strategy, etc.

Now We Can Prove That Socrates Is Mortal

In using the Wolfram Language, the emphasis is usually on what the result of a computation is, not on why it is what it is. But in Version 11.3 we introduced FindEquationalProof, which generates proofs of assertions given axioms.

AxiomaticTheory provides a collection of standard axiom systems. One of them is an axiom system for group theory:

axioms = AxiomaticTheory[{"GroupAxioms", "Multiplication" -> p, 
   "Identity" -> e}]

This axiom system is sufficient to allow proofs of general results about groups. For example, we can show that—even though the axioms only asserted that e is a right identity—it is possible to prove from the axioms that it is also a left identity:

FindEquationalProof[p[e, x] == x, axioms]

This dataset shows the actual steps in our automatically generated proof:

Dataset[%["ProofDataset"], MaxItems -> {6, 1}]

But if you want to prove a result not about groups in general, but about a specific finite group, then you need to add to the axioms the particular defining relations for your group. You can get these from FiniteGroupData—which has been much extended in 12.1. Here are the axioms for the quaternion group, given in a default notation:

FiniteGroupData["Quaternion", "DefiningRelations"]

To use these axioms in FindEquationalProof, we need to merge their notation with the notation we use for the underlying group axioms. In Version 12.1, you can do this directly in AxiomaticTheory:

AxiomaticTheory[{"GroupAxioms", "Quaternion", "Multiplication" -> p, 
  "Identity" -> e}]

But to use the most common notation for quaternions, we have to specify a little more:

AxiomaticTheory[{"GroupAxioms", 
  "Quaternion", <|"Multiplication" -> p, "Inverse" -> inv, 
   "Identity" -> e, "Generators" -> {i, j}|>}]

But now we can prove theorems about the quaternions. This generates a 54-step proof that the 4th power of the generator we have called i is the identity:

FindEquationalProof[p[i, p[i, p[i, i]]] == e, %]

In addition to doing mathematical proofs, we can now use FindEquationalProof in Version 12.1 to do general proofs with arbitrary predicates (or, more specifically, general first-order logic). Here’s a famous example of a syllogism, based on the predicates mortal and man. FindEquationalProof gives a proof of the assertion that Socrates is mortal:

FindEquationalProof[
 mortal[socrates], {ForAll[x, Implies[man[x], mortal[x]]], 
  man[socrates]}]

I think it’s pretty neat that this is possible, but it must be admitted that the actual proof generated (which is 53 steps long in this case) is a bit hard to read, not least because it involves conversion to equational logic.

Still, FindEquationalProof can successfully automate lots of proofs. Here it’s solving a logic puzzle given by Lewis Carroll, establishing (here with a 100-step proof) that babies cannot manage crocodiles:

FindEquationalProof[
 Not[Exists[x, And[baby[x], manageCrocodile[x]]]], {ForAll[x, 
   Implies[baby[x], Not[logical[x]]]], 
  ForAll[x, Implies[manageCrocodile[x], Not[despised[x]]]], 
  ForAll[x, Implies[Not[logical[x]], despised[x]]]}]

Geo-Everything

The Wolfram Language knows about many things. One of them is geography. And in Version 12.1 we’ve substantially updated and expanded our sources of geographic data (as well as upgrading our server-based algorithms). So, for example, the level of detail available in typical maps has increased substantially:

GeographicData map

For many years now we’ve had outstanding geodetic computation in the Wolfram Language. And we also have excellent computational geometry for doing all sorts of computations on regions in Euclidean space. But of course the Earth is not flat, and one of the achievements of Version 12.1 is to bring our region-computation capabilities to the geo domain, handling non-flat regions.

It’s an interesting exercise in geometry. We have things like the polygon of the United States defined in geo coordinates—as a lat-long region on the Earth. But to use our computational geometry capabilities we need to make it something purely Euclidean. And we can do that by using our geodesy capabilities to embed it in full 3D space.

So now we can just compute the centroid of the region that is the US:

RegionCentroid[Polygon[Entity["Country", "UnitedStates"]]]

That third element in the geo position is a depth (in meters), and reflects the curvature of the US polygon. And, actually, we can see this directly too:

DiscretizeRegion[Entity["Country", "UnitedStates"]["Polygon"]]

This is a 3D object, so we can rotate it to see the curvature more clearly:


We can also work the other way around: taking geo regions and projecting them onto a flat map, then computing with them. One knows that Greenland appears to be very different sizes under different map projections. Here’s its “map area” in the Mercator projection (in units of degrees-squared):

Area[GeoGridPosition[Entity["Country", "Greenland"]["Polygon"], 
  "Mercator"]]

But here it is (also in degrees-squared) in an area-preserving projection:

Area[GeoGridPosition[Entity["Country", "Greenland"]["Polygon"], 
  "CylindricalEqualArea"]]

And as part of the effort to make “geo everything”, Version 12.1 also includes GeoDensityPlot and GeoContourPlot.

Advance of the Knowledgebase

Every second of every day there is new data flowing into the Wolfram Knowledgebase that powers Wolfram|Alpha and Wolfram Language. Needless to say, it takes a lot of effort to keep everything as correct and up to date as possible. But beyond this, we continue to push to cover more and more domains, with the goal of making as many things in the world as possible computable.

I mentioned earlier in this piece how we’re extending our computational knowledge by curating one particular new domain: different types of data structures. But we’ve been covering a lot of different new areas as well. I was trying to think of something as different from data structures as possible to use as an example. I think we have one in Version 12.1: goat breeds. As people who’ve watched our livestreamed design reviews have commented, I tend to use (with a thought of the Babylonian astrologers who in a sense originated what is now our scientific enterprise) “entrails of the goat” as a metaphor for details that I don’t think should be exposed to users. But this is not why we have goats in Version 12.1.

For nearly a decade we’ve had some coverage of a few million species. We’ve gradually been deepening this coverage, essentially mining the natural history literature, where the most recent “result” on the number of teeth that a particular species of snail has might be from sometime in the 1800s. But we’ve also had a project to cover at much greater depth those species—and subspecies—of particular relevance to our primary species of users (i.e. us humans). And so it is that in Version 12.1 we’ve added coverage of (among many other things) breeds of goats:

Entity["GoatBreed", "OberhasliGoat"]["Image"]
EntityList[
 EntityClass["GoatBreed", "Origin" -> Entity["Country", "Spain"]]]

It may seem a long way from the origins of the Wolfram Language and Mathematica in the domain of mathematical and technical computing, but one of our great realizations over the past thirty years is just how much in the world can be put in computable form. One example of an area that we’ve been covering at great depth—and with excellent results—is food. We’ve already got coverage of hundreds of thousands of foods—packaged, regional, and as-you’d-see-it-on-menu. In Version 12.1 we’ve added for example computable data about cooking times (and temperatures, etc.):

Entity["FoodType", "Potato"][
 EntityProperty["FoodType", "ApproximateCookingTimes"]]

ExternalIdentifier, Wikidata & More

Books have ISBNs. Chemicals have CAS numbers. Academic papers have DOIs. Movies have ISANs. The world is full of standardized identifiers. And in Version 12.1 we’ve introduced the new symbolic construct ExternalIdentifier as a way to refer to external things that have identifiers—and to link them up, both among themselves, and to the entities and entity types that we have built into the Wolfram Language.

So, for example, here’s how my magnum opus shows up in ISBN space:

ExternalIdentifier["ISBN10", "1-57955-008-8"]

Right now we support 46 types of external identifiers, and our coverage will grow broader and deeper in the coming years. One particularly nice example that we’re already covering in some depth is Wikidata identifiers. This leverages both the structure of our built-in knowledgebase, and the work that we’ve done in areas like SPARQL support.

Let’s find our symbolic representation for me:

Entity["Person", "StephenWolfram::j276d"]

Now we can use the WikidataData function to get my WikidataID:

WikidataData[Entity["Person", "StephenWolfram::j276d"], "WikidataID"]
InputForm[%]

Let’s ask what Wikidata classes I’m a member of:

WikidataData[Entity["Person", "StephenWolfram::j276d"], "Classes"]

Not that deep, but correct so far as I know.

There’s lots of data that’s been put into Wikidata over the past few years. Some of it is good; some of it is not. But with WikidataData in Version 12.1 you can systematically study what’s there.

As one example, let’s look at something that we’re unlikely to curate in the foreseeable future: famous hoaxes. First, let’s use WikidataSearch to search for hoaxes:

WikidataSearch["hoax"]

Hover over each of these to see more detail about what it is:

WikidataSearch["hoax"]

OK, the first one seems to be the category of hoaxes. So now we can take this and for example make a dataset of information about what’s in this entity class:

WikidataData[
 EntityClass[
  ExternalIdentifier["WikidataID", 
   "Q190084", <|"Label" -> "hoax", 
    "Description" -> 
     "deliberately fabricated falsehood made to masquerade as the truth"|>], All], "WikidataID"]

We could use the Wikidata ExternalIdentifier that represents geo location, then ask for the locations of these hoaxes. Not too many have locations given, and I’m pretty suspicious about that one at Null Island (maybe it’s a hoax?):

GeoListPlot[
 Flatten[WikidataData[
   EntityClass[
    ExternalIdentifier["WikidataID", 
     "Q190084", <|"Label" -> "hoax", 
      "Description" -> 
       "deliberately fabricated falsehood made to masquerade as the \
truth"|>], All], 
   ExternalIdentifier["WikidataID", 
    "P625", <|"Label" -> "coordinate location", 
     "Description" -> 
      "geocoordinates of the subject. For Earth, please note that \
only WGS84 coordinating system is supported at the moment"|>]]]]

As another example, which gets a little more elaborate in terms of semantic querying, let’s ask for the opposites of things studied by philosophy, giving the result as an association:

WikidataData[
 EntityClass[All,
  ExternalIdentifier["WikidataID", "P2579",
    <|"Label" -> "studied by",
     "Description" -> "subject is studied by this science or domain"|>] ->
   ExternalIdentifier["WikidataID", "Q5891",
    <|"Label" -> "philosophy",
     "Description" -> "intellectual and/or logical study of general and fundamental problems"|>]],
 ExternalIdentifier["WikidataID", "P461",
  <|"Label" -> "opposite of",
   "Description" -> "item that is the opposite of this item"|>],
 "Association"]

What Is That Molecule? Advances in Chemical Computation

You have an image of a molecular structure diagram, say from a paper. But how can you get the molecule it represents in a computable form? Well, with Version 12.1 all you need do is use MoleculeRecognize:

MoleculeRecognize[CloudGet["https://wolfr.am/L9rL9B2K"]]

It’s the analog of TextRecognize, but for molecules. And what it produces is a Wolfram Language symbolic representation of the molecule. So, for example, you can then generate a 3D structure:

mol = MoleculeRecognize[CloudGet["https://wolfr.am/L9rL9B2K"]];

MoleculePlot3D[mol]

Or you can compute the distribution of torsion angles of the structure:

Histogram[MoleculeValue[mol, "TorsionAngle"], 360]

You can also connect to the world of external identifiers:

MoleculeValue[mol, "PubChemCompoundID"]
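Other identifier-style properties follow the same pattern. As a small aside (assuming the standard "InChI" and "InChIKey" properties, which MoleculeValue supports in recent versions), you can ask for several of them at once:

MoleculeValue[mol, {"InChI", "InChIKey"}]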

But what’s really useful about MoleculeRecognize is that it can be used programmatically. Take all the images of chemicals from a paper, “molecule OCR” them—then do things like check whether the molecules are equivalent, or make a word cloud of their 3D structures:

WordCloud[
 MoleculePlot3D /@
  DeleteDuplicates[
   MoleculeRecognize[{
     (* four molecule structure-diagram images, embedded as inline raster data in the original post *)
    }]]]
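As for checking whether two recognized molecules are equivalent, there are several reasonable criteria; here's one minimal sketch (assuming hypothetical variables mol1 and mol2 holding MoleculeRecognize results, and assuming canonical SMILES strings, via the "CanonicalSMILES" property, as the comparison criterion):

SameQ @@ (MoleculeValue[#, "CanonicalSMILES"] & /@ {mol1, mol2})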
LYz79C9lvSp12UX7+2zEV+bWlFtSPtLmt9GWx+lnLod575B5kd0yNrytX0Te
syEubGOb1LnbekhDa7y55qHfl/e7fNcLs2AaX6WvjvqFW/rzmPNlUt5xuYox
xD7EcMAzGz3n6nGZSz7zzyTaOn5tXA9gbGhHts20H/uZ+m7ry7gITNMHucbM
pZq58r0fkzY5yXXA9OFku+OOO3q7dvqm2D9+bn5R85Tx2xdddNHUa0s7ZF5n
+mrYzqG+KBQ2wjS+gl+G5CrHVet/mFyV+Y/lK+Jh5p1LPae/z+u89XrdHxtz
yim812+7Peei1gmn+TPwv7WNk/dTzpy06e/vfVOOpP4MyHpqAF5v1zqG+Ir3
xVeFvY1Z7FfIVdqd1FlSP7EOTo4V7BXY15///Od3z3lrK6FG76zrb9uJr7Rf
mS8/+R89dpo+yPbkJz+5y41MnWT8Mtt88bluClyXyPYZj6N+mv20EYqvCovE
NL5iXRuucU0wc0Za10Y/H+0Yk+qWqH+0+Y2mYTvxldd39dVX7+ZHMsv62VA9
Z5D5Drhn6X/Q+iYkX2W7iq8KexvT+Oqss87q953HDyv9yz02c+5m/oBp2G58
xUY7jYNq8wlP829Pv12OxU44zdc0OavlK9uUsaSTUHxVWCRm8W8XacNmHrcu
SeZQb8dPq/Pl2JvVR3A78ZV2K9bnlEf1dZvFv505IHW9jXzGuW9DfgR8bj8V
XxVWCbPy1aQ1/nlyIu/pM7md+MprRb6ZdOws8lUrkylnuZ+x0MVXhXXCNL7S
nzmfJ8fDpGeslbPasdPmYZuG7chX+F1ob5rHRyn3z/yr7XUN+Xp6XPFVYVUx
ja/wTW71hrZW5xA/5dhv16c8dtb8IduJr7Yiv+hWtG+WfMhDKL4qLBLT+Gor
84vuKYqviq8KBVB8VXw11L7iq8Iqoviq+GqofcVXhVVE8VXx1VD7iq8Kq4ji
q+KrofYVXxVWEcVXxVdD7Su+Kqwiiq+Kr4baV3xVWEUUXy2fr/RHM2bA8Ws+
Md4bj5NtWFZOO/zg8RPOfDbk3SCv8Sxo/ez43xxeWZuH77IvCoVZoN+mvJD5
QHNONQelWMaciM+1z7h8xdgBs45fxryx1tTJMl+Xuf+oWWYfEO8oRyyCr7LP
PD9tMC+o1whfyFfLzr1JX2V+UXkLn/tpaJ8JfYs53nzOXitoYx8KhY1APKxz
m7ndMkcoedKte9Dm8FwWsu6BHDNrjnX4J/3xka/kK/OLkjPHa5oUU7eVgIPT
t5/roe8dx7QJfayVZ5cBYxUyD6vPwiz5+hJZZ9dzcY0+Z6Lq4xTmxa5du8bq
8/mcvvWtbx3Ljw4WVedqCNTZsVYquUppF+2cJVdBC3IXYBciJ17m8eSzrOkC
jAdeBLKuBr+ROdeVa5ED5TQ4hBjAWfPvbBbcb/okdTf6bKjGR4uhuHba3tbK
4JqH6owUChshazEM5R6Bx8CQHrPMZ8ycgYA2Z+2ujcBYYdxnW5N/87rgYXTe
RcqPeW45t+WhzFm8bFk2f4/+MAfjvFxJ+7k+ZdWhawQ571TNicKsyJwJPGNt
LPJQPpll2Ejbui2ZP3MeHcJ5X1sW4JozJruN6d4TGW6WdojMiUB/K7e6j/Ww
bN+y6vOljJTvZ+nvodwPPkt8zjX6f+qL/lahsBGGnpEhe8ky1s2mIWt9bVUb
JtnWMxfqVoP+zRzRqXsmhyV/LsuGZXvS5pfz2Kwwl2OeQ77lWrPu2rz5hQrb
F8lX5lX3+XFctWOa12WuPz/wwAP9+8w9Pst8P4l7W05IuSvH6FYjf0+0cpVc
uVG990Wh5RjbM+v9buVHof7u92kDbfMOFQobwXHRPi/tmG25bRlox7JIDpsV
0+SV/GyR8lWhUCgUCoVCoVAoFAqFQqFQKBQKhUKhUCgUCoVCoVAoFAqFQmF9
8f8AX55fDA==
"], {{0, 241.}, {300., 0}}, {0, 255},
ColorFunction->RGBColor,
ImageResolution->72],
BoxForm`ImageTag["Byte", ColorSpace -> "RGB", Interleaving -> True],
Selectable->False],
DefaultBaseStyle->"ImageGraphics",
ImageSize->{50.24609375, Automatic},
ImageSizeRaw->{300., 241.},
PlotRange->{{0, 300.}, {0, 241.}}]\), \!\(\*
GraphicsBox[
TagBox[RasterBox[CompressedData["
1:eJzt3Qm8beX4B/DbgKIoU2ggpUJEaTQ0q0SaLkLRPFHppoSkLs1FkwZCs1IK
GUJxRUJlHlKaKaXRECrr3/f1f1m2fc7d57TPXuvd+/l9Puuee/a4zlrv+7zP
83t+z/Muvs3um+4w57Rp02bM88g/m269z5p77rn1vpst8Mgv03ebsfOOu22/
3Qa77bX9jtvvuco2cz3y4FmPHPPMMW3a3I/8rAKBQCAQCAQCgUAgEAgEAoFA
IBAIBAKBQCAQCAQCgUAgEAgEAoFAIBAIBAKBQCAQCAQCgUAgEAgEAoFAIBAI
BAKBQCAQCAQCgUAgEAgEAoFAIBAIBAJF4i9/+Uv1u9/9rrr77rubPpVAINAi
3HvvvdWFF15YzZgxo/rsZz9b/f3vf2/6lAKBQAvAFnzta1+rXv3qV1cLLrhg
tf/++ydfIhAIBG6//fbqPe95T/XEJz6xWnbZZavPfOYz1cMPP9z0aQUCgRbg
sssuq9ZZZ51qjjnmqLbccsvquuuua/qUAoFAC8B3+MAHPlA9/elPr5Zeeunq
lFNOqf72t781fVqBQKBh4BjEEssvv3z1tKc9LcUYt9xyS9OnFQgEWoCf//zn
1dve9rZq3nnnrdZcc83qm9/8ZtOnVCT++Mc/Vtdcc011xx13NH0qgUBfcP/9
91cf+9jHqiWWWKJ61rOeVc2cOTPG9yQgL3zqqacm3ubkk08O7UigePzzn/9M
+cxXvvKV1eMf//jq7W9/e/IlHnrooaZPrTj8/ve/r3bbbbfE7a6++urVpZde
mq5vIFAqcJJ77LFH9djHPrZ64QtfWJ177rlNn1Kx+Otf/1p9/OMfrxZffPHq
KU95SrXvvvum6xsIlAicJHvwspe9rFpkkUWq973vfdX111/f9GkVjd/+9rfV
O97xjmq++earXvKSl1TnnXde+GKB4sDv/c53vlO97nWvq+aff/4UM//iF7+I
sfwoQUsmXnvFK15RzTPPPInz/c1vftP0aQUCE8Kdd96Z/AW24dnPfnbi08I2
9Ad33XVX9ZGPfCRdV8fRRx8dGvURhDW48ygB1rivfvWr1WqrrZbWuLe+9a3V
z372s6ZPa6jAF+OTPeYxj0l61O9+97tNn1JgQLAW/OpXv6q+/OUvV6effnpa
ez/xiU9Un//85xP3b/1oc82jGHmXXXZJMTI91Pnnn189+OCDTZ/WUAFXKddJ
h/rkJz85cZWRMx5u/OMf/0ixJHtgzV1ppZWq5z73udUznvGMpBvAR73pTW+q
9t5776RFpD/0njbhgQceSOdP64Bjf+973xsc+xThxhtvrHbfffdU6/bSl740
1cqHHR5OuK9ql7bffvtkE+acc850zD333MmHdMw111wp9+0xtY9y4XLgbRoT
P/rRj6rNNtssnWPWSZYSF5WIWbNmVWuvvXbKH1tTfv3rXzd9ShNCSXFzk/jl
L39Zbb311kl/zAYstNBC1YYbbljttddeqUeCg45ggw02SL7EtGnTqsc97nHV
Flts0ZrYnvb3oIMOSv6OGqwDDjggxUKBqcOf//zn6kMf+lD1hCc8IflsJ510
Uqtr3tgCPqZxce2111aXX355Gr/GjsfDVvwv6GaPPPLIauGFF04+wote9KLq
gx/8YHXllVf+Fy993333Vd///veTz87HYCO855hjjmk8znBfL7nkkupVr3pV
Oi95zauuuqrRcxoVuO70qdaLN7zhDdVPfvKTpk/pfyB3pZ/gxRdfXH30ox+t
3v3ud1dbbbVVWu/e+MY3pnXQ4+zFn/70p6ZPt1W44oor0nUyr57znOdUhx56
6Lhck7gTH8XHYE9wEvyPJoFjkM9cYIEFUn3mUUcd1aq4Z5hhfTnssMOSz2a9
cO3xl20Bv8Ba9/73vz/ltJxnjpXz4XePv+Y1r6mOO+640HT8P+Qhjj322OqZ
z3zmhNbdH/7wh+m13rPYYoslv7KpPkx8h6985SvVKquskngH/EP4DoPFD37w
g2qTTTZJPIS1xprTBl/dObAN2267bVo3jFd1OGIhvMnGG2+cfE6/exznZjzj
4NWpjjr4XHKBbCgtERsrjpgd+GBiEj6E977rXe9qzC+7+eab0/c7/2WWWSbl
3ZqOd0YNeAjXHW9tHu65557VDTfc0PRpJV/XXH/qU5+a5j7/WI2eOhJ8/I9/
/OPEsct5vfnNb07cFRtCj8+Pvueee5r+ExoFnQOO0TVZcskl0z3uFd/4xjdS
3Om9r3/969O1HjT4LOecc04al2ILHKoxERg8rDViPHlluohPf/rTjWpW8aSf
+tSnqqWWWiqNUef04Q9/OI35Tv2O3+W+rJVsiTVPLEJnN8qocw8rr7xy0kT1
ClyleM17l1tuuaRDGjRwz/JqdJJqAtzP6DfbHNS8rLfeeinOsO40yUupxdtu
u+2S34A7NfdvvfXWMV8vFhEnbbrppilOpa+z3tx2220DPOt2gT5gjTXWSHOc
XkA+u1d02ofPfe5z6fFB6bHFM0cccUTilaxZYqPQ8DULsam4E58ljsdt4Qeb
gLgh57Pk2/gSsxuTzlU+LsfNfAg5j1EFnwqv5BquuOKK1UUXXdTze3Ney3tx
PeI5XMBZZ52VrvEXv/jF5OvLI/TbXvBb2TL7WFgf1l133erb3/523z4/MHmI
M6dPn558iI022ihx2U3gjDPOSHyDee48rr766p7eVx/Xiy66aOLeRxU4Wv55
vhYnnHBCT/65+Ynj8R7vxQ/rL/T1r3892WxcgL4L/DvridoN98drcIeP1lbw
HWl7fY/48sQTT0x95ALNQ9xvbKjttA7LfTbhQxjL/EraX/y1PGwvwE/gKo1r
eygdcsghI5srpxujd6KbFMPjnT02O/ALdtxxx3Tt5Q3oFtkMdX377bdf8kVc
W3Ecff4KK6yQ/BRzGleM22Sb8MPZv+gVdU7Sd4sRoxd1u2AcWHdyvvN73/ve
QL8f3yj2ND6Ma/sa9MqV2hNFjoN9cP60U6OsmRJTWOtdj1VXXTVxfOPNV3Pa
ei2XmPMeuGrvMXfx2GILGkzaKbaBreDn4Yn4HPw344fOKuea/vCHP6T3zy4W
4TvstNNOyaY57y984QutyLUH/gM+hHot9bN5H8Ne8ub9/H5+C/uQ/Ydev79u
H+imrD+DPPe2gc/PRtKH0NFvvvnmyUbQqHf2f5DTpi+T63DtnvSkJyWdKh6j
E/x964jYgo+28847pzjwec97Xrpn2V7gsfAH7iHfgtZJLMKPyfYigx3nN9Kz
qCuW22aPAu0De7/PPvuk9RsHPlW1cj6Tb8AmZN2Lx4wTuUrz3FrUa2/BOidn
fKsvaXM/g6mGOciHUM+N62Mj+BHW9tNOOy3xwHqKffKTn0y9B3N9J99L/gJP
OF585l65b3IL5j1fw+fII7Ez9DS5TpQ+lw6Sro3dxnN+6UtfSj0daHB81/rr
r59ez6b4ve5zNHEEusOctTZk/1HsOl5+caJw7Y0J89mYEuPyWXIOS75dDGqe
67fdy54nxpLPyO+zlpkDow7XlJ/PRljTre38CbbAtX35y1+e1nm2w3PW7te+
9rXJt59ovZ5xo15CHQ+9xeGHH55qR/GaeiP73qyL9z24DH0O7XnF//AYW8LP
YdfYL/6O3Lv8tXyUe+zcvvWtb6XH88GeyMPqW4unrj83mQM3z38ZVf5qdnCf
1dFaA/QPcV8ejT4l+wquuTVLDGGd4U/SK1g7xKrAd6XBMI58/4EHHjhbjlL+
DUeWtdZqCJrKv7QFeQ2038kFF1yQekCI68VunTUs8gXWA/VvOKd+1OGwL+IW
85WPQscg/qOzkZvI9oI9z4d7x5astdZayX7JU8t1up/+r5e986Tp8Hg+vJ5W
4wUveEGyR/Xn8uF789Htsfpzvg+/2y2+CvwL6h/oa/P+I5PpEWGMGp/0fNZz
vog1K2sdfbYeNbhxPc5ALMp+yGF4Db+UT8Hn6PT7/C4eEjsbG7lWUZ/NUeYe
cAx0TmytNVCcJTdh7RV36QEj16PmCX+AR+Az9tNP7IT75xzyWHDPnQN9LBvl
3tXtVq+H91pHutXviZX4JXJyfMp86HXhqD+WD7GQ8ckG4WMD3SG3SScl14mb
xlv3On74GnwQvgI/RDzr2vNj+ZDuDZ9BLGxt41fU/RP+pLGLA+EXWx9w63xX
897apJex14lPrB1i12zLmq5LbhLsKy2TOIE/UK9pZU/ZCjbb4z/96U/TfRp0
3ZN7rQeFc8BFmsN8GOs2rf9EDnZGjT9OQ76r/pwxS6ehz6aemw62Ccdlb778
WP3wnM+zB8RNN9000OtSGujgxYjmHb9OjDdWvtHYM2/5GWeffXYam3yFnAOT
t8Jb4afcA+ubud6NC/I5+Ctcle+2DoiZ2Qzjia/qc8wBNVm5vlNvJFqeUeYl
XVfXha+OD2yivmoiEHu4h/KqtHHG10SPzCmyO/XH/d6ZK4HxOMjgKHuHa4xr
4uPLCaiFwDnX4TqKV41DNvktb3lL8vX5CtZ+foNxal7zb3vd/9M6iJ+i4fMZ
bEQ3/9Ljnve6bBv41KNYCyx3KH7ga9PKq2Vtu/4Qt2W8vPjFL26kFmxUMVk7
2PkeY44Ogn2gmcn1EGwC/9D9FUPgna0DYkDcgVya2FaOgu2gv5loXag5To/n
M+RcfYc+aTgLvgU9uPy6c/I6r6e541/rxUwDMCrI+0Ood2SXXatetelNgubS
OYd9mDhwALh53B0umD+efSfrgscc4qTOn2JLOQGH9/d68E9x2DgB842vr3+A
Ndrc19cNlyXes5eG+cpXwBXggcxZMZ98lHPvR47IZ/h7xTviwgsvvDDt3eF8
2a+6r4Avx1ewIc57VPYAYhfxv/IT8gNi7qZq7CaCsA+TAy5QbCYvZU7KJYvp
cTAO/rrHttlmm1Qv0/nT+mGNdXh/r4cYnraa1tX3iPXF/XxWcT6/lb4Sz8u3
5zPgGXwvnZw522TeOOddrKH0XWwZ7nyY4XqrXWCr2WljpjMObCvcHzn0sA8T
g3ojMXXO6RvvY8Xg9cPrcIJyNHVNSv4MdTUO4wjnn3l/j3mfXJE1qDNXVM9V
e697ahwef/zxKU9B39yGvRDxFuINuXLnqUbZ+Q1zfxHX3pogN0TjwL8q5e/l
48lJ80/lDwK9wTjHt8nTyFeLv/kPNKzjHfwK+gE5Gut/fvyd73xnek4+EIcl
tyh2d8gReYzmTU5z5syZyTbJXeT30zPlul/3UgwhjmljrkBvGPEPG8H2iYPE
Jm2wX/0GHkgsIabg29GTldRPhc+pP6D8FJ4p0DuMZ3aC3gX3QEco3h7v8Lpc
Wyv27nzOfMYpiOdxhvIJ/G+P5bhA3Gruq33I76VplTdjH/xsey9hNkJsJLdu
3uBKnXMp62qvwDfx4/iG8kTiq5Jyc9l/cJxyyilNn05gksB/mWPiDH5FCbX5
9FN4GNqLvG8jveUwAQfM3/N3qj9o0x4FvYCmBm8S/kPZ4G+IdfAYNIpTqcXt
F/hEan30JcCnlLJn+URqDPmDcru0pW3XOnTDmWeemepwgn8oG2IM/QRKsg8g
JqOFkJuR/2vjPmF18Afw+OJy523+yGP5P64fJ6WmFZ8yDP1vsj5KHmyUewKW
DuNW7zHxRUn2AcwjeVfa78yntrUHgDyl/LH58vznPz/VL4nNzSGP0wrI29oT
kQ0pIc4bD1n/QHsXNfnlosT4Yizwx/VFsK9Gt0NuRu0emzjo/Iw5ryYu1zvn
HHT98Dg7LWchV1Xyvcj2IfyH8iH/aXyqsei1r1PbICdkHBqP9CT1Qw4ga0To
CPCxfP1B7sfF3+Y3sA+5J4OcZT7kZHCRamC9ho3w95SqE832AYfMNgfKRd0+
tGEfvskAZ6meJGu92AI9Seix9RZRi+b3rB/TewlvNigbQT+ee1nQsKpjr4M/
QzuvH07WtNJ3lHo/+GnsoOtNgxMoF8PgP6jVUNdovTK35DT49DSHajf0UfW7
XkL0iP5eNkPt6SBQtw90cd1yErgR9S16Knkd7Rq+skTIyfKXwj6Uj2wf1GBN
pp9UG0AfRW+Y/XP60c46Jmu0vjV6JHqNmINf38+a8bE40E77kHtk1A9/g/4Y
+ml4ndxMqfuWh/8wPMj2Qfxbao8m/AMNOV2luSVn223e05XSnatNxQfusMMO
qddSHbPLf+T+3PTOuE46WDyB2lh5VnpOtRP1+jEaoaxjt8+E/eW8rn6o41Zv
QfOlzkZ9fal7lmf+IexD2TCu1XPQGamPbLuOYCx0sw/dcp5sxtFHH/3v19Vj
KvlS9XO53zFfX88J+uD6oV8OTlHti/mMM1BHI0+sJtZ1pEWlMc48Q90+0BSq
p/W6+iGuYBu8Rj/Vyy+/vChNdR1hH4YD1rhsH+TeS/Vn+eZym2oyzC9/i17X
et2YZ8YrLkIMYg7La6hr9bdbo8UiNJk4wdwH2fjWH03/i/ohhsm947sdeW8I
NiP3OqzHF7Pr2ep5PRvbXgszHsI+DAfq/kPJ+gd+gX7NxmPev9aabC8ePXNp
seUy5DD0zFL/iW/J/fLxAUcddVSa/9Zw2iXaHlyA95rbeAs9NNgXdbBsCY5g
1113rWbMmPHvQw2t3tvqRHKMUbcP8hP2huB72PtH/ZXPEnfoycv2sC8+p1S+
OOzDcKDuP5RsH9Soihv0S84aJOt4XQfh99xz3RzEPeAJcv9n+yLop+1z9JrQ
V1WfTvkPfDz/Qh8ueUj6Ru/F1+Ah5CPy4Zp29tKp2wexhXoR11qPPofPEtt4
HZthXrFlYqYS9dZhH4YHmZ+0LtpboVTQ8Vr/zUHrvtyEvhj5wEvy23OMjweg
mTA3M3Ieod9xf90+qNvu5EQz2BW1GeIaNo7/c+211/b1XAaBsA/DAbzewQcf
nNZVvjOfuER08pM4w87+czgG6z99kt4lXidHj4OcatT5SXY47xvUDeq07DPl
tXJKJdrssA/DgXp8oT9MCb2Ru8G6q7dWji/Gym/yDcxN+QUxhziDnzHVPnxn
frO+v01Gzp2q6eRr5J4c9qcqDWEfhgPy+HJ17IOejvJ6JaJX/QOwEfbqy3tW
4gmnumd+Pb6wD6le4vyC+uExMRL+gd1y6H1TYv8H+g7jKexD2bBuWj9pCeUE
cW8lgi1Q8ygvYV6xebkvZae+SY86eQZabDok43eq5yC+M+8JpmcKG+B61w/x
nbpvc8q52b+wl33O2wbXXf8o+SJ9kfUMDZQJ+j/xuPFozpSq15N/MM/lKMx5
+8vjHXF71uW8Z4ie11637LLLJv5Pz4VBzMEjjjgi2eBuWof6786JvsIet86V
3tPf4T6VopViH+R+2Do86yD4ncDUwH4K9tuQv5DH10+qRNTrN9k6OgKagqx/
UK/lEBPjAcRTYhG9LzprKacC5oj1VE9Ge0jbv4Sugp1Sc+539Vhqz/Eocp5s
nn2x1Ws4T7xJKTaCVlz9W+x/UTayfTCvStY/sA/2de51/3gaJRonfaEHATpK
+k08xKxZsxLPg8Pzu1omv7MJajjqvWhzH0f5Fn6PWo8SEPtvDgeGxT7gH4zJ
8fYQE9/rBaHmggZKrUnb984QG9FsiJvoN821EvbXozPDPYR9KBt1+7DVVlul
OLdE8LvpGDtzAvWD9oEWMu/5WwLkn+VG7QmEvxCD+Dvavt+HPAyNelvsQ5O9
R0sGvQP9f2ctYwmw9uMO2IVhvvdsttoSfJ81Wd1om+vo3Bd1rosttliqYbFX
o7ipqUNsZ5zT/hkvwzxW+g21jXoq5X4k3XQ7bYR7LBerR4LcQIkaol7BV2Aj
5Gxxq+YdvXjb9gRyT+TLjSm8sDwtjYmYTq1LU4c9ptXq0f/RApbaw6AJ8Lvp
+UrzH6wL6inwjHKU9gkcdrDd8hvqT+2jrH9mG2q32AX6EfljvWjxPLmHZr2e
Xe5W3sjP8Y6xXuNxeTY/O49u/cDza+vvF+/gqQK9oUR+Ukxubxm5dXqoXIc5
7DAP8SfsIS5CflRNaRMcK58GT0pfomcfP07+WO2beZh7hrMLfpdvplPVE2OP
PfZImhu2zk+9ANXL2C+a7lWe2p7TnpPb9Zzcu8fzHtT0LfwpP2lMPd7t8Bqf
4bN8pt4+pfZQbAIl2odLL700+axZi4i/aztf1y/Qr9GGizHkNHARg9rXO9e2
5r3A2ARxhP191L2IJ3AktKA0ubvsskviH9gH/S74ePwdGhv+H027n3K2eAE1
rfZFo/vgj3jO35afy/3ArA8+x7n4/3h7AHiNz/BZPrON+9K3GaXZB2PTekAD
RRNgbdE3YZQgzrBu0lvRfHRq2vq9b5j36p+jlza9hmsuvplvvvmSf4BjYCPY
A7rJvEcgXpCv4Hn7eejvVep+Hk1gvHvW616u3cZA53gYb3yUZB+MUWMsxxXq
nPDSo+I7ZLiXbIR9O3G09fvOVtBa8fnNVf40/0p/f71sOmvex/sOcYt1V09t
vr1ac3bBfFf7Kn+p1w1fwbmwB519cWjX6UW9Xl17yT3zBgU21Lw0tv3kX+Gi
+Y58IHuieG4ih3o5WkDxoN+NG5y+tZXGxmN+51/Ve7XS7MlbsA/sRFvtAxtg
D1uxN96JTtq4DX/xX3PZvVUXbl8uWm2+BV9fLQcff9VVV032H6dpHo9lUz2u
5oOum23B7dhLCCcqTvDT7+IaNWdyhsbvWOuP8axvn/Pg78kfTHWtbMlgB/Q1
pkUSk6kRwMvgavjNuBo1/56byKGWVv5GDsLvdDQ+x9xXf+AxHA1tv9y0vWLk
yLyG74ffZef5j8YPG9YmTYlxa43S4xnnYC/NNnD3TcN6zfbj9eiZxV25R651
Xo0JX8tjfqr/sN+5eV3nNX2OtYHNNYflIPTANC7wCjTe/Ae9hPTc41f06rep
G8FT8CHUmgxqT6ISYV03Z9niXmsF+nUYO/o78wv1YbMHXc4/GQf6OVtn+BHi
2rPPPjv1bzVu+KVTaS/yZ+PE2VBrkjFb/07rDhuhJqEUncZUw3WwrqhZdR+t
7+qirAU0VXpi4PutD+a4cec+Wxfqvj5NprXJe3GNmd8xn+UO6CHpBibDH7iP
+nfKLbJZ1sNh1qs8Gug9YHzLtfAFHXgdeydYA/JjU3H4HtwyP8Ph3uOYjCvj
xrhwZM6Jptfr+JgzZ85MvoW1SvyDZ+6XveCDinNcF/ko49FYtV8eX0svV3F1
W3yZtkA8qreCPpruofXeOOID8AFz7MXWsgV8R3taWQtyDXz29dWX8iP5GLhG
vOIJJ5yQ8qfjxSO9gr8hD+l7+X80lfikwP/CfXNt8uEeWaPNufrj/T7YJtwy
n4B/Kb7gR7AH8tP0Rnqs4P2MI75G1p+wG7ipvPcLH/Scc85Jn2dN4atOZv7i
zHyneEgvSHYJv5D3pzCW6DuNZRxL8A3/gX7bOL98f/S9dT/G0kOwseyJa5rv
uXtofXcf6AWsW3w087mfugpjg65SzxvnSysxCpq2UsEm6SkgPhVvmKPyzPx7
nKaaY/vL07Ooz+BLGINZj8aP1btATMrvsdaLK+mU2IscG4wFz9Fu8o35scYr
m8BeqVNkn/RbYiusjewGO2Hs1uufRxWur7yEmMH9wD+pF+/MH3QCH06n5Lri
FcQh2T+gE5jKWhZ8kbpZ95Xt14/IPgGBdiHXQvMHzHdrN/++G/gd1imxhXhW
zwRrQI4lM7fBXpjXeCjxLn0jrR9uqltOzbjgr+Q+0vS4/Bbv4x+zT/hz9meh
hRb6tw5PHM3/GfVYgx1gk8UDuAKcYy86Kfde3iHHJO6ZGGJQMB70xnIvcaVy
LqOWn247+Ac4SOsH30Fc2ksuwLqNe8Bl6S+Iy1b7guvmS2R7QVcrp0bjaH77
/DqfiG+wzx0dgzFqPaG5wX/Vx4qeL+yWeYA3s+axIzgKObhRBk6Bb8fnUtPp
GvXqV7mm1gTX3h488lmDsrfiQzGN9cX4w20FV9kuyEvgoqw78qyT1R9meyFX
zr8wb+VraRPEuHxIvq+YwR48+BWw/vMDPJfrlZ3DWOsIHo4uSoxjTMu/68U0
ylyEOSVOYI/xkvi+2cUWGe6DmNC1tD64toOs4aCVxl9lrtJaYS0INA/8AI6b
X2oe8y17HVfjIdfw8RPEwXIR6q7pOvCZYmXjkg2QK+PfGtvyI/yR2a1fdNV8
DDbNuYthRln/ID7DHbCxYjQ+Wq89b9jV3KvTe3GWg67xUutpD2Tnj1eSoxr1
mLFp5HyYNUOdHRs+lX66cYgHpenM38NG+F5zHN8pb9pLvzR2Jefg2BXx0Shz
W3gh/A0f3b3kn/Vq5/keeCT2QZ0XHnrQ9sE4yHVmxoF8at7rPDB4mF/6peME
cEP2b6NXHvS4wDHIyRmbchTihF7Xjbwnk/dac+jGRxV1fnKie4DJK9LKTuYe
9BPyV+LbXPeJN404oxlYx2mkjCU9iOQzm9jrQm7CmMxzHHfRK/ik8njei9/C
eYwq2AfaMflNvK2cEn3B7CAG4S/kvf7UVzW15ypbICdOM+9vwJmWur9jyTAm
zCV6GL65+0BH0wTkLenzJuMDGMfGcxN5uTZCnxP5ZPcU14dz5KOP5Qt4XH5R
vzBxP+2LGKXJPU/EGbhV+S55bLr+tvXNG3YYR2r31OzIJbgfTflxchf6tWcf
gC6/F9/Wa9Qoy3N6r88Y9bUGv4MHzvt64nzpH3Pf/qx/9385A3vWyCfKd4gx
2Wc+WdOQA8Njy4urz9X/JzAYiEntRSDPrdZCTqzJ2ibcmPxJrhXGT/aSp5RL
VScsH2ouqCOJPmH/ihvponO8kHVmrpV4Xgzi/3I//Hg1efwN8b5x0Ya9f9kv
/aj4t+4vLkUtQGDqIXdIa8A207zr19GkXg3vbjznfgJqWWn/ZweaHjYh12YY
74PYF6/tyL1iaE/Ur7imcsDsBG6CvsD/cYC5TywfUm+4Nu11woeQrzVO6Srl
qkrZo6RU0NipwWKT9efQM6Hp2gXjmY3Sw8RcN1bx8OPtSyAe9Xfwi41x44cG
r+17Xw0KrptcL1+B74DfoUk113J9Ha2aOMRaoea7bf18+ZC5vwf7L68RNfxT
C/UPehzLWeCxrMFt0KDU+8ThyWiz1WdefPHFya80VvDz6rz4FnTctJPWPraO
Nkp9YeC/QUeCf8QBq2ORo2JXHccdd1yquVHn3YaYohtoZMRCeV8dY6SEvQRL
hDFgLwJ+PL+TZrEfOsl+gT+Jb6ftMe/FxdYONcriDz4Frm369OlpvOT+NuIR
vGQb7Fyg/5DPUs/Dt1Q3PKj9k0cJ+AUc8FprrZXmFK1h23w1sYF5jovSpyb3
QBM/sBVyb7m2m4/BRvA5abzaZOcC/YVcKx+Cxle+U81wW/2dUqHGXw8v+Ypc
Q9vGWN054d/5w3wF2id8BJvGJoiL+D54STEzzUbT/Elg6kGPry4Yf6JGQw42
/MX+wJyTE1enYA3mw7e1J3UGfwcHqT+UnJzYAh+vpgAPqWfnKNdijRqMYX22
8xim4YpcVX+An9LDmm9uPZ41a1ZRtjfv0WSM+FnSuQf6B2uaegD8mbyLNa+N
PnBJyP279I2UzxLbj3oflUCZyHlwa5x8J024tS8wOeDs8v4xrqcai+jDFigZ
ajP0EKHdoedQFzDKPYEeDehj9PvESdLa4vPiWgZKR+7RzT4Y05G7mjj4CPZb
VO+kBsv1dF0DgdIhX6Vfhb1E25ajLwU0h3wH+SC9FejkQncWGBbwGaIWY3Jg
X+kb9KrHO+j1qu4iEAgE9OjS/1V9XvRMCQQCGfwuvcHVNapb0ve1yV5AgUCg
PdDbQb81WmQ1LfoPRj4zEAjoLas+kw5K7ZJe4d32sQsEAqMHHCRduj3tNtlk
k9RzMBAIBEC+R320eiZ1TZHPDAQCdeAacJSx/3EgEAgEAoFAIBAIBAKBQCAQ
CAQCgUAgEAgEAoFAIBAIBAKBQCAQCAQCgUAgEAgEAoFAIBAIBAKBQCAQCAQC
gUAgEAgEAoFAIBAIBAKBQCAQCAQCgUAgEAgEAoFAIBAIBAKBQCAQCAQCgUAg
EAgEAoFAoB/4PxA7QG8=
"], {{0, 166.}, {264., 0}}, {0, 255},
ColorFunction->RGBColor,
ImageResolution->72],
BoxForm`ImageTag["Byte", ColorSpace -> "RGB", Interleaving -> True],
Selectable->False],
DefaultBaseStyle->"ImageGraphics",
ImageSize->Automatic,
ImageSizeRaw->{264., 166.},
PlotRange->{{0, 264.}, {0, 166.}}]\), \!\(\*
GraphicsBox[
TagBox[RasterBox[CompressedData["
1:eJztnQmYjWXYx7v6vuorJTshoexLKAllDW2EslQSSVJISiqSJUKJECJlS6Xs
S7SQLcouCtkiChUVJS3P53ef65nrOM6MMTO8c8z/d10nMufMeZ/33P/nXt/z
5nvgsfoPnXvOOec88X/H/lO/eaeqHTo0f+bODMf+p0G7J1q3atfywVvaPdmy
VcsO1z/wP8f+ccSxx3fHHv977OGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQggh
hBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQggh
hBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQggh
hBBCCCGEEEIIIYSIEf7777+4R1o+BhEsQdvAmX5/3ufo0aPu999/dz/99JM9
Dhw44I4cOeL+/fffeF/3zz//uD///NP99ddfCT6Pn/Ecfh+vSegYfvvtt7hj
OHjwoL1OWkwb8Dn//fff7tChQ+7nn3+Os0NsLD674fn8HNtJCF6P/SVkqwnZ
YEL2nRx4T47rhx9+cF988YV76623XNeuXd0zzzzj+vfv7z766CO3ZcsWO6ZI
HbD27777zs2ZM8ctW7bMnhMNjp21LF261M2bN89eE3kMhw8fdrt27XKff/65
GzFihOvcubN79tln3cCBA91nn33mvv/+eztOcXYSbofLly9348ePdz169DAb
eOmll9ysWbPct99+a9oMt0P0snnzZjdjxgy3evVqs8looL/du3e7+fPnuyVL
lrj9+/ef8P7omOdgp6NGjXLPPfecvf+AAQPcJ5984rZu3RpVB8ldN79z0aJF
7uGHH3a5cuVyl156qcuePbu77LLLXObMme1RuXJl0+aePXuO2ws4H2PHjnV5
8+Z19erVc19//XXU92EP4T3uuOMOV7p0aTdu3Li4n/H72O84h40aNXI5c+Z0
GTNmtPfPkSOHy5Qpkx3XfffdZ1r8448/Umz9InXg7XDx4sWuTZs2Ln/+/Pa5
8/ljB1myZHHZsmVzt912m/vggw/MXjz8vW/fvi59+vRmw/H5Afb4SZMmueuu
u87VrFnTLViw4Lj3x9fhH1q0aGE26HXAMaABjuGmm24ye//xxx9TxCfyvr/+
+qvtLxwTdn/NNde4hx56yA0bNsyNHj3adevWzVWvXt3OQ758+VyXLl3s/T3J
1SDH8Msvv7i3337b3pt1onf2Hn4ve9ETTzzhrr76apc1a1ZXrVo188vxxSQi
9sAGyH+IpdAY9o5O2rVr59544w2zg+eff95VrFjRNIH9YBc+JkquBnl/Yt0p
U6aYrfv3b926tRs+fLj5HvxhlSpVTI8FChQw/7x3795kr501sO/UrVvXfjfa
W7t27XHPwdaJG/v06WN7U9GiRd2rr75qPhuSq0HOy9y5c23tl19+uXvqqafc
zp07j3st70Xs0LhxY9Pogw8+aP5YnB1gGytWrHBNmzY1jd1///0WU4bHe8SX
2Fb79u3NVrEjXgPJ1SD2RXx6++23W7z16KOPuvXr1x/3WnRAHIr28EUlS5Z0
Q4cOjdNBUmB9+/btcy+//LL5OOx748aNUZ+LzyVG7tevn8uTJ4+75ZZbLG+E
5GiQdW3bts10x95DDEA+GA32yffee8+VLVvW1ahRw02dOjXJaxepB+yQOgH7
OnZYu3ZtqytEA3snJ+M5xEyvvfZaXB6TVA3yevbzXr162fuzD5BzRsP7oxde
eMF0EL4PJAXyWPJetHfVVVdZzplQnomGyFPvuusud+WVV7pXXnnF/j1Sgxs2
bDiunusf+NyFCxe6OnXqxGmQc/rxxx9bfFmiRAmLL+KD9aNPtPfuu++ads92
0kJfBjtctWqV+T7ioO7du8dbd/N6IxfBVtjvo2mQ/CqaDWKr5JLs416DvBcx
FnZdpEgR820JnXNvx8SOhQsXtn0gqWD/M2fOdOXLl7c4Fy0kBGtFA8TF5I3k
zZy/cA3iH1kX+0rkY8eOHaafm2++OU6D7EvvvPOOreXWW291X375ZZLXczbB
uebzwT9Qu+M8na1a9PtwrVq1zD8R65wqXoMXX3yx+TFqpNFskFhy5MiRZn9e
g9hvuG+k5pcQ+ILt27e7Tp06WX3iySefjLcOezLCtVO/fn33zTffnPQ17C+v
v/66xeP33HOP5W3+9xBHX3HFFe7OO+90DzzwwAmPZs2aWbxNLO01SMwwZMgQ
d8kll7h7773Xeg9pGXRGvEFtnpiL3IP+0PTp060Oxh58tmkR+2EfLl68uO3P
1CdOFa/B8847z3wZOoxmg/ha6g7UPL0GsUF0ie0SE6Lfk0EN0cfO2HVS7Za1
v/nmm3Y8DRs2tD0isa9Bb7yGnqHXILq86KKL7Peha/9gbf5PnkO8EE2DnJ/I
fk1awfeksSV6o+iO/OD888+3c8q5e+SRRyyfxjfy3LNFi16D5CLs0b7OcCp4
DZ577rlmS+RqnLPIB/+O78IGvQbxK/QAsNsmTZpYvHYyeA0x66m8Jhqsnd5D
7ty5LRZOjP7RDL1z9E8fD5/sNcj6brjhBtMoe7h/EGfw54cffmi9durLXoPU
WVg/fSB6f4mpdQY9u5TSENvwmX711Ve2t5Kr+L2sQoUK1o/iM8JusNOePXta
7Zpe1tnQn8F+Jk6caDZBLkNudjIibcBrkPOGjokbsLvIB7kXPQ5q++EapP9A
LkoslhhfRB9j8ODBcTWcyHmTxEKfm/yUz5UYIL5alMfXj7ABapj0B9iTk1MX
Dc8HOXfk5gnhfQUxMOculuF8sn5iiTFjxpj9ZciQwc4texlzIdTnyMGJ/8nZ
6cuQi1eqVMn2LvbNyJmRWIN8kBoLmqA2MXny5ASfz77D/oMNoAX+Pzl1UfwA
8zjFihVLVJ2Tz433pi7CzAB1EWLTpIAmfG+Q/jf+K6F9Nbx+xb5MfRaSo0Hy
G84/MXq5cuVsT4hv9sDXRQcNGuQaNGhgz41FP+BrxPR6mAvC/xMfMZNRqlQp
9/jjj1u9OlpvjPnBMmXKmA7RI+eS88DvitVc0dsVeVWhQoWs3o6tRIP1oTfs
jdiN2ih6TI4G2QOYjcEHXHvttTYrkpBdcWzEy9RD6JfT20sq2Do+lM+VPA1t
EVtGe39f/2VegN4kcyzElpDc/uCmTZus70rs1bZtW+tZRtqSryujV/wFnxWx
QDS9plY79DkfsQPngXoa5yxdunQ2d0HNAJtIqMaGztinmR8pWLCg6ZYYqmXL
ltZjJp/GplPrOYBony31Jvw+dQZyHPrj0T5b1o9e6SNjs9Qm8WPJ0aDvvT/9
9NNmg5xLdBGfDpiNIS4hfqRPTZ6VHFgTv4Ocg3olvXKOh2PlGPyMOXU69lv6
eJwnagZ+nSkxJ0P8wb/xIDZnnf5aDY6B96JvwewQPoM6bmQfw8d2xKipzSf4
nI88DlvD31144YV2LolD3n///ah2wxpYV/ha/Hwhc1133323aZA6BPE88RE2
mhpzRd9v4dgi5335vOgJ0DtmP2IekjoHz/N2yN+ZIWF2kjjsxhtvtNkySO6c
DL979uzZFu/z/lwrQP85UgfEGxMmTLBcgBoPx8IekBz8jAI+nXjY56Uc55o1
a2w/wk9TI+e42XuIGfiZx2sQDaONU9Wgnz3gPchxsaWOHTva3kC/n5klekbY
G/H39ddfb7WkcH/hr8lg5o08iRibzyXo+iHHxfkhbyPnoweG32MfIZagRhWt
DhV+/QB7Iv4tci38bn7O72U/5tygRXIq+sa8J/YR9F7EcWLj5FDETtQTsYNw
fIxJjwD/jn03b97caivsW+vWrbOZZmbIqP1hI8xO+poAr2WG61Q0yGfhNej9
G7ZDbwN/QlyIxrF16mXUdIgZiVfxl8x18u8pAevnM0YT1AL8jDh7DfGuv24B
jRIzcj4i1+X7O8TI8c27oUHq7vRgqI0y6+Lx+Q5z2pwD3h87RdfsC35+Hj/M
5xSZAxN/ca6I54jPyJlefPFFO9aUvs4kMfhrYLA7Pkf6OORwPucjhqIfG1/u
y/kgRscfsN8zO+JroZGv4bnUdVgvexvniveizsbnQg7NsQRxDvjM2WPw2WgD
/bDP4q8jj4e9mPNFvo+ds6eQ8/IaNIE98P/UjZkrC59pxB6Y80Ib2GhCGpw2
bZr5O+KP8L2A80pMjD9iH2NuFTvE/rBDrwtyMXwE/iGlYd/F75ETU/NgrXym
xKm8J/tQtFqk1xbPodcXX38BW2FvJo/Dh/OacPzsOn6XmBgbolaEnpilZ58j
zoqWL3l/Tn0Lm+WzwhbxD+wtfF7sxafbDv21n+S0rIP8ls+Q+Q2f8/G5n+wa
SK9BzgO2iHbZt8hD8IuRtVD+zvrIFanp8F7UWIl1qZ0TH2BfZ+I66PBeJ7ON
xHUcD7E3tuyvP4vvtayNdfTu3TturpEHdTtyIHwXWgqH1xBL0kMlJotvjtrP
hlPb57rYSH/hz6PPGZg3YT/wczW8P+cyPo2nJOwJ2BIP9qcg9lDeN/wYEgPP
ZX/Cf+C3sUPfR2UvPp12yDGyh1DTpGZMXEC/ij2cOgM5L34ssfg9C1vEDthT
2FuwS3wrcWjkWvw1cNSuyBmIY4jPiO/QM/0nfMbp+kx93kv8Qa+TWvcFF1xg
vo9aGj6GvTKxBG2HQb9/rOJrFsTv3g7xI9ghvho7RCsp9Z0EPufDb3GtG3ke
fo9YhhiC/Cep11n5fZlaOD7V+zfW9Nhjj1neG21uxsdV1NmJIdAA54DYhjiH
uJ+9PCXPAb6Jfib5KT6DvJfYDf9NvYRYRDactvB5NnUsej/eDtmbyTn8nF1S
7cLX+chhyPPpV5GzkIuR8zOPkVI5Q/j3DJBzE2Pi34jVyQOpDbDvRMsViV3p
YZHjeF9KnE/MxrEnZx7c146Yl6R+QpxJ7Mw+Qd7UoUMHi+uSOtMszg78nDv7
PzVV/50E6JKeAPZzKjULX2vAz+Br8U/kOfg+6kmtWrWya1tOx3f/+LoFvpX6
OJrH3qkxMGNBPT+apjhe6stowsfo5KnECcxvRYtrT3YOfN5Lfkt84c8BNUt6
bJ9++qm+c0QcBzbDnsz3YjAviE+k9ku/F3uhNpxQzO9rDeRTxIb4OWYFfK2B
OhY5X3Kuq04s4XMz+EJ8L3rEF1M7I/aNlivi96kVkR/7XJFjp5ZIjQxfm5hz
QCxPbYP6pO91omn6w9S7+bkQ0fC1V2po9EGpl2CH+AZqtdSFo9X/sUt6bdg9
vTdyHOyOejW1BnzQqdQaUmotaIqckNkFrqtmX2FNzDCiqWj7iu+lEj9Tn6dm
RV8RLdFzI36Oliv6c0Atkf4QdUpeR97L36nBJPX6AZH28PPnzN5RP0BLxGf0
NbAvX//335uKbTHbT60auyOvIr+i/sk8RdBzAMSEPif134tHPoafog8Zn6aI
a9lT6Av7uBY9UVdhXX5OxZ8DYnf6uz7vpcaDL+Y9VG8RScHX/9ES30+CttjX
0Rr1f2yLeR36Tvw7+sNnkgOuXLkyVdUa0Ao6QVNVq1Y1nbAe9hjqUswiRct7
iVnx//TNWRvaoq7CXBLxQrRzQCxPzx0fnJrOgYhNwmM6YjhqCvgR6qjs88R4
9PnIm6hJ0qON7BGnJsI1RW/Za4pjp29I/SVyrjv8HFBTYq3+HNCb5Bz4vJcY
nl5raj4HIjbxeRLfeUN9hfiU67Kpo5LzEbcS88UC/vs6mUeh5kSOiKbQEjUp
5t1Za7RckRySuJbZX3ypPwf0M8h7Y+UciNgFu6R3yMwX/g9bjO9amtSO79Uz
N8qsvO+REnczi0V9JVqu6GN05gA4B8Tm9CBj8RyI2MN/7zd9bfpdzMeHX5/i
6/PkVrGSC/lePT1SX3/Bt3HtALM05IqRvTx/Dsj/OAex/r0FInY4mQbJt/CT
5FbMp8UK/loO5t65XpW6KbkidVSu06HXHr6nSIMiKBLSoL/mm/kwbJdrrWKN
8HlaruPg+9vor3Ntfvh9VKRBERQJaZB8iNk2ao7kVeRLsYrPFZlrYR3UnMJn
fKRBERRpRYMnQxoUQSENhpAGRVBIgyGkQREU0mAIaVAEhTQYQhoUQSENhpAG
RVBIgyGkQREU0mAIaVAEhTQYQhoUQSENhpAGRVBIgyGkQREU0mAIaVAEhTQY
QhoUQSENhpAGRVBIgyGkQREU0mAIaVAEhTQYQhoUQSENhpAGRVBIgyGkQREU
0mAIaVAEhTQYQhoUQSENhpAGRVBIgyGkQREU0mAIaVAEhTQYQhoUQZHWNMia
uNdU5D2VpEERFKeiwfbt2wd8tEmHtXCvpd27d9u9dlmX7vkiUgMn0yA2261b
N5clSxa7n2aswfqOHDli9+PlHtcdO3a0+3tyP/ADBw7EPU8aFEFxsvsucf/a
UaNGmd0OGTIk4KNNPP6+idxbifvwcu8o7nPNGgsVKuS6du1qa/NIgyIoIjXY
smVLu1dt+M/xI/iM8PsUpWbI91jDhg0b3IgRI1zNmjVdhgwZ7L68NWrUcIMH
D7ZYNBxpUARFpAa5PyZxWyzeB9rnfNxnd8qUKa5JkyYue/bsLlOmTK5cuXKu
S5cudo/r8HvS+9ft2bPH9ejRQxoUgYCP477R3Iude0dzP/edO3eaPUfaa2qE
Y+R+wXv37nXz5893nTp1ckWKFLE9hT9btWrlFixY4I4ePXrc69Aevn3Xrl2m
2bp161rtqXXr1tKgOKOQ/02YMMEVLVrUYjbuVct9aqlhcN9MbDc1atHnfPjx
FStWmC8vX768S58+vbviiitcvXr13Pjx4+3nka9jTfv27TNtcn/h4sWLm2ZL
lCjh+vbt6w4fPhzQqkRaBJukdoEvqF+/vtkv/qB06dKud+/eZt/YMXlWatEi
x8LesXHjRrunbu3atV3mzJldjhw5XLVq1UyP27ZtO+54vWbx+6tXr3YDBw40
v0/8yf2x69SpY78rvFYjxJkEuyYGHTp0qKtatarVMLDr6tWru5EjR7pNmzaZ
3QeZK/r4keOcOnWq3WM+V65cLmPGjK5s2bIWh65atcrWEvm6Q4cOuc2bN7tx
48ZZ3MnasmXL5ipXruz69OnjtmzZEtCqhDge4rT169db/Z56Pr1B9NioUSM3
efJks390cCZ9Ynj8uGjRIosfCxYsGNdraNGihcXOkbXb8JxvxowZVvfNnTu3
abZMmTLW81y+fPkJmhUiaLB5cqLFixdbjaJw4cJmt/nz53dt27a12seZyBWj
xY+VKlWyvJWYmdiZnO/gwYMnvC5cs507d7Y1eM02a9bMzZkzx+pOQqRm8CPk
R+SKDRs2dHnz5rVcsVSpUq5Xr14W96GP05ErRsaP6I3YkUeVKlVc//79zSdH
y/nQ5Jo1a0yzFStWdOnSpbOcjxh09OjRyvlEzOFzxeHDh1vNI2fOnOaLyBuZ
o6E2gl5SIlf08SN99FmzZllvIU+ePHF1og4dOtjsS/i8p39duGapi/Iar1nq
ndu3b081dSUhkgJ9OOZPunfvbvlU1qxZzcbxkdOmTXM7duywuZqk2Hl4/EgM
TD5KzwAfVqBAAde0aVM3c+bME/oG0TRL/upzPjS7cuXKE/qDQsQqaAV/s2TJ
EtemTRvLr7B38jP+f+HChaecK6Ij+uJr1651gwYNspyPmQHiR/oO+Fr6J5HH
EZ7zMQfj6zT8Sc43d+5cO1YhzkbQzf79+61W2rhxY8sViU/JFbkugVwxMX1F
Py+HjyPn43f4ngGz1vT5wmPcaHWaChUqWJ+PfYCcb8yYMaZNxZ0iLYAvIs8a
NmyYzUTTr6Of4fuK5IoJ9RV5PfUTen34U3I+3zOINl+GX6NXOXbs2Lg+H715
ctN+/fpZPhiL865CJAf8DTkZWuJ6Q2ak8WXUbnxfkVlqnhOpD/zk1q1bbWaV
eWn8YWT8GK3PR5yKZulhcl3gsmXLLBcVIi2DFrluiDlMcsNixYod11ecN2+e
XZeR2FzR53zMY/M7yfno8xF3Mo/dvHlzN3v27OOutxJChPwW+dikSZPMD5Ir
okVyxZ49e1qdkh5dfLlieM5HXjlgwACbx6bX4HM++nzoWQgRP2iMXJFr8GvV
qmWxKf0MckXyR3JFfFh4fOrnscn5qK1QFyWu5VpAepPkfLxOCJF46CuuW7fO
vh+KXJGaDbUb6qkTJ060XJG+H7NjzAJMnz7dcj7vP8n5qNMsXbpUs51CJBE/
g0ovjz461yuiL65X5HsTyeuYu2a2k9481wH62U5+FivfpSFEasfPoFIrbdCg
gcuXL5/1BKnblCxZ0v7ObJrP+cgrhRApDzEls21cr0hfES0Se9LnY7aTPp96
7EKcfujpMYNKP5/vPIs2jy2EEEIIIYQQQgghhBBCCCGEEEIIIYQQZ5r/B8g5
hOY=
"], {{0, 154.}, {225., 0}}, {0, 255},
ColorFunction->RGBColor,
ImageResolution->72],
BoxForm`ImageTag["Byte", ColorSpace -> "RGB", Interleaving -> True],
Selectable->False],
DefaultBaseStyle->"ImageGraphics",
ImageSizeRaw->{225., 154.},
PlotRange->{{0, 225.}, {0, 154.}}]\), \!\(\*
GraphicsBox[
TagBox[RasterBox[CompressedData["
1:eJztnNlTVEcUxq0kD3nMv5AHK0+aKisPqbJSpcYYNSkDgwhIUso6IpuyCIiI
IqKiiIiKiIAiKkgUBCSkBJXNpSRgAEXQAmRXFlcW15P+OlxqmEG8M1zmMkX/
qq7IDDCnv9vndN/Tp/trZ1+N+2ezZs3y/5L9o3EKWujn5xRi8xX7xtbH30Pr
4+a6zCfATevm973z5+xFe3bZsOsLdpFAIBAIBAKBwrx+/Zry8/PJb+NGcrCz
o+CgILrEvu/v61PbNIvi2bNnFBYaSr8sX05RUVGUde4c7Y2Opp8WL6aonTup
s6NDbRMthtTUVPrd0ZGKi4tpeGiI3r59y/tpSUkJaaytKfHoUa63YGJevnxJ
rs7OtG/vXnry5MmY9969e8f7ZdCmTdTQ0KCShZbDw4cPyXblSjp75gzvi/qc
OX2a3Fxc6ObNmypYZ1nU1daSxsqKcnJy6P379wbv5+XmktOaNVRWWqqCdZbF
8+fPaS3TKiU5mV69emXwfmpKCm0KCKDamhoVrLMsPnz4QNu3bSMvT0+6e/fu
mPcw3qzTaili+3Zqb29XyULLorW1lWu2ccMGqqqqosHBQWpqauJjjgsblypv
3x7X/wWGoG82NjbyMfvHRYto7pw59MP8+eTj7U3V1dV8PBfIB/0O/bOrq4s6
Ozv5/Bx+PTAwoLZpFknkjh1Uyubn6Kfw62Q2Hj148EBtsyyS8LAwunr1Kv//
rVu36FhiIjWKObpJCC2VQ2ipHEJL5RBaKofQUjmElsohtFQOoaVyCC2Vg2t5
5QoNDw/T9evXhZaTYHNICFmtWEE/L1lCfzg68rymWOcxDqyfpaWl0Ypff6WE
I0eopbmZ59l/Y7rGxcZSd3e32iZOe5CXRHzUurnRrqgoarh/n6+fIU/05s0b
6ujooINxceTm6kp5Fy9yzQWG3Ge6IW8e4O9PN27c4Gs9+rlzaApta2tr+c/6
+/nRP5WVXGcBUW9PD8WzvubO+mJ2djb19vZ+Mm8OjZETLi4q4r+HPtzM4gC0
nolgbMa6rbOTEx0+dIhaHz3i9RnG6AHN+/r6KO3kSXJeu5Z/7ZuB9UaY4yQf
P041NTVcV1P7lBRLsba2e9cuKiwspMEZto6BPgUNlFpPhKa4J8b2bTWA/2Rl
ZfF4pgvqp9C/mlm/AMeTkuhQfDxf99KltLSUTp44weOaOZBrh9x2KQnGzR0R
EdTW1jbm9ZSUFPL18eFrrcDLy4u+mT2bTjB7decs6enpFBgQQP/euaO4beMh
1w657VKSQDZXwXNHm959xr3zZnajJgCs9/Cgb+fOpWVLl1J5efmov51ic2/U
pd6ZAtvGQ64dctulJMZoGbZlC9mvWsXvedOIj6ihpRw71NJy4YIFvMYUczrp
Qq3uageHMVruj4nh9Wg21tY8NmHerYaWcuyQ2y4lwWfa2dpS9J49vC5XulDf
48Tmdrpaxu7fT48fP+ax397OjvsYYry5tZRjh9x2KQk+M3zrVu4rmMtIF/Jg
qEnT1xLjIubN+B4+hto0NbT8lB1y26UkxsRLqQ2I9xUVFWSj0XCf0bq7q6Ll
RHZM97FHagPAnDz+4EH6bt48XjuthpYT2WEpWr548YJ/RZ0a6ilX29urpiVs
QB5U3w41tBwaGuLPaPrPfIgteE/K7eD/Us4RzxjI6eIZA6+jJtVctZO6dgDU
zJVcu2Zgh9x2qQ3yunh+MNdz40TorsNZIkJL5RBaKofQUjmElsohtFQOoaVy
CC2Vo6ioiJKSkkZzh2qCXCZqkywNvh+nspLvEfX09KT6+nq1TaINvr7k6OBA
11jfxPOMJYB1EzyvIV+1MzKSjiYkTIt+uZX5OM6ZwDP2ltBQunfv3rR5PtRH
qqdycXLi+dburi6+L9mbaYr6Kv0zCswF6juwrx+1CMgPIN+CGgfYibwRch7T
Bd16Kqz1P2hs5HkB+DnyBZcvX6aVGg1ZW1lRQUGB2fYz4rPLysp4rSHWz3Ky
s0f3o2MNHff6yOHDPH9+/vx5vm9dTSaqp+pi9zt6927y0Gqp4NIlXqeBfHZQ
YCBfT0V7pgqcMxG6eTPfM13O9LxSXEzr162jbeHhfP8vbJTqvOqZr+NnsX6L
Nox3LsVUMlE9FfpdxtmzvE4IOUDU+UE3vD/AtP6L9U2clxGzbx+1661JT5an
T5/y+Iy/j3N4epid+Fxc2Mf/Z1YWtyuB/QzWgQB0xViEM2Y8Rvb5415M9f50
aJKbm8v9AutRj3TqqfDZ6APItUZGRPCxWzd/qPs3EDsxV0LMyszImPT5OIgp
6PsuTMMDsbF8PjteLQxew77puAMH/q/hZG2R6hCgN+5FBrMHa2ZJx45NaYzv
7++n/Lw8qqurG1NPNepTbN6BtT7YN9F9leqp4G+YQ8O/Ktjv4W8aC+IF5jsh
wcFUXVXF+9hE9UQGNZws7mA/tVTDia/IsaMOWf+MCiWBHbi3kk64j4nj+JRc
JP9Cf/b08OBrBdgfLse/MP7uYfEYaxF/Fxbyvm2MX+rWcGLcxBkKmLuhjdK9
Nsc5FNDTwKdGxm5TkOJZZmbmqH/1fMS/PhaPTQWfjTPi0k+d4vMmnNmlX6dl
CnLPo8O+ENQ/oG7pUz5lDJJ/YVxC3EUtpTSHkhuPTWKkH7a0tFBMTAw/+0jy
eVPO6DPmPDq0a6p8QIpnuE/+I3Oo7AsXKJjFNrnxeLKfLcUpU8/om27n0fF4
NjKHQl+4wObTxsbjyWKKJtP5PDrYPzTSDnPW+5qqiTiPzhBTNRHn0Rliqibi
PDpDTNVEnEdnyGQ0EefRGWKqJuI8OkMmowk0Rn4BuWc87yKXgfnITOuPuiih
yXTfu6UGQhOBQCAQCAQCgcAy+Q8sueNT
"], {{0, 88.}, {83., 0}}, {0, 255},
ColorFunction->RGBColor,
ImageResolution->72],
BoxForm`ImageTag["Byte", ColorSpace -> "RGB", Interleaving -> True],
Selectable->False],
DefaultBaseStyle->"ImageGraphics",
ImageSize->{46.51171874999994, Automatic},
ImageSizeRaw->{83., 88.},
PlotRange->{{0, 83.}, {0, 88.}}]\), \!\(\*
GraphicsBox[
TagBox[RasterBox[CompressedData["
1:eJztnQeQlNUSheW9hwWlZBAByRJERZCMCBIkJwklIChpCZYEyUmyQgEqIDko
SVDJUJIRlAySg5KDkqMSldCPr6uuNU7N7hJ2mX+GPlXjlLszy/z/7dv39Olz
72Rs2Kpak/888cQTbePd/U+1Bh2Lt2nToHP1xHf/p2bLts2btoxoXK5lu4im
EW0KNvzv3R+WvftYfvfxv7sPMRgMBoPBYDAYDAaDwWAwGAwGg8FgMBgMjzXu
3Lkjt2/f1ueYeJ3B4FUQu1evXpVffvlFli5dKps3b5aLFy8GfO2tW7fk999/
l5UrV8r27dv1fQZDqIE43r9/v3z44Yfy1FNPSdGiRWXu3Ln6c39cvnxZxo8f
L+nTp5e6devKoUOHgvCJDYaHA7G9b98+admypcSJE0eSJUsmTZo0CRjPxPzY
sWMlderUUqtWLTl48GAQPrHB8HDwj/knn3xSXnjhBfnss880xn1hMW8IB7iY
b9WqlcSLF09Spkyp3KVEiRKyePFiuXnz5j+vtZg3hANczLdu3VqSJ08uFStW
lHfffVfjulGjRv+Ka4v58MXff/8t58+fl5MnT8qVK1eC/XFiFb4xnypVKmnX
rp3MmjVL8uXLJ9mzZ5fPP//8H45jMR9+YPzPnTsnq1atkkGDBkmXLl3ku+++
kyNHjsj169eD/fFiBf4x/9FHH+n1du/eXZ599lkpWbKkchxeZzEfPkCjZjy3
bdsmQ4cOVb0ufvz4+siYMaPqGAsWLJATJ078i9+GA/xjvmfPnnLjxg1Zs2aN
VK9eXfl948aNVcexmA99EOvk7wMHDsiUKVPkrbfeUq2OR+7cuaVYsWIa8wkT
JpQcOXJIt27dNBYuXLigvchwQKCYB3/88YdMnTpVcubMKdmyZdN1D64XXcy7
Pq3Be4CzHz9+XObPny8RERHy3HPPSeLEiTXWW7RoIStWrNA1fty4cVKuXDkd
Z2K/YMGCuhbs2LEjLLh+ZDHPz7n+Xr16SZo0aeTNN9/UXtWoUaMCxjyx/uef
f8quXbuUGx4+fFj++uuvYF2WwQeMJfUp49K1a1fN3wkSJNBc1qBBA/n+++91
7JyfhHV+z5498sknn2i8o22wDlSoUEGmTZumaz6vCVVEFvPAcZwaNWpo3Nev
X19/HyjmySFbt25VrYe1kTyyfPlyOX36tOX9IIEYZr1mXAYPHixFihSRp59+
WvN7lSpV5KuvvtK1O9D48F5yOuNP7+all16SRIkSSdq0aaVhw4aycOFC5fqB
+vVeA5/RtyaJKua57kuXLsnXX38tr7zyimTKlEkKFSqkHN8/5vmbeBiofTNk
yKD3J1euXNKnTx/18PB3DI8GjNu1a9d0XCdMmCDly5fX8UiRIoXy9X79+ulY
3YtXkNewRsybN0/H/Pnnn9e/Rd8SjWfdunX6e695Dl3dAt+As7HGnT17Vn8X
Vcy73/M+fs4aFzduXF0XA/F5Xnv06FEZOXKkciHWA/hi8eLFZcyYMbJ3714d
C0PsgHGGTx47dkxmz54t7733no5BkiRJlLO3adNGNm3apK+53xglp1ELTJo0
SSpVqqRrBbFfoEAB1bPhtP59+2ABzsEaBGdr2rSp9lhLlSqlHkoQXcwD7tHa
tWulZs2aGvN40aKqYeFE3ANqgbx582p+eeaZZ/T9jAXzgs9liDkQk2fOnJEf
fvhB2rdvr1zdaS/ojnAReM7D5mNigdyFpoG+ybiSC1lL0IGCyfW5B+Rycjp6
E/eA/EyfCQ8l3mHgfJVt27bV+QAX8YfjOJMnT5YXX3xR80a9evX0+ng/1+gf
w45L8u+jCbAW8j54z/vvv69+ZOP6Dw/uH2ND/h4wYIDmXWKdsaQOY8xOnToV
o9zD+c83btyo8wvemzRpUs391MSLFi3SNeFRcX13D37++WcZOHDgP/eA2qNq
1apat5D33T3g9dQx/Jy6kx5cIDCHyOvUQviIecZLT9zPmTNH8wv31r9/wXUT
2/R269Spo2MB3+E+9e7dW7Zs2aKf12t80OtwcYe+gn5ctmxZ1VbIu3il8Aii
wfO62Lq3juuzhlDXZs6cWceWvNqxY0dZv3699nhjC+4esO58+eWXqq+6e/DG
G29I//795ddff9WYDHQPnK4eXd7l98Qxz/A3+B25m7qe+tX1L/z/Df5d6gK4
funSpZVHkRvg+ujBcFBD9HDcEX44ffp0zSOMMbGGX6RTp06a8x/lXjbGlvw3
ceJEzatofKzp5NtPP/1U5yVaaEzB3QO09BkzZqg/jJzu7kGHDh2Ui1PDxvQ9
4N+Fozgtk/WkcOHCMmzYMOXz/v0LpyfQ74brc0/gglmzZtU6yPfzMa94P2sH
84G1kv1bj3MdQGyxZlKL4YNFE0Z7hDeyRuMRYUyCtWYyNuRVOBb6kC/XJzfC
CR7Ww+N7D6hB0RJ97wG8KjZi3RdcJzUw14mOSb3KdVLb078gt/vXNI7rM18Y
u2rVqim34nr4e3At5in3Cf8P/B/v25AhQ1TzDxVdOKbg+Cr8+eOPP1YNBh2B
9dVx9kBrazDgdP0NGzbomDmuny5dOuU/aCnkr/vNXdwDrhG+5H8P8MhwDx6l
Zup4FddJffzyyy/r2sZ1urlHHPvHKdfBnCT+mResAeyzhee/+uqrmieYx9Tf
PNMXYD7TS2S9DPf613evMlovnBAdAu8f/P2LL77QNdCL98FxfWKcXmaWLFmU
e6AjwT1+/PFH5frRfXY3h+ANw4cP1/WDvO6Ve+B7nXAsaho0XLQetCP6F8zV
QEADo19IToeb0c9q3ry57r+l/oXzo7US8+gD/C5cawDffgp8Fc2A2ox8iReA
e7l79+5IazMvwZfrs/bDgZm38G70TsY8ENd3PJg6/JtvvlGdG/7Aw4v3wF0n
fAWPBtfJHIfr41UiR/tyfebKb7/9pmsW8U4/Cy2IHOd0B9YI9Gf6iqwB5Avq
hlD2fQSC66fg42Ves1YSI2gE5AO039jmq7EBrgtNnDh/7bXXNHbhAvSKmA/8
znF98h/xQ+8XLwtaH7kTPgOH/+mnn3Q+ePEecJ3oSGhGcH3nVapcubLGtDtT
hNjG68e9gP+hOwXi6/yMOoi+GTEAXyQ+wgHOB7Z69WrVv+CHrOFwutq1a8vM
mTOVC3hxnO8HxDW6Ev0g1nJiGQ3vnXfe0ZggH8J7OnfurGMMZ4cXwRvoZ3ql
bokKjo8xlvSm4DnwcvYgomvye/I385/6l/GN6gwR5hFrGusd9QO1QKiDeGed
g7PT20T/Ig7gq3BYdEkvcvYHhetxUufR20S3I/aZ3/AfvHDEOhyW/Dh69Gi9
P16PdX8wZvSFWa/Qavr27at6D+MNZ8MLQsyjL9/LtcVmr+VRg3UanzY+DdZB
xpw6nnqeOR4u1+kPxh4tmvyFdgePY58WfIBaFX4A5wn1sXaf3z0YU9Y6eius
YeiRjxuoSdBm8ew6vko8hPI43w+o/1jbqc/wceFnQJcN13tAvcL1wefQZOjT
Piic9ycU92+yDjL/wzmvRwdf71Y43wOuD8+N03AfNM+7XEEtTA1k/jWDV+G0
GPoU1LbU61H1WIlj6lZqAPRrp3ni/UHzQtdy/jV8duZfM3gNxCMaHD5N9jdQ
x1OjRwa0LjyceFbRKulpuJ8vWbJEvR1oHmgf1EHU/HhAbK+KwUsgXvGjotPR
d6CXFajXRI7Hq8HeZOYHPgv24jtQG6DvsgcADcT17XgdvVzbq2LwCohlfATw
EXq27E9etmyZ5n/Hc4hnYhYNO0+ePKrlsq/T36fnzipye5fxctMDRgejr8ne
SPQx4/qGYIO8Tt8KXwX+Ifq26LPwFeKX2hTtnt4c84L4JafTw4Wz+2s1xDS9
Lnp7aEJuXzr9TXzM7rsljOsbggnilzxMjLK/Hl8lD/pxPNOv4Ywp8jcxS4+L
OTFixIgo96qwtwBeX6ZMGeX6+Dzg+pzVA9e370sxBBPEKByGM9Toz9CLZq8b
z/Rqydv4kuHl+EubNWumHizWBXylO3fuDHjWFhwI7g/Xd+cUoROxruBlCRff
jiF04fq0br+i2+vmv4+XvaD4NfEtuLO2mC+R7VXBt0qf051TBNfHn88+Latv
DaEANEjn04P3EMP4kfGgurO2AnF9eBFaDt50/C1wfOfjdv51+lr0AeA/PFP7
etXDani8QAxSjxLjnGPEXhV3tgtnbeFlwafrDzQh1grmDH0xt08BXxN8p0eP
HuoDwgOK35N9iuzhYA6E69nshtACMYx+Tz+WOHX70uHveBr896r4wu3RpYZm
3ya1LnyfeYPPm2d+xgMfFFqqV87tMhjc2ZnsS4fro/mwr873XF3fM5Odps9e
evzaxDrfNUG/AI83GhFnCrBXhb/H76mr2ctkNYDBK3D7p+EteHnQ5Yl7cj95
nHzu9mkRt3jV+W4JegP4GvC9+fN2eA/179tvv619YJ5ZOwwGLyHQWVvoO+y5
dWd2wmk4V4Q9XJwtwHyIDMQ9fh98PfhAObPIcr3Bi4Drs6+YPeTUo3jZ6Gc5
3wPnh8CB0H+i4unkfvQbfA9o/Hig+X+DwauA6+PjIc7h9M6/j97DvkzOZYgO
1MHUBej71MrseTEYQgVojnAcOA36Dpr9vbyHNQJ+RP+X/fgGQ6jAP37vNebd
PLnX9xgMXgH+BFeTcrbWt99+G+17qGM5qw2fGj1cPJ8GQ6jAdz+u4/PReevx
KLCvBe8nfB491GAIFbi9V/gt0dw5Hy8qHYbX4012Og8+/tj8XgGDITZA74o6
NH/+/LoHi7MDIzs3FI8+38Xy+uuva5+Ls2PNc2YINfh//zP7cvEjo2fC3cnt
PPMa4h0PA74b1gT8lgZDKAKtnrNBOC+fuMdT/8EHH+hZ/nhu+E48zhzErwwH
4sxQ/JoGQyiDfhX+Gc42h+O47yjivFD20rJHke8D4mw5/DkGQzjAeYo585i9
huR2PGfsq0LTIbfH5PeBGQwGg8FgMBgMBoPBYDAYDAZDbOP/R7aOxQ==
"], {{
           0, 72.}, {189., 0}}, {0, 255},
ColorFunction->RGBColor,
ImageResolution->72],
BoxForm`ImageTag["Byte", ColorSpace -> "RGB", Interleaving -> True],
Selectable->False],
DefaultBaseStyle->"ImageGraphics",
ImageSize->Automatic,
ImageSizeRaw->{189., 72.},
PlotRange->{{0, 189.}, {0, 72.}}]\), \!\(\*
GraphicsBox[
TagBox[RasterBox[CompressedData["
1:eJzt3WWUXFWzBmDWd++P+/P7xx/cJbhDcHcI7iQQgiaQBJcEd3d3hwDB3Z3g
7u7uvm+efdfm9tdrMpaZ6dM99a7VkMx0ZvY5vWtX1Vtv1Zl+yIhB2/1riimm
GPU/E/8zaPCey44cOXjv9f498S8bDB+1w7DhQ7dddfjoocOGjlx0yH9N/OKQ
ia+3Jr7+e+IrBQKBQCAQCAQCgUAgEAgEAoFAIBAIBAKBQCAQCLSBv//+O/3+
++/phx9+SF9//XX68ssv01dffZW+//779Ntvv+XvBwKBnsEff/yR7WvChAnp
sssuS0ceeWQaM2ZMOuSQQ9IFF1yQHn300fTZZ59lmwzbCwQmD3zYm2++mc47
77y01VZbpaWWWirNPffcafbZZ08DBgxIiy++eNpoo43Scccdl5599tn0888/
N3rJgUDT4s8//8z2dthhh6WBAwdmW9tggw2yjzvhhBPS4Ycfnrbeeuu04IIL
pnnnnTftuuuu6emnn852GggEuo4vvvginXXWWWmRRRZJCy+8cDr44IPTI488
kj7//PP066+/pu+++y49//zz6bTTTkvLLbdc9n0HHHBAevfdd9Nff/3V6OUH
Ak0FPu6JJ55IgwYNSnPOOWe2t7feeit/vRbyNzZ46qmnprnmmistueSSafz4
8enHH39s0MoDgeYEPvLSSy9N88wzT1phhRXSww8/nDmStsCnPffcc2mzzTZL
c8wxRxo7dmz6+OOP+3jFgUBzg80ceuih2cdtu+22Oa9rD59++mk66qijss1t
ueWWHb4/EAj8J95+++00YsSI7Of233//9NFHH7X7fnU7flFOt8oqq6SXX365
j1YaCLQGXn/99TRs2LDMR+It+bH2oEZwzTXXpNlmmy0ts8wy6cUXX+yjlQYC
rQGx4U477ZT93EEHHdRhfib/u/jii7PNhZ8LBLoOsSSuEhe53XbbpTfeeKNd
jcknn3ySjjjiiBxbqtnhOAOBQOeB67/22mtzXU6seOedd05SY0IbRheGt2Sj
nYlFA4HAfwL//8ILL2Sfhbvcc889c47G7mr9nfrBe++9l0488cSc+6mN33bb
bZXXgLkGZ4V1iou9nDOhGQ00Et9++23WNPNz9F377bdfuv/++9MHH3yQ6+By
PFqv448/Pi299NJZG3bggQem999/P+9bdmtP06tUqfeAremLcIY4Hy655JKc
i6rlP/PMM/naqrTeQP8Bm2FfJ510Ulp++eXTAgsskHUpo0aNyrzK3nvvnePJ
RRddNNvkHnvskfds0VvyG7Riaghq5r/88ktDr4cNWcNLL72UdTN8+Oqrr559
sxfuR23RGfLkk0/msyLsLtDXoPX68MMP05VXXpm233773EfAn4kjcZpsbZNN
NsmaS36DDrOAH1QnZ698pJpfI3WY7O2xxx5Lu+22W163mHmNNdZIO++8c9pl
l13S+uuvn88Vr6FDh6Zbb7017C7QENhzfBabuu6667J9HXvssdn/sUW6THro
n376KcdtBb52yimn5PoBnfRVV12V93Aj4Ox47bXX0ujRo/N66Nn0H9133325
FomX1QdI082XzzLLLNkPPv744/2iT8JZ6Dojn60eSo5Gd1JszOekl+DBBx/M
db2yR33dnt1www3TtNNOm7bZZpvMy9TaZV/hm2++yf21/BsdtvxN3lbrd+01
OewNN9yQ40x+EAerH7dVUT5PObg8QJ9IbawSqCbMapCzbbHFFrmnrraXx/fO
PffcHI+qI5xxxhnZ//X1WVp0NXwcDrbwPG2BjR199NFpxhlnTGuuuWbeh/X9
FK0AZ6L4/6677so9kfy7Xix1nvB11QYOkC3xC4sttliON0sMyfZwFurqfN26
666bz9O+jNfsH3kcnmT++edPV1xxRfbTk4K9aB+qTYqJ5XVVr310BeIMZyHO
WXwtp51pppnSzDPPnHbfffd85oTNVRvshx2JIaebbrrch8DOim+wv9XX8Zvy
JPEanUtffa722O23357PhCWWWCLHux3Ft3ybcx9XdPbZZ+eYs9nh/PNZvPrq
qznOHjx4cI49pplmmnwe0R3JzftD/toKMKMI/yCGxGf6M/8HbOudd97JMd30
00+f+Qs20Fe+wx66/vrr0wwzzJDnudCCdmTvr7zySp7/Yk/ii1xfs8K1ytFo
F+SqeFs+XNyx0EIL5b/fdNNNuS7UijF0q8JnxTfg2H2WePdazg9PL15bccUV
s93hD/uqdmAN6t3iJ/wJHqej38tPb7rppvkMobERizUjxMm4Ij3Haqr8mftf
dLHycHzupPqRA9UG/dTVV1+dczpcBc1zbQxJB61eJ74UZ/ZW7YD9q23gKtm6
/XTPPffkupvf68/tcXPW67zgj/0b+Wlt/icuxdlWWaviHoiHaRFOPvnkPDtq
1llnzfa21lpr5br/U0891VJ5an+E/aefgB5FHMen3XHHHf98rvaqvcwH8oX1
ed/kgu9iC+oVYsnzzz8/62LYHR8s3+S3zCxrjztlr3gWvoDuDf9S/IC14jxd
l2vBN1TJ9qzDPcDT0rSJO8T67jd9Hm2CM6e+ThJoXtjf9iN7Y3fsrzaGFKOZ
lSmPWHbZZXtED22f2ff0MvLEvfba6x8tlzqc855tyMvkZ7gRtUS2VW8rfo45
nTvssEOORf1f7aO8j7+78cYbs/2KPdX88X+uqxF1x/q1iyvcf/ddvMHW5ptv
vqy1cQ65lkavM9DzUPMRV4ohfe61MSTbo/nAXcrp8J2TU4Plf/gsvsjvXG21
1XJdTWzLj957773Zpr3PezbeeOPMX9J60aDYo2yy2CXeTo2KbfILet9r40o+
xDmBXxen2c9q/WwbH+i9fe0/imab33UPaEjdA/ffuYB3rdflBVoL9jcdVYkh
xTe4whJD+uz5PlyGnKs7e9S/KfM1Tz/99Kz5NP+IHay88sp5vjsbYwPFR/ld
7F+dWw3A+vAKF110UX7JNdX1+WA1BXVxXF69VkVchv8zQ9d7/U7vHzlyZPaB
+EH+vrfjTeuSQ7uP6qPsSzxsPezOPfA5OE+qEvsGeg+ldiCXwFuIa3pi5qW9
w2+xWTW/HXfcMde52Tb9tXrEzTffnH1tfZ5oj7IXmlHnAL6czeBKvNS/cSb2
rvhXbjqpOIxN8WsXXnhhrinY62LplVZaKde5HnjggV57hkO5B+ov48aNy/eA
v1VnE1fo99BjjLMK7r//wGeNH9H3gzOzz53JkwP5Clu6++67MxcgH7TP2TWd
y+WXX57j1o5iKLaPWxEP8nPDhw/Psea+++6bzjnnnOwf5Wcd+V973zWJR485
5pjsW6wHP8iH6hcquV5P7f1yD/Ag1ovjoUPADZV7gEMK7r9/Qu4j9mMj4q1a
n2EP8hX2rBhRDNge/8eO/Cx7G9eN35CveD6J2JIN1caRHaH00tEW4vjUqMSR
7LGrsa5r8XNwF+xADZDfFb8OGTIk9/36+ZOT6/kdYmP8vph37bXX/kevhc9h
37iftnihQP9GqRvZ5/gNtS91WXEi/zKpfY+TlJ+U/ElPnmdxPfTQQzmOrQLv
zbeI9/AuZqaxOfGeXI/WQ8yrztCVXK/otdisnFMciwPi26p4DwLVQul/xWOY
V4uzx+fLgfwfp0jfLhep30M4bnnSOuusk/lOuhL2WUXeW64lrtZPITcUa/J7
uJ36XG9SKHotNoqXwdfIi6eeeup87uBr3AP3M3K2wKSAz5ZDrbrqqpmLZ2v4
fByAGQ/8ga/jIGn5azl6vk9sid+XszV6zkNHYDNiZr5bXdA1y/XEw3I9dT3x
sHix3mZq9Vrm16t18mt4Grwq3gZ/E9x/oD3YV/gE+ZfaGT9HV8uO6InlKXQT
+BZ5Ck4DN9nsuUnhStXqzaB3rog38T5yPdfsDCn9v0WvpRdfnsbOSh+fvht8
jRy42e9Lo2AfutfO8yrGSD0JfkksZK/hs+md689pfoHmioaEbdpfrcK/lV56
elSakJLrqW/oWePX+T25Ld+P+y96LfE2Hir0Wt2H+yZWcr6513h0570zrlVj
czmOGh1OW11MjiJnc9aUM9t9kZ/gU7xaUafk7KEPUJOQ6/H5Yk71QeeM84gt
skm6M/ehnvMNdB4lL7av6IfUb8XquGUztXALait8X6vFDrXPkMQp6Jd0pos3
7Sm5XpnpWl6teqbX5nqlridfm3LKKfO9kc/SEziH62f4BjqPkhfTF9Zq4qaa
aqr8KnEEfa46Dx0g+2yl+82u5P+uHVeinmt/4SH1m8jv6Ji8rz+c684h/KXP
2yxe81rwLfZIdzVxgf+7r0UXSBOHD6cLLDohsbq5PaW/CX+g5uve06y20v6z
h9SOxZi4bty5/E68Kabyd7kMjr0/aSlcp9iHDtm53Kr5RW+j9DHR7BVdoDlv
Ygh5M3+mRupe08XrLWF77JD/sw/xwnR/zn522+znnvWXflI1XjyKOi8+nDbf
3ARc+sCBA/OZ05czUwLNi9LLZb+o7dIF4uH4NXmx+KFo4mp5O77Mv2GH8jz1
T/ZJd0vTgGehcWiW+L7YV1kr7kDeRn/ijCnciXsgvhIH8H9qv/RMOAVcXdXr
cIHGwj6yn8SEzmmaOPsHN6XWQhdIE9eeLtAexGeqHavfDBgwIGudcC1ifnop
evEq9SbXomi7+Hf8XMlLnCc0gTRQuCJfr4dr8gwS9WN+3v1q1jkkgd5F6WMS
j+OAN99888wR8G00cWInsWNnNXFFr65OjFswE1LMxXbXW2+9PLPc9wrXXgXU
1j9oKPkrORstYPFxekxcg7PEvapfu5+Bx6TTcP/M7CizwwIBqO1jEhfpCREL
lphQH0vRxHXHNuxBe06MpXcZz8fniVHNbJIL0XCwz0blekWfT/uo1siu8K/O
G/7K1+S11qjerTanR80sLXYnLpCr8nt0J/JX75HT6QstOnn+38+oqn8P9C5K
zlb6mMaOHZu5D5yjmMg5zh7wBD2Rj/hd7BoXw64LF2Nf0jDg2PuqN7kWzhE8
G1+m/mFeAX9Mr6QGYAZQqWs7E2gErRdXiUdyLTRO/KL7JffFXfJx/CSfWebp
i8nV0cWroTPsX5CvyDHEds5qNV7xEp5f3CcW9L2e1sQVHpRenS/AaerpkC/y
J+aL2Psd6dV7AmUugrUUPYW805nDZsTSbfWXsB3PdSs9Zvq7+UQzTJxZYoPS
90J773pB/iqe1lOA23W+NDuHG+gYtX1MehD5MjGeGErMV/gNsVJv1lfKs2PM
uTC/w3611/kG2vwzzzwz+4S29OqTizLLrtQ/6JDxq+4BWyk+t70em8IR+ffO
CdpD/c24WT3b7i2Nb7E3cI7Q9zrb2Kp/O7k954HqouQrzlb7yb5SR7LP9DGJ
j4omri9ruPa02LbkUNbE9uRCdHpyJ7x8T/QN12rW1D/o43FDfh8trtkiZR5w
Z2Np90p9XD6q/uj/bBW/ySbFrMVu/W4cFC5JPqvW6f1RO249yKPENfhr8ZK+
SvtMHOUZuPIV8VUjZ+Haj2p9ciK2xgbU1PneffbZJ+uK2Ep3dWRljlvpA1P/
EEfXz/zoiVhaHFF0qOJn9778TPGD38Wf42BoVaKO0Doosyf0Vvic6bLotdib
WI4uVQ2uKjPMynNQxZQ4CesVh3nJgcRlYlG+o7O+uNQ/+JMy2wr3wZ7Vq8WB
PR1LsyGzGP0e55uesxJjsn2fB200bSqOhq33F31Yq6K+xkQz4kwVz8gj+A3n
cFVnmFkTGyh+GZ+BR5R38st8Q0c6str6R3mGi1yNrdFE+ru6iFi6p5+ZVJ4j
Yt1id3G83LGs1Rkn3xM/izXwKs6RKpx7ga6h5GxmT8iPaCHFZvwam5M/0E74
/Jvh2VzOfrkRHZl9i5dnM/gO+SebaUtH1lb9Ay9a5lfxd7j63tKf+Zl+P25I
7FqeH1n4ErYnR1VrZ5N6p+V5UTtoLtifeLHaGpN9hofHgetjqufRmgX2Il6D
rYjJxGyujR/Bs7Ite9y14fVr6x9qbHyk2NTXcPz8TG9z9GXGuf7Ntp4jYq23
3HJL1rA6E50NzpfwddVHyVdqe2xqa0zszwyYzswWrTKsna3oF621J2cKDsQz
auRN7LLMBOATzWvFT+Ip8Yt9GUuzf3U/+gLrwaf4HNiVFxszP8s61ffwyT0x
IzrQu/AZyXvEKfIDOVuZTeEz7K5eq6oo/ZF8Gx9X/IS4UQwp7vR3dWnxpxmN
tFiNiNuslS5MPu1zKbNRSlxftND4LL5QnZA/r2KOHfh/2H906+pacjZ12TJv
t5XzgzILp9S02Zh9zafgWUq/XqPnAasbsHtnAf7V7OLa2gHuBIfiez6/8syr
QHWBu1O7oiUSx+CduzJ7u5lRagtsS0yJ93f+yKOqMhPAGvlZfBa+RO0Ap1lq
oXya2gHtjZk9NN/qGlVYe6BtlDxHbtAbOqlmQNFNiqOr1BdUgEuWT9K8sDt6
G3xrbe2A/kVOJ0/F81TtGgKBZoPzwEx+uaY6Ya3Wku2p7+hJ4K/9OfxcIDB5
wJfgj9Us9C+pleoJKnFJ6UuvwrOFA4FWAb2nfgm9P+JMeXgjNa6BQKuDL8OP
qBfiWsu8h0Ag0Hvg18SUEyZMyH4v8rZAoDEounS1O3pYdqnGqgbJNqPvIBDo
OeBX9DfQpdNFizvpSs2npSfSF8QvqoP0x1pQINCTYG+0z3oZcZp0ezQ1ehG8
/Fndjv3pmaflDrsLBLoHtqPHUd2OfdGHqSHoATHv2jPe2CKdJk2bZ5KonYs1
+4POKBDoaYgV6UL5NnVyffB65MuczzIDUB+7eLPMkYheu0Cg6+Cn6ETpLPUk
mbWLO2krblRP0Eeh/8B79Wfpow8EAp2HPA5noj4urlQfb69Wh880p4J2TKxp
XlPEl4FA50HnZa6LeNFz+syKaM+G2KNZFHp0cSr68EIjFgh0HvI0/Ii+Of07
+gzaA3s0E1qPHS5TP1DkdIFA52FOhJ5H+ZlZtupz7YHN0Yp5xoN5AOZQhG4s
EOg8cCBmuPBznjnWUX5Gh8LO1AxwnOwvtCmBQOdh7pf6mxlmegw87749G9KP
rCfBvDMz3eR/odUMBDoP9kLP5dkFfJdZZuLNtnwdrsScRDEozmX06NFZjxII
BLoGNqYObmaZubK0J2ypPA+yPDeM1tmzWtUVzAgeN25czOQLBLoBsaTZJ/yX
+ZflWSV0JjSYZpaq4Y0ZMybPdcOfmC9YO0slEAh0DXReeH/aLtyI2hvbUz8w
M5D/UzM3x9NcM9qwqBEEAt1HmRdoRiIfx9Y8L4gG0zOf6ZpHjBiRewrooZvh
OROBQDOAzlKdHK+iJkBzMn78+KxvNs+hv8wsDQQCgUAgEAgEAoFAINB6wGng
Pryajd+wXjW+Zlt3oG9BA4VX9zyQ+pe5kX05m4emXx+pZ+7g8HGOVe9nK89Y
t256TS/3LuyuffwvS0EfNg==
"], {{0, 98.}, {221., 0}}, {0, 255},
ColorFunction->RGBColor,
ImageResolution->72],
BoxForm`ImageTag["Byte", ColorSpace -> "RGB", Interleaving -> True],
Selectable->False],
DefaultBaseStyle->"ImageGraphics",
ImageSize->Automatic,
ImageSizeRaw->{221., 98.},
PlotRange->{{0, 221.}, {0, 98.}}]\), \!\(\*
GraphicsBox[
TagBox[RasterBox[CompressedData["
1:eJzt3Qe8ZlV1Pn7/aoyJyScaiTQpAlIMHQSpglIEBASlE6oDiIB0EQlFAYNB
kJIIOvSiIiUgbQQp0jvSERDpCIEgASkp++d357/wzOt9Zxi4c99z3lnP5/M6
OHdm7jn3nP3stZ71rLU/stVX1t3mne94xzt2e+8f/mfdLb+64q67bvm1z7//
D/9nvZ12+9K2O4374mo77T5u23G7fmKrd/3hN//3D5/3/n/veMe7//BrSSQS
iUQikUgkEolEIpFIJBKJRCKRSCQSiUQikUgkEolEIpFIJBKJRCKRSCQSiUQi
kUgkEolEIpFIJBKJRCKRSCQSrccrr7xSHnrooXLjjTeWxx57rPzP//zPoC8p
kUgMGK+//nr59a9/XU466aSy9dZbl7XXXrvsvPPO5dxzzy3PPPPMoC8vkUgM
AP/93/9dnn766fLTn/607LDDDmWxxRYr0003Xfmbv/mbMtNMM5VPfvKTZf/9
9y/XXHNN+c///M9BX24ikRgD/O///m/53e9+V6677rpywAEHVB748Ic/XOab
b76y6aablj322KPGELPNNluZY445yjrrrFOOOuqoctddd5XXXntt0JefSCSm
El599dVyzz33lKOPPrqsu+66Za655qrcsOaaa5Z/+Zd/Kbfcckt5+OGHy+WX
X17222+/sswyy5Tpp5++/P3f/33ZYostyqmnnloeffTRyjGJRGI48F//9V91
XZ9xxhll3LhxZf755685xHLLLVdjiCuuuKL89re/rVqEvOP3v/995Ymzzz67
fPnLXy4LLrhg5Yklllii7L777uWSSy4pzz///KBvK5FIvA2oQfzHf/xHjQf2
3HPPstRSS9V4AT9Y9+edd1555JFHat7QGxPgCbrD3XffXY4//viyySab1Hhj
lllmKausskr51re+VWsdL7/88oDuLpFIvFVYt7feemv59re/XVZfffU39ISN
NtqonHDCCVVPeOmllyZZx8QZYo/nnnuu3HDDDeXQQw+t3CD2wBXrrbdeOeaY
Y8p9991X/1wikWg35Ahyg5NPPrlsvPHGZZ555qkxw6c//elyyCGH1HUuN7Ce
mzGDGMLf83W/ih8C/hzt4oknnigTJkwoX/3qV8uSSy5Zcw65x7bbbltzkaee
eiq1iUSihbCeaQgXXHBB2XHHHcviiy9e93m/7rXXXnVdP/7445U/etew/29t
0y0/+9nP1tzjzDPPrL/XhFhDXPLggw9WLQMv0C5nnHHGqmV8/etfr7nMCy+8
MJa3nkgk+sDaphNcf/31VWtcYYUVarww77zzlq222qqcfvrp5YEHHqj+yEnl
ErxQ//qv/1oWWmihMsMMM5Sll1668spll11W66FN4CK/d8cdd5Rjjz221kJm
nXXW+pHL/PM//3PNbXzPRCIxGFh/6pVqk3wKc845Z12ja621Vv292267re7l
zVyhH+QXdATr/Qtf+EKZffbZK8+sttpq5bDDDiu33357zTECeEks8uyzz5ar
rrqqHHjggeVTn/pUjSU++tGPVj2TJ5M38818/0QiMTqw3p588smaA2y55ZY1
xqcFqE/wPIrx5Rq9GsOk0FzvV199dTn44IOrZjHzzDNXDcN6p2vSJppxSGgT
ejbkNrvttlv1YopB5DY77bRT9Wi6ntQmEompB+tLvZJfIeqV1iF+2H777atG
aP2KK97qWmyu94suuqj6Ka1z/LPIIotUL7a6KB5pAmeoh9x///3VR7XZZpvV
GgcNRM7DbyXOePHFF0fjR5FIJP5/WLM8S7/85S9rjXHVVVetPgQfuQB/Ap8C
HWK0YvnQIn/1q1+VH/7wh9VX9bGPfazy0fLLL1/22WefcuWVV/5Jb4bvj8Nu
uummcsQRR1SvtjxFfdV/82q71vRqJxJvH9FfKbbfcMMNa6xvjUa98tprr631
SutyasTvoUXiJpqGNR7ctMYaa1RtwtdcZ8B1WP9yCrGOnAen0CbEOqGb8mal
NpFITDmiXnnhhRfWHF5s/6EPfagsvPDCZddddy0XX3xx1SBG8j6ONkKbiPUe
fV14au65567axIknnlh5rFebkOv85je/Kf/2b/9WcyDeTTzBP0GrcB/8V4lE
YvKwpuTo+ivtu9FfqTax+eab131Xji/fGOsZLrHe7fs0iF122aXylb7w0CbM
jcAjTbhO93TnnXfWXIhvi5eT9rnyyitXr/bNN9+cXu1EYhKw9u69997qQ1Cv
/MhHPlLXEE+BXJ6nwDobdEweWqR66CmnnFJ7O+U9NEx5xL777lvznuZ6b3q1
eTXoKO4L99Ex119//Vpb9W82c5VEYlqH9aYeYE82wylicB4luv9bqVeOBVwP
LRJv0R1jbkRTi9Tn0ezNCG1CbqQPNLza6hy82l/60pdqPTRzjkTi/2AvvvTS
S8sGG2xQ1wkNb7vttitnnXVWzd3HQmN4q4j1bhaVesY3vvGNmhO5Dx5OOdFp
p51W+8ub+VDUR3g7w6utPkL31Pf185//vLX3nEiMJdQfxo8fX73N1oiYQa6u
dtiVObGuM7RIcZAZlosuumiNgz7+8Y9XrYLW+u///u8T/b2oj9Af5CW40Syr
4447rjP3nkhMTVgzaofWhj4nvsipVa+c2oi5EXQUPaS0CXEEnWHFFVes8QVt
QswUcJ984PxdYg+6Cw1m0DpLItEGNPnB+lALDIjd1Q3F23oguqDdhRYpLhIX
HHnkkbUnRO6gr4P2am4Er2cAp9Ad9HAkPyQSf8Sk+IH+J3/njeKH6u25bjOa
2oQ+UPnDsssuW7UJsYR4ITgg+SGRGBmT4gd1C/3SCyywQJ3LQM/rGugIMdPy
nHPOqXMifPRjhMaQ/JBIjIzJ8YO4gW6p9qcfoqsIbYKGiefERoHkh0RiZEwr
/BAQM/Tqr8kPicTImNb4YSQkPyQSIyP5IfkhkeiH5Ifkh0SiH5Ifkh8SiX5I
fkh+SCT6Ifkh+SGR6Ifkh+SHRKIfkh+SHxKJfkh+SH5IJPoh+SH5IZHoh+SH
5IdEoh+SH5IfEol+SH5Ifkgk+iH5IfkhkeiH5Ifkh0SiH5Ifkh8SiX5Ifkh+
SCT6Ifkh+SGR6Ifkh+SHRKIfkh+SHxKJfkh+SH5IJPoh+SH5IZHoh+SH5IdE
oh+SH5IfEol+SH5Ifkgk+mFa4Qdnej/22GPl/PPPLz/5yU/queSB5IdEYmS8
mfN5559//rL99tuXBx98cIBX+tbgPL1nn3228oIzhpdZZpmy1lprlfPOOy/P
500kJoNJ8cPLL79crrnmmnLYYYeVs846q7zwwgsDvNIpg7XvDN4rrriinte9
3HLLlRlnnLHGQrvttlu59dZb3ziDM/khkRgZk+IHawxHPP300+X555+f6Ezb
tsI1uubbbrutfPvb3y6rrrpqmWWWWcqss85a1l577XLssceWu+++u/6ZQPJD
IjEygh/mm2++8vGPf7ycfPLJ5bXXXnvj69ZbfNqO119/veZAxx13XNlggw3K
XHPNVWaYYYaywgorlIMPPrjGQu4X78X9+PW5554rp556alliiSWSHxKJBsTg
OGHRRRct0003XVljjTXK8ccfXx555JHOrBHX+dRTT5Vzzz23bLfddmWBBRYo
008/fVlkkUXK7rvvXi666KLyxBNPVP5o8hwevP/++ys/rrbaavXvLLjggvXn
0QU+TCSmNqwR8bb9dbHFFisf+MAH6hpRr5Br2G/bulZcF03kF7/4Rdlrr73K
UkstVeMFccOWW25ZfvSjH9V44pVXXpnoHvCJnOmcc84pW221VY2dcKP4wc/h
vvvuG+BdJRLtgXXz6quvlt/85je17rfNNtuUeeaZp/zt3/5tXW9f+9rX6vr7
/e9/P+hLnQjW/C9/+cuqMayyyirlwx/+cNUf1SZoDHfccUd58cUX36hRgHv9
3e9+V6688sqy5557liWXXLLGDO5X3HH22WfXuEmckUgk/gjriE4nlvj+979f
tTx7sTX3mc98phxxxBHl3nvvHfja8f0feuihqjGst956Zc455yx/93d/V+sT
9v7rrruu6glNjQHwG83yn/7pn8rKK69c+WS22WYr66+/fs2n7rnnnvLSSy9N
xCeJRGJiiL2tL1reQQcdVGsaH/zgB8tHP/rRsummm9bc/PHHHx/zdRQag5xH
jMOTISdYaKGFqsYwYcKE8uSTT9Y/1+SF4JPx48eXz3/+82WOOeaovIcj1G1v
uummqsH08kkikRgZ1gldwnqj7fELLLzwwlWb8OuOO+5YLrjggrquxuJaIif4
6le/Wj7xiU+8kROMGzeunHHGGXX9u97m+rbeaQxyhq233rrWb/EczdK/8/Of
/7z6v/gqkxcSiSmHdSMut/5OP/30Gj/wEdi3xRUHHnhgufnmm6sWMDVAF6Ej
0Bjs9zPPPHPNC8QB/XIC10x34IvaY489Kp/IP2iWtEh8wlvt305eSCTePqw/
e7j8/aijjqo10A996EOVKz73uc9Vv8ADDzxQ9+LRgJxArwedMXIC3+/Tn/50
zQluvPHGWrfozQms+TvvvLMceuihlU9mmmmmyifrrLNO1VTuuuuuqrGkxpBI
jD6sf/VOsT7PsnqgOof6oFj/zDPPrBrBW11//n3ahj1efdK/KydYfPHFy957
710uvfTS8swzz/yJxuDvqb+cdNJJVW8MzZIvSn/Z9ddfX/2fqTEkElMXzXoo
/wCfxLzzzlvXo1he3fCyyy6r8cabhXVLE7X+aR18nOIF/y4tEu/4fr05QfRe
8UarT/JtuA5eL9dx8cUXV75KjSGRGFtYm3J/Nc8TTjih+pnDg7DSSivVfZs/
wZqeHHDDaaedVn0MNAZ5i7xCDZPGICfoXd++txqLuEJPZmiWtMjQGHp9UYlE
Ymwh1qcF0AR4C1ZcccWqX9IMNtpoo8odDz/8cF+vtvUrp/jGN75Ra6g0BvrB
DTfcMGLdMTQGOgRfBj5Rr/zsZz9b9YrQGJIXEon2gKYolv/Zz35W64difNpB
eLX1RtAORoJYQB3kxz/+ca07qEuO5GPAMyeeeGKNVWgMeGjZZZetdZSrrrpq
RF9UIpFoB6xLMb11zKv9xS9+sdYVaQlLL710+cd//MeaE+CD3r+HD3BAr1YQ
vRI0BjyDb/CCX3fZZZfqw+DT6O29SiQS7URoE/IAvZFrrrnmG7VG/33kkUdW
H3ezj3wkmNGg9rDPPvtUjUEeIW7YbLPNqh9D7xV/RvJCItE9iAXUF66++uqa
A+iRUF+Ye+65yz/8wz+UU045pTz66KN/Ug/FG3oojz766Kor0CvxC98FrwXd
kwcqeSGR6DasYbG/+Qu82rvuuusbOQKv81e+8pXaO0GLxCf4Qi2DV1M9Qm4i
dqBf0hh657skEonuI7QJPks9XhtvvHGd1SRnUPOQQ8hF9HbgDfyhB2vnnXeu
GoMaR2oMicRwI+qht9xyS/nud79ba5Q4Iua6+HX22WcvG264Yc0/8En6GBKJ
aQtyCfVOXkv1UHOl/+qv/qosv/zy5fDDD6/8wYOZvJBITJuInIO3Sr7BA4kr
5BK9/odEIjHtAQeoU/I28EJ+61vfmmj2fCLRFtDF1c3kvOYKteXDW8gvoFY4
bHtqkx/4I3i0h5Ef1GzVcG6//faBv0/Nj559Mzen1gyPYQFuME+Itm6mgT6A
Nn3Mg9x///3rvJRhwrTAD7xiZvyq7epBG/S71Py4Hr35zhuckp7baQ18eJdc
ckk9B+Hd7353eec739mqz3ve857a7+w8TLMMhgXDzg90FPGoOeA8XoN+j3o/
73rXu2otmV+NVzVn5/wpPEP9wPoD5MDmGOy3337Vv9eGj/lqeiHV/fAXHhuW
PGPY+UEtV9+ZeXz8HltssUX1lw/6nfJxHeZkOPfELB59s3rsEhOD5mAGiR4i
NXhzm9Xe5GRt+OhL5BFaffXVq9fYvObm+fRdxjDzAz8XvcG94QYcL89wf4N+
p3xch/ka3if9+HKNCy+8cOBnGLQJdCPPcNttt63eHGcq0G3atD+L+R577LEa
R+hdcG7ND3/4w1Gb9zhIDCs/uC96ln4Rve4+ZmC2LcenjfC986mJT+VBw7L3
vF14hvz85orI7fUFfO9732vl+4nvr7322rLJJpvUdURT0h/ZdQwrP3heYgXz
MJ0f7v7MrmnTvgOux97j586jJoZ2LuEwPIO3C8/QPALPkDfHmjM3OUCXwPf2
gTczJ220IL4z80CvUnxfz1HOYw9yBoUzaPUrdP05DiM/iPc8O3mqfccMG2su
zih0z+7Ru2bWLz1prD78qrypzVq5d8zew9MufzXzU49s27hsLOEZqkebkyiu
8gzNLYzcS+wuzhIf6jH0sx0L+P56m+US1ox5TPGcfM0etNNOO9U96Qtf+EJ9
rl3GMPKD+Xbnn39+1ZLl9WZ9m6/b/Lr3yUxdvan08LH6iA/WXXfdOtvTvhcQ
RztTxExyZ5Idc8wxQ1Unm1J4B+Vdce4KjSa0W++snw0vhFnK5qv777EADrj1
1lvrfHf1MNzUzAfFM2eddVbtU/Du8UR0WXMeNn4Qc95///31bB6aJI7wnjW1
Inq4eVjmWfz1X/91+fM///Mx+/zlX/5lPUtMndzcreZ123s8BzGEM0PM+us3
N3SY4Z69k7QY85Y9Q+etRe2XZmmNmoVs/oDasJ/dWMA1eG7Ow5UP4ifcFO+X
a3ee1b777lvfPx4Xe1VXY8Fh4wdzLMy2Eo+qGVqHzX0aPEucbv2ZdeH5jtXn
1FNPrV4oc8d7PZP2HucRqsWa0yU/EmNPa1CTpv9be+oBnmHoyqE7OyfK19SG
xV2Tm4U2mvC9eKrxk9jGPDV8FbB+nBXBU4nrzWG0xrqIYeIHz01ub+6ms8Ht
K3rPRvIcuW8fXxvEZ6T9xO+LVeVD6vyf+tSnKpeM5bs/aMQztPZokmrS6psB
GhJe57FW79xhhx3qvNWxhGfH82D+wZJLLlk1Lt6VOA83tBMz3u1RckZnR42l
hjpaGBZ+cB9iAnGfOph9RY4vl+gSvP/2npjr57yipmY/zPAM6bY0R7GBmrRn
GHGWdadX5Zvf/GbV/8RZzpMfRA4mBhUD0kVci1zVcwvgAn5YsYUYg17R5Lmu
YFj4wbq6/PLLq/ZnXdGQ6cxdy/tcr73HfkSnsPfISXpnjg8joibtGdIV6MdN
7Y+uLJ5ybqNnbOZZb+44lrD3uB4ag+sR94WmFFwn97FXeZb2rq7tV8PAD/YV
9YkDDjig7ju0Y37cLsZzIMa295jJ4b1z7rmYu2tcNyUID+LBBx88UWwQul/s
13J5dQP67XXXXTfQn4m4xXtHJ6IXmf+sdhHX5Jr5Y8WAoTmrp3cJw8APONn5
P2ph8na9O2bcdBnqd/JbNVFzQPVrqIEOK5qxgfdwr732euM8p8j37cVyR3UD
vStteE/Frda8mIceYh2pnwXivsQY6hlini69m13nBxxtNoc5unw08nbehq7X
BV2//VJ93Xvl3BI1vq7f10jwDGksatJxVqM11/SPOe+R3ux8Bj12fjZtgLiH
7sUryTNptjPNJDRlz4t+av47P4S4SI2qK8+x6/wQPhk9MXI8mrG8bxgQe486
Bo7gteELHSZ4/+j+YiV6JL3f/FP3Hl+3/vyeGJ4e48+2qYcNF5jzo4eMFqnm
om4WaGpjcicz4ruiOXeZH+wrnoP9RH5HI1aHHpY8PXQVvEBXiZy8q7rKSLC2
PEM6f3idxIOBqOfoneaVEk+1jSOD43g2+GOdc21OY+SD4amSE9nDoi4Tfv82
o6v84Lr9zL/zne/UnzkvjRpzF659SkDTt/eo90ddhuY1DAivk3XTjA16Y3Ne
RJqkOIoXsY2xeeRIPJ+8N6uuumq5+OKL3/i6e9KXTmt2L86f4rFqO7rKD66R
zsCj5nnQiLvqUZsUIr52Von81t7jDPWu1clGgv2TpkJv4IUSdzdjA/coV+eF
lV+pT7VZo1WDNr8DN8gjzP6J+2l6qmjO8ih7W3iq2oou8oN4jcedFmzfsa+I
u9uUk44mws8rj8KFYvG2zUiZUkSPLd0uYgMzmCI2iF5InEGz1AvZ9nuOe9Jz
Kh+Ua5hd1qx3ijHwhlgwNOc2o4v8EH0K3in1TH66Zq/TMCL6SrxzoeEN0hv0
dkF/9Az1z1orZkvaXyHmwvzgBz+otUxxE09lF/J1+aDzsJszK5q9Y80YQ71N
HZcntK3oGj/YV8wQ57v3XsnL5efDPtfV/dl75FHqZHoazR5oYy4+OYTXSU1a
vVKOaE5C7zyMDTbYoH5dL4YZw11AaCr68/Xp82v476ZHXIxBv7S34UcxRltj
367xgz3mhBNOqHok35A+vmllTkLsPeZUyjP23nvvMe9NeruI2Q00FGund85S
9DaZv2J/lat3bZajtW62nNyIDiGWaM6Jac42Uw/lI2+r5twlfoh9hfbr3TFn
adjOI5kUPCt7j3xqnnnmqXvPT37yk06dreMZ8kXzCITXqek39O5NmDChciBd
QrzUxR53PeqeDW2VRuaZNf2g5tPJmXCkeXRmbLaxx6Yr/OA6+VJdn7Whr1bf
Uhdy0tGE9XXVVVeVz3/+8zW/irmaXUCsCx42/M7TZu5fxNYxW0VcxOvAL48r
upg7xr3QVdReVlpppaq/BsJTZY8TQ5gtqOembegKP+BW74pamHfLOcJdi61H
C/JbMbn81t4zfvz4ul+1Hc3YwPrXa9XUldUzzZiUU4THqO31v0nB/dKI9H7j
APeLMyA8VfhRX5c9T90jNNq2oAv8gItpczgBF9tX+GS6lJOOJuJcD3uPn4c+
T3tPm2t/4XWyn6pX9sYGzXMu6P72U/trl+HexLx8DmbI0MzEvKEpxxw9feH2
PGccmIXYpnipC/xgb7Sv6NuOfaXLtb3RgBqv/NbPxN5Dz2tzjVds4HrlhdYC
30PEBjErwdkW7kVMZE58l3SVfgi9xbkYeFEPgNnkgYipxBhx9lab4uK284O9
0j5C46XZd2GvHAvYYzw39XP6l/3Y3tPGOll4ndSknTWqT6npdWrOhRE7iCGG
5YygOBdDDsj7qhdAr37oZs2zt3Cj3EqM0ZYem7bzA5+MvcTPVr5N543evmkd
kd+KS3EnX17bPObhK7Y+rA2xAd9T0w8Qc2Hsn/R+s3266Ovoh5gT4/x494gH
acyBOAfI3ken4PeI808G/Yl9KPhB7O69G/R1+eBYP8c4G/nLX/7yn9SJp7U4
onm/UdPxzHgq49zHNvG7d98ztCbUM80ObsYGrpWuHzPa1C6GMXeUX51zzjl1
/o13mQ7TnEUXMYazPOTQtCV9a2346I/D2+9///vLWmutVfWUQV+TDz2X16Hp
M2ueZ8ZnI041O6TLOvebgfum0Vpr1ldoWOEJiXME+Xmb+e2g4bnwD4pv4hyb
[Code input omitted: it contained several molecule structures embedded as raster image data (not reproduced here), together with one retrieved via CloudGet["https://wolfr.am/KR7FBcsM"], with MoleculeEquivalentQ used as the comparison function.]
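As a minimal illustrative sketch (the SMILES strings here are stand-ins, not the molecules from the example above), MoleculeEquivalentQ tests whether two specifications describe the same chemical structure:

MoleculeEquivalentQ[Molecule["CCO"], Molecule["OCC"]]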

Something else that’s new in 12.1—and a first sign of something big to come—is the ability to import data about molecular orbitals:

Import["ExampleData/Pyridinecarbonitrile_MO_25_29.cub", "Graphics3D"]

Making the Data Repository Easy

We launched the Wolfram Function Repository in June 2019, and there are already 1146 functions published in it. One of the innovations in the Function Repository is a very streamlined process for submitting new functions, applicable both to the public Function Repository and to individual deployment, whether on a single machine or in the cloud.

In Version 12.1 we’re introducing a new, streamlined submission mechanism for the Data Repository. File > New > Repository Item > Data Repository Item gives you:

Data Resource Definition Notebook

Then if you’ve got, say, a Dataset, you just insert it in the notebook, then add examples (using the Insert ResourceObject button to insert references to the object you’re creating). When you’re done, press Deploy, and you can deploy locally, or privately or publicly to the cloud. Lots of checking just happens automatically (and if there’s something wrong you’ll usually get a suggestion about how to fix it).

My goal was to make it so that a simple Data Repository entry could be created in just a few minutes, and I think we’ve streamlined and automated things to the point where that’s now possible.

External Connectivity

We want the Wolfram Language to provide a consistent computational representation of as much as possible. And that means that in addition to things like the molecules we just discussed, we want our language to be able to represent—and seamlessly interact with—all the other kinds of computational systems that exist in the world, whether they be programs, languages, databases or whatever. The list of kinds of things we can deal with is very long—but in Version 12.1 we’ve made some significant additions.

The Wolfram Language has been able to call programs in other languages through what’s now WSTP since 1989, but in recent years we’ve been working to make it ever easier and more automatic to do this. And for example in Version 11.2 we introduced ExternalEvaluate, which provides a high-level way to directly evaluate code in external languages, and, whenever possible, to get results back in a symbolic form that can be seamlessly used in the Wolfram Language.
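For instance, here's a minimal sketch of what that looks like (assuming a Python installation is available on your machine); the result comes back as an ordinary Wolfram Language integer:

ExternalEvaluate["Python", "2**100"]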

In Version 12.1 we’ve added Julia, Ruby and R to our collection of external languages. There are all sorts of practical issues, of course. We have to make sure that an appropriate installation exists on a user’s computer, and that the data types used in programs can be meaningfully converted to and from Wolfram Language expressions.

But in the end it’s very convenient. In a notebook, just type > at the beginning of a line, select your language, enter your code, and evaluate:

> [1,2,3+3] 

But this doesn’t only work interactively. It’s also very convenient programmatically. For example, you can create a function in the external language that is then represented symbolically in the Wolfram Language as an ExternalFunction object, and that, when called, runs code in the external language:

> def square(x)
  x * x
end
% /@ {45, 135, 678, 34}
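Here's a rough programmatic sketch of the same idea using Python instead (assuming Python is set up for ExternalEvaluate); ExternalFunction wraps a piece of external code as something you can call like any other Wolfram Language function:

square = ExternalFunction["Python", "lambda x: x**2"];
square /@ {45, 135, 678, 34}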

For each different language, however, one’s dealing with a whole new world of structures. But since we have built-in support for ZeroMQ (as well as having a connection to Jupyter available), we at least have the plumbing to deal with a very wide range of languages.

But particularly for languages like Python where we have full built-in connectivity, one of the significant things that becomes possible is to have functions that work just like standard Wolfram Language functions but are fully or partly implemented in a completely different language. Of course, to have this work seamlessly requires quite a bit of system support, for automated installation, sandboxing, etc. And for example, one of the things that’s coming is the ability to call functions containing Python code even in the Wolfram Cloud.

In addition to external languages, another area of expansion is external storage systems of all kinds. We’ve already got extensive support for the Bitcoin and Ethereum blockchains; in Version 12.1 we added support for the ARK blockchain. In addition, we introduced support for two external file storage systems: IPFS and Dropbox.

This is all it takes to put a Spikey into the globally accessible IPFS file system:

ExternalStoragePut[CloudGet["https://wolfr.am/L9rTMCn6"], 
 ExternalStorageBase -> "IPFS"]

Here’s the content identifier:

%["CID"]

And now we can get our Spikey back:

ExternalStorageGet["QmcNotbm3RZLv7caaPasU8LiHjNCCuuEemrCgbLAsN49TZ"]

You can do the same kind of thing with Dropbox, and after authentication (either through a browser, or through our new SystemCredential mechanism, discussed below) you can put expressions or upload files, and they’ll immediately show up in your Dropbox filesystem.
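As a sketch, once you've authenticated, the call mirrors the IPFS example above, just with a different ExternalStorageBase:

ExternalStoragePut[Range[100], ExternalStorageBase -> "Dropbox"]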

Given the framework that’s been introduced in 12.1, we’re now in a position to add connections to other external file storage systems, and those will be coming in future versions.

In addition to plain files, we also have a sophisticated framework for dealing with relational databases. Much of this was introduced in Version 12.0, but there are some additions in 12.1. For example, you can now also connect directly to Oracle databases. In addition, there are new functions for representing relational set operations: UnionedEntityClass, IntersectedEntityClass, ComplementedEntityClass.

And, of course, these work not only on external databases but also on our own built-in knowledgebase:

SortedEntityClass[
   UnionedEntityClass[
    EntityClass["Country", "EuropeanUnion"],
    EntityClass["AdministrativeDivision", "AllUSStatesPlusDC"]],
   "Population" -> "Descending"][{"Name", "Population"}] // Dataset

We’ve been very active over the years in supporting as many file formats as possible. In Version 12.1 we’ve added the popular new HEIF image format. We’ve also updated our DICOM importer, so you can take those CT scans and MRIs and immediately start analyzing them with Image3D and our 3D image processing.

Like, this is part of our director of R&D’s knee:

knee = Import["knee_mr/DICOMDIR", {"Image3D", 1, 1, 1}];
Image3D[knee,
   ColorFunction -> (Blend[{
       {0., RGBColor[0.05635, 0.081, 0.07687, 0.]},
       {0.0777045, RGBColor[0.702347, 0.222888, 0.0171385, 0.0230167]},
       {0.3, RGBColor[1., 0.6036, 0., 0.303215]},
       {0.66, RGBColor[1., 0.9658, 0.4926, 0.661561]},
       {1., RGBColor[1., 0.6436, 0.03622, 1.]}}, #] &)] // ImageAdjust // Blur[#, 3] &

In Version 11.3, we added MailServerConnect for connecting directly to mail servers. In 12.1, we’ve added a layer of caching, as well as a variety of new capabilities that make the Wolfram Language uniquely powerful for programmatic mail processing. We’ve also upgraded our capabilities for importing mail messages from EML and MBOX files, in particular adding more control over attachments.

Yet another new feature of 12.1 is stronger support for ZIP and TAR files, both their creation through CreateArchive and their extraction through ExtractArchive, so you can now routinely handle tens of thousands of files and gigabytes of data.
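For example, here's a minimal sketch (the file and directory names are hypothetical):

CreateArchive[{"report.csv", "plots.pdf"}, "bundle.zip"] (* pack two files into a ZIP archive *)
ExtractArchive["bundle.zip", "unpacked"] (* extract everything into the directory "unpacked" *)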

Whenever one is connecting to external sites or services, there are often issues of authentication. We’d had some nice symbolic ways to represent authentication for some time, like SecuredAuthenticationKey (that stores OAuth credentials). But for example in cases where you’ve got to give a username and password, there’s always been the issue of where you store those. In the end, you want to give them as part of the value for an Authentication option. But you don’t want to have them lying around in plaintext.

And in Version 12.1 there’s a nice solution to this: SystemCredential. SystemCredential ties into your system keychain—the encrypted storage that’s provided by your operating system, and secured by the login on your computer.

It’s as easy as this to store things in your system keychain:

SystemCredential["secret"] = "it's a secret"
SystemCredential["secret"]
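And, as a sketch of how this plugs into authentication (the URL and username here are hypothetical, and this assumes the retrieved credential can be passed along directly), you can supply the stored value as part of an Authentication option without ever putting the password in plaintext in your code:

URLRead["https://example.com/api",
 Authentication -> <|"Username" -> "my-login",
   "Password" -> SystemCredential["secret"]|>]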

Paclets for All

Length[PacletFind[]]

In my Wolfram Language system right now, I have 467 paclets installed. What is a paclet? It’s a modular package of code and other resources that gets installed into a Wolfram Language system to deliver pretty much any kind of functionality.

We first invented paclets in 2006, and we’ve been increasingly using them to do incremental distribution of a great many pieces of Wolfram Language functionality. Paclets are versioned, and can be set to automatically update themselves. Up until now, paclets have basically been something that (at least officially) only we create and distribute, from our central paclet server.

But as of Version 12.1, we’re opening up our paclet system so anyone can use it, and we’re making it a fully documented and supported part of the Wolfram Language. Ultimately, a paclet is a file structure that contains assets or resources of various kinds, together with a special PacletInfo.wl file that defines how the paclet should integrate itself into a Wolfram Language system.
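As a rough sketch (the name and layout here are purely illustrative), a PacletInfo.wl file is itself just a PacletObject expression, for example:

PacletObject[<|
  "Name" -> "ExamplePaclet", (* hypothetical paclet *)
  "Version" -> "0.1.0",
  "Extensions" -> {
    {"Kernel", "Root" -> "Kernel", "Context" -> "ExamplePaclet`"},
    {"Documentation", "Language" -> "English"}}|>]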

A paclet can set up code to execute at startup time. It can set up symbols whose definitions will be autoloaded. It can install documentation. It can put items into menus. And in general it can set up assets to be used in almost any part of the fairly complex structure that is a deployed Wolfram Language system.

Typically a paclet is distributed as a single archive file, and there are many ways someone can get such a paclet file. We maintain a central paclet server that’s used by the Wolfram Language system to get automatic downloads. But in the near future, we’re also going to have a full paclet repository through which users will be able to distribute paclets. (We’re also going to make it possible for Wolfram Enterprise Private Clouds to have their own paclet repositories.)

I might mention how I see the Wolfram Paclet Repository relating to our Wolfram Function Repository. The Function Repository is built to extend the functionality of the Wolfram Language one function at a time, always maintaining the overall structure and consistency of the language. The Paclet Repository will let people distribute complete environments that may serve some particular purpose, but may not maintain the structure and consistency of the overall language. Paclets will be extremely useful, and we intend them to be as freely used as possible. So whereas the Wolfram Function Repository has a curation process to ensure a certain level of design consistency, we plan to make the Wolfram Paclet Repository basically completely open.

The goal is to allow a rich ecosystem of user-contributed paclets to develop. The Paclet Repository will serve as a smooth distribution channel for Wolfram Language material. (And by the way, I think it will be quite common for functions in the Function Repository to actually be based on code that’s distributed through the Paclet Repository—with the Function Repository serving as a streamlined and structured presentation mechanism for the functions.)

In Version 12.1 there are a variety of functions for creating and managing paclets. In the Wolfram Language, PacletObject is the symbolic representation of a paclet. Here’s a trivial example of what a paclet might look like:

PacletObject[<|"Name" -> "TrivialPaclet", "Version" -> "1.0", 
  "Extensions" -> {{"Kernel", "Context" -> "TrivialPackage`"}}|>]

This can then be packaged into a .paclet file using CreatePacletArchive—and this file can then be distributed just as a file, or can be put on a paclet server. Once someone has the file, it’s just a question of using PacletInstall, and the paclet will come to life, inserting the necessary hooks into a Wolfram Language system so that its contents are appropriately used or exposed.
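As a minimal sketch of that workflow (assuming "TrivialPaclet" is a directory containing a PacletInfo.wl like the one above):

archive = CreatePacletArchive["TrivialPaclet"] (* returns the path of the created .paclet file *)
PacletInstall[archive]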

And Even More…

Well, that’s already a lot. But there’s even more too. FaceAlign. FindImageText. Around in Classify and ListPlot3D. SubsetCount. GenerateFileSignature. RSA digital signatures. Much faster MailSearch. Initialization tied to notebooks or cells. And so on. Here’s the whole list. Check it out!

Summary of New Features in 12.1


How We Got Here: The Backstory of the Wolfram Physics Project


The Wolfram Physics Project

“Someday…”

I’ve been saying it for decades: “Someday I’m going to mount a serious effort to find the fundamental theory of physics.” Well, I’m thrilled that today “someday” has come, and we’re launching the Wolfram Physics Project. And getting ready to launch this project over the past few months might be the single most intellectually exciting time I’ve ever had. So many things I’d wondered about for so long getting solved. So many exciting moments of “Surely it can’t be that simple?” And the dawning realization, “Oh my gosh, it’s actually going to work!”

Physics was my first great intellectual passion. And I got started young, publishing my first paper when I was 15. I was lucky enough to be involved in physics in one of its golden ages, in the late 1970s. Not that I was trying to find a fundamental theory of physics back then. Like essentially all physicists, I spent my time on the hard work of figuring out the consequences of the theories we already had.

But doing that got me progressively more involved with computers. And then I realized: computation is its own paradigm. There’s a whole way of thinking about the world using the idea of computation. And it’s very powerful, and fundamental. Maybe even more fundamental than physics can ever be. And so it was that I left physics, and began to explore the computational universe: in a sense the universe of all possible universes.

This essay is also in:
SoundCloud »

That was forty years ago, and much has happened since then. My science led me to develop technology. The technology led me to more science. I did big science projects. I did big technology projects. And between the science and the technology, I felt like I was gradually building a tower that let me see and do more and more.

I never forgot physics, though. And as I studied the computational universe I couldn’t help wondering whether maybe somewhere, out in this abstract computational world, might be our physical universe, just waiting to be discovered. Thirty years ago I had my first idea about how this might work. And over the decade that followed I figured out quite a bit—found some encouraging signs—and eventually started to tell the world about it.

I kept on thinking about really pushing it further. I’d talk about it when I could, sometimes in very public venues. But I was off doing other, very satisfying things. It so happened that technology I’d built became very widely used by physicists. But to most of the physics community I was basically an ex-physicist, who sometimes said strange and alien things about fundamental physics.

Meanwhile, two decades went by. I always hoped that one day I’d get to do my physics project. But I didn’t know when, and my hopes were dimming. But then, a bit more than a year ago, I had a little idea that solved a nagging problem I’d had with my approach. And when I talked about it with two young physicists at our annual Summer School they were so enthusiastic. And I realized, “Yes, there are people who really want to see this problem solved.” And after everything I’ve built and thought about, I have a responsibility to see if it can be done. Oh, and by the way, I really want to do it! It just seems like such a fun and fascinating thing. So why not just do it?

We got started in earnest late last fall. I started doing lots of new computer experiments. New ideas started flowing. And it was incredible. We started to figure out so much. My plan had been that we’d mostly just describe as clearly as possible what I basically already knew, then launch it as a project for other people to get involved in. But it was just too easy and too fun to figure things out. We had a new paradigm and things just started tumbling out. In all my years of doing science and technology, I’ve never seen anything like it. It’s been wonderful.

But the plan was always to share the fun, and now we’re ready to do that. We’re publishing everything we’ve done so far (including all the tools, archives, even working-session videos), and we’re looking forward to seeing if this is the time in history when we finally get to figure out the fundamental theory for our universe. Oh, and I finally get to bring to closure something I’ve wanted to do for more than half my life, and that in some ways I’ve spent half a century preparing for.

Why Wasn’t This Already Figured Out?

People have thought about what we’d now call the fundamental theory of physics throughout recorded history. From creation myths, to philosophy, to science, it’s been a long story. And most of the time, it’s actually seemed as if the answer was not far away, at least to the standards of explanation of the day. But it never quite got solved.

And if—as I believe—our project is finally on the right track, we kind of now know why. We just didn’t have the modern paradigm of computation before, and so we didn’t have the right way of thinking about things. Looking back, though, there were an awful lot of good ideas that were very much in the right direction. And particularly in recent times, there was an awful lot of mathematical methodology developed that’s very relevant and on target.

What does it matter what the fundamental theory of physics is? It’d certainly be an impressive achievement for science to figure it out. And my guess is that knowing it is eventually going to have some far-reaching long-term consequences for our general ways of thinking about things. Conceivably the theory will have near-term applications too. But in terms of what’s done year after year in developing technology, doing science or even understanding theological questions, knowing the fundamental theory of physics isn’t directly relevant; it’s more like an ultimate “background question”. And that’s realistically pretty much how it’s been treated throughout most of history.

Back in ancient Greek times, almost every serious Greek philosopher seems to have had a theory. The details were different. But there was a theme. That somehow everything in the universe consists of the same thing or things repeated over and over again. Maybe it was all water. Maybe it was four elements. Maybe Platonic solids. Maybe atoms. Maybe the world is assembled like sentences from a grammar. To us today these seem quite vague and almost allegorical. But there was an important idea: that everything we see in the world might actually be the result of something simple and formalizable underneath.

As the centuries went by, the idea of “natural laws” sharpened, sometimes with an almost computational feel. “God can only run the world by natural laws”, or “The universe is the thoughts of God actualized”. The 1600s brought the whole idea of describing the world using what amount to mathematical models. But while this had a huge effect on what could be studied and computed in physics, it didn’t immediately change the thinking that much about what the universe might ultimately be made of. It was still just tiny corpuscles (AKA atoms), though now presumed to be bound by gravitational forces.

But what did begin to change was the whole idea that there should be any kind of “explicit explanation” for the universe that one could reason about: maybe there were just equations that were true about the universe, and that was all that could be said, a bit like Euclid’s axioms for geometry. But around the same time, systematic experimental science began to rise—and there implicitly emerged the picture (charmingly resonant with modern debates about machine learning) that physics should consist of finding equations that would represent theories that could fit experimental data.

In the 1800s, as mathematical physics reached the point where it could deal with partial differential equations, and the notion of fields became popular, there started to be ideas about the universe being “all fields”. First there was the ether (along with the rather good idea that atoms might be knotted vortices in the ether). Later people wondered if the electromagnetic field could underlie everything. When the electron was discovered people wondered if perhaps everything was in fact made of electrons. And so on. But a key theme was that to figure out things about the universe, you should either just do experiments, or you should take known equations and compute: you weren’t expected to be able to reason about the universe.

That made special relativity in 1905 quite a shock: because, once again, one was figuring out physics by abstract reasoning. But somehow that just reinforced the “trust the mathematics” idea, and for example—in what I consider to be one of the most important wrong turns in the history of physics—there emerged the idea of a mathematical notion of “spacetime”, in which (despite our strong intuitive sense to the contrary) space and time are treated as “the same kind of thing”.

The introduction of general relativity in 1915—in addition to giving us the theory of gravity—brought with it the notion that “high-end” modern mathematics (in this case tensors and differential geometry) could inform physics. And that was an important piece of methodological input when quantum mechanics, and soon quantum field theory, were developed in the 1920s. Yes, it was difficult to “understand” the theory. But really the mathematics was the intellectual meat of what was going on, and should guide it. And that was what let one calculate things anyway. “Interpretation” was more philosophy than physics.

The question of what space “is” had been discussed by philosophers since antiquity. Euclid had implicitly made a pretty definitive statement with his very first definition, “a point is that which has no part”: i.e. there is no discreteness to points, or, in other words, space is continuous. And by the time calculus arose in the late 1600s, it was pretty much taken for granted that space was continuous, and position was a continuous variable.

At different times, Descartes, Riemann and Einstein all had their doubts. But the force of the mathematical methodology provided by calculus was just too great. Still, in the 1930s there started to be problems with infinities in quantum calculations—and with quantization all the rage it started almost being assumed that space must be quantized too. But with the calculus-based thinking of the time nobody managed to make that work. (Graph theory was presumably too immature—and basically unknown—for the idea of space as a graph to arise.) Then in the 1940s the mathematical problems with infinities were avoided (by the idea of renormalization), and—with some important exceptions—the notion that space might be discrete basically disappeared from physics.

Meanwhile, mathematical methods based on calculus were doing great in advancing physics. Quantum electrodynamics (QED) and general relativity were both particularly successful, and it started to seem as if figuring out everything in physics was just a question of doing the math well enough.

But then there were the particles. The muon. The pion. The hyperons. For a while it had seemed that electrons, photons, protons and neutrons were what everything was made of. But by the 1950s particle accelerators were starting to discover many tens of new “elementary” particles. What were all these things?

Oh, and they were all tied up with the strong nuclear force that held nuclei together. And despite all its success in QED, quantum field theory and its “just-work-everything-out-step-by-step” mathematics just didn’t seem to apply. So a different approach developed: S-matrix theory. It was mathematically elaborate (functions of many complex variables), in some ways elegant, but in a sense very formal. Instead of saying “this is how things are built up from something underneath” it basically just said “Here are some mathematical constraints. Whatever solutions they have are what will happen. Don’t ask why.”

And when it came to the particles, there were two approaches. One—roughly allied with quantum field theory—said that inside all these particles was something more fundamental, a new kind of particle called a quark. The other approach—allied with S-matrix theory—imagined something more “democratic” with different particles all just related by some kind of consistency condition.

Through the 1960s these two approaches duked it out. S-matrix theory was definitely ahead—notably spawning Regge theory and what later became string theory. There were quantum-field-theory ideas like what became QCD, but they didn’t look promising. But by the early 1970s it began to be clear that quarks were something real, and in 1973 the phenomenon of asymptotic freedom was discovered, and quantum field theory was saved.

In 1974 came the surprise discovery of a new kind of quark, and physics entered a golden age of rapid progress, essentially powered by quantum field theory. (And, yes, I was involved in that, and it was a lot of fun.) Soon the Standard Model emerged, and everything seemed to be fitting together, and it seemed once again that it was just a matter of calculating, and everything could be figured out.

There were still mysteries: for example, why these particular particles, with these particular masses? But there was a methodology, and there was a sense that somehow this would all work out. An important piece of the story was the use of the theory of Lie groups (a piece of “high-end math” that made its way into physics in the 1950s). Which group was the one for the universe? The Standard Model involved three groups: SU(3), SU(2) and U(1). But could these all be combined into a single, bigger group, perhaps SU(5) or SO(10)—a single “grand unified” model? Around 1980 it all looked very promising.

But there was one key prediction: the proton should be unstable, decaying, albeit very slowly. But then the experiments started coming in: no proton decay. What was particle physics to do? There were new theories, with new particles. But no new particles showed up. Meanwhile, people kept computing more and more with the Standard Model. And everything kept on working. One decimal place, two, three, four.

It was difficult—but somehow routine—to do these calculations. And it seemed like particle physics had entered a phase like atomic physics and nuclear physics before it, where it was really just a question of calculating what one needed.

But there was a crack in all of this. And it was gravity. Yes, quantum field theory had worked well in particle physics. But when it was applied to gravity, it really didn’t work at all. That wasn’t important for computing things about the Standard Model. But it showed there was something else that had to be figured out in physics.

Meanwhile, the theory of gravity had steadily been developing, based on general relativity, which was unchanged since 1915. Until about the 1950s, there had been hopes of generalizing general relativity to make a “unified field theory” that could encompass “matter” as well as gravity. (And in fact, once again, there were good ideas here, about “everything being made of space”.) But it hadn’t worked out—though for example Einstein remarked that perhaps that was because it was being incorrectly assumed that space is continuous.

General relativity is a difficult mathematical theory, fraught with issues of what is real, and what is “just math”. It didn’t get nearly as much attention as quantum field theory, but by the 1960s it was becoming better understood, there were starting to be sensitive experimental tests, and it was pretty clear that things like black holes were real predictions of the theory. And the discovery of the cosmic microwave background heightened interest in cosmology in general, and in the early universe in particular.

In a way, particle physics had been propelled at the end of World War II by the success of the Manhattan Project. But a generation had passed, and by the end of the 1980s it no longer seemed compelling to spend billions of dollars to build the next particle accelerator. But right around then, there started to be more and more done in cosmology. More and more details about the early universe. More and more mystery about the dark matter that seems to exist around galaxies.

And somehow this progress in cosmology just emphasized the importance of figuring out how particle physics (and quantum field theory) could be brought together with general relativity. But what was to be done?

At the tail end of the 1970s-golden-age of particle physics there was another injection of “fancy math”, this time around fiber bundles and algebraic topology. The original application (instanton solutions to the equations of QCD) didn’t work out. But there began to develop a new kind of interchange between the front lines of pure mathematics and theoretical physics.

And as traditional particle physics plateaued, there was more and more emphasis on quantum gravity. First there was supergravity—a kind of extension of the quark model and group theory “let’s just figure out more particles” tradition. But soon the focus turned to something new: string theory. Well, actually, it wasn’t new at all. String theory had been developed, and rejected, as part of the S-matrix initiative in the 1960s. But now it was retooled, and directed at quantum gravity with enough vigor that by the end of the 1980s a large fraction of all particle physicists were working on it.

It didn’t really connect to anything experimentally visible. And it also had all sorts of weird problems—like implying that the universe should really be 26-dimensional, or maybe 10-dimensional. But the physics community was committed to it, and the theory kept on getting patched, all the while becoming more complicated. But even though the physics wasn’t terribly compelling, there were starting to be some spinoffs in math, that made elegant—and important—connections across different areas of high-end pure mathematics.

And in the mid-1990s the high-end math paid physics back again, bringing M-theory—which seemed to miraculously weave together disparate directions in string theory. For a while there were claims that M-theory would be “it”—the fundamental theory of physics. But gradually the hopes faded, with rather little to show.

There was another frontier, though. In the 1970s an initially rather rickety calculation had suggested that black holes—instead of being “completely black”—should emit particles as a result of quantum mechanics. For a while this was basically a curiosity, but slowly the calculations became more streamlined—and it began to look as if whole black holes in a sense had enough mathematical perfection that they could actually be studied a bit like little particles. And in the late 1990s from the mathematics of string theory there emerged the so-called AdS/CFT correspondence—an elaborate mathematical connection between a limiting case of general relativity and a limiting case of quantum field theory.

I don’t think anyone would claim that AdS/CFT is itself anything like a fundamental theory of physics. But in the past 20 years it’s steadily grown to become perhaps the central hope for fundamental physics, mathematically hinting at a variety of deep connections—which, as it happens, look like they may actually dovetail quite beautifully with what we’ve recently figured out.

The last time there was a widespread “we’re almost there” feeling about the fundamental theory of physics was probably around 1980, with another blip in the mid-1990s. And since then the focus of physics has definitely turned elsewhere. But there have been some initiatives—many actually dating from the 1970s and before—that have still continued in various pockets of the physics and math communities. Twistor theory. Causal set theory. Loop quantum gravity. Spin networks. Non-commutative geometry. Quantum relativity. Typically these have seemed like increasingly disconnected—and sometimes almost quixotic—efforts. But one of the wonderful things that’s come out of our project so far is that actually the core formalisms of a surprising number of these initiatives look to be directly and richly relevant.

But what about other approaches to finding a fundamental theory of physics? Realistically I think the landscape has been quite barren of late. There’s a steady stream of people outside the physics community making proposals. But most of them are deeply handicapped by not connecting to quantum field theory and general relativity. Yes, these are mathematically sophisticated theories that pretty much take a physics PhD’s worth of study to understand. But they’re the best operational summaries we have right now of what’s known in physics, and if one doesn’t connect to them, one’s basically throwing away everything that was achieved in 20th-century physics.

Is it surprising that a fundamental theory of physics hasn’t been found yet? If—as I think—we’re now finally on the right track, then no, not particularly. Because it requires ideas and paradigms that just hadn’t been developed until quite recently. Of course, to find a fundamental theory of physics you have to go to the effort of trying to do it, and you have to believe it’s possible. And here perhaps the biggest impediment has been the sheer size of physics as an enterprise. After its successes in the mid-20th century, physics became big—and being a physicist became a top aspiration. But with size came institutionalization and inertia.

About a hundred years ago it was a small collection of people who originally invented the two great theories (general relativity and quantum field theory) that have basically defined fundamental physics for the past century. I don’t think it would have been a great surprise to any of them that to make further progress one would need new ideas and new directions. But to most modern physicists—perhaps six or seven academic generations removed from the founders of these fields, and embedded in a large structure with particular ways of doing things—the existing ideas and directions, and the kinds of things that can be done with them, just seem like the only way it can be. So if in the normal course of science, a fundamental theory of physics does not appear—as it did not—there ends up being almost a collective conclusion that to find it must be too hard, or even impossible. And not really what physics, or physicists, should be doing.

So who will do it? There’s a pretty thin set of possibilities. It pretty much has to be someone who knows the methods and achievements of mainstream physics well, or, essentially, someone who has been a physicist. It also pretty much has to be someone who’s not too deeply embedded in the current world of physics and its existing ideas, prejudices and “market forces”. Oh, and it requires tools and resources. It doesn’t hurt to have experience in doing and leading large projects. And it requires the confidence and resolve to try a big and difficult project that few people will believe in—as well, of course, as a deep interest in actually finding out the answer.

The project that we’re now launching almost didn’t happen. And if a few more years had gone by, it’s pretty certain it wouldn’t have. But things lined up in just such a way that a small window was opened. And I’m thrilled at the way it’s turning out. But let me now tell a little more of the story of how it got here.

The Beginning of the Story

As a young kid growing up in England in the 1960s I viewed the space program as a kind of beacon of the future, and I intently followed it. But when I wanted to know how spacecraft and their instruments worked I realized I had to learn about physics—and soon I left space behind, and was deeply into physics.

I was probably 11 years old when I started reading my first college physics textbook, and when I was 12 I compiled a “Concise Directory of Physics” with 111 pages of carefully typed information and data about physics:

Concise Directory of Physics

All this information collection had definite shades of “Wolfram|Alpha-ism”. And the “visualizations” presaged a lifelong interest in information presentation. But my Concise Directory also had something else: pages listing the “elementary particles”. And soon these became my great obsession. The pions. The kaons. The muon. The cascade hyperons. To me they were the ultimate story in science, and I was soon learning all their quirks (“The K zero isn’t its own antiparticle!” “The omega minus has strangeness -3!”).

I spent the summer when I was 13 writing the 132-page, single-spaced, typed “The Physics of Subatomic Particles”. At the time, I basically showed it to no one, and it’s strange to look at it now—47 years later. It’s basically an exposition of particle physics, told in a historical arc. Some of it is shockingly similar to what I just wrote in the previous section of this piece—except for the change of tense and the Americanization:

The Physics of Subatomic Particles

It’s charming to read my 13-year-old self’s explanation of quantum field theory (not bad), or my authoritative description of a just-proposed theory of the muon that I’m guessing I found out about from New Scientist, and that turned out to be completely wrong.

By the next summer I was writing a 230-page treatise “Introduction to the Weak Interaction”, featuring some of my most favorite elementary particles, and showing a pretty good grasp of quantum mechanics and field theory:

Introduction to the Weak Interaction

Pretty soon I had reached the edge of what was known in particle physics, but so far what I had done was basically exposition; I hadn’t really tried to figure out new things. But by the summer of 1974 it was increasingly clear that something unexpected was going on in physics. Several experiments were showing an unanticipated rise in the electron-positron annihilation cross-section—and then, rather dramatically, in November, the J/ψ particle was discovered. It was all a big surprise, and at first people had no idea what was going on. But 14-year-old me decided I was going to figure it out.

Those were days long before the web, and it wasn’t easy to get the latest information. But from where I lived when I wasn’t at school, it was about a 6-mile bicycle ride to the nearest university library, and I did it often. And pretty soon I had come up with a theory: maybe—contrary to what had long been believed—the electron is not in fact a point particle, but actually has internal structure.

By then, I had read many academic papers, and pretty soon I had written one of my own. It took two tries, but then, there it was, my first published paper, complete—I now notice—with some self-references to earlier work of mine, in true academic style:

Hadronic Electrons?

It was a creative and decently written paper, but it was technically a bit weak (heck, I was only 15), and, at least at the time, its main idea did not pan out. But of course there’s an irony to all this. Because—guess what—45 years later, in our current model for fundamental physics, the electron is once again not a point particle! Back in 1975, though, I thought maybe it had a radius of 10^-18 meters; now I think it’s more likely 10^-81 meters. So at the very least 15-year-old me was wrong by 63 orders of magnitude!

Being a “teenage physicist” had its interesting features. At my boarding school (the older-than-the-discovery-of-America Eton), there was much amusement when mail came addressed to me as “Dr. S. Wolfram”. Soon I started doing day trips to go to physics seminars in Oxford—and interacting with “real physicists” from the international physics community. I think I was viewed as an exotic phenomenon, usually referred to in a rather Wild-West way as “The Kid”. (Years later, I was amused when one of my children, precocious in a completely different domain, earned the very same nickname.)

I really loved physics. And I wanted to do as much physics as I could. I had started using computers back in 1973—basically to do physics simulations. And by 1976 I’d realized something important about computers. The one thing I didn’t like about physics was that it involved doing all sorts of—to me, tedious—mathematical calculations. But I realized that I could get computers to do those for me. And, needless to say, that’s how, eventually, Mathematica, Wolfram|Alpha, etc. came to be.

I left high school when I was 16, worked doing physics at a government lab in England for about 6 months, and then went to Oxford. By this point, I was producing physics papers at a decent rate, and the papers were getting progressively better. (Or at least good enough that by age 17 I’d had my first run-in with academic thievery.)

Mostly I worked on particle physics—at that time by far the hottest area of physics. But I was also very interested in questions like the origin of the Second Law of thermodynamics, and particularly its relation to gravity. (If things always become more disordered, how come galaxies form, etc.?) And from this (as well as questions like “where’s the antimatter in the universe?”) I got interested in cosmology, and, inevitably, in connecting it to particle physics.

Nowadays everyone knows about that connection, but back then few people were interested in it.

Particle physics, though, was a completely different story. There were exciting discoveries practically every week, and the best and brightest were going into the field. QCD (the theory of quarks and gluons) was taking off, and I had a great time doing some of the “obvious” calculations. And of course, I had my secret weapon: computers. I’ve never really understood why other people weren’t using them, but for me they were critical. They let me figure out all this stuff other people couldn’t. And I think the process of writing programs made me a better physicist too. Looking at my papers from back then, the notation and structure got cleaner and cleaner—as might befit a future lifelong language designer.

After a bit more than a year in Oxford, and now with ten physics papers to my name, I “dropped out of college”, and went to Caltech as a graduate student. It was a very productive time for me. At the peak I was writing a physics paper every couple of weeks, on quite a range of topics. (And it’s nice to see that some of those papers still get referenced today, 40 years later.)

Caltech was at the time a world center for particle physics, with almost everyone who was someone coming through at one time or another. Most of them were much older than me, but I still got to know them—not just as names in the physics literature but as real people with their various quirks.

Murray Gell-Mann and Richard Feynman were the two biggest names in physics at Caltech at the time. I got on particularly well with Feynman, even if—in his rather competitive way—he would often lament that he was three times my age. (In the way these things come around, I’m now the same age as he was when I first met him…)

After a bit more than a year, I put together some of the papers I’d written, officially got my PhD, and took up a nice research faculty position at Caltech. I’d had the goal of “being a physicist” since I was about 10 years old, and now, at age 20, I was actually officially a physicist.

“So what now?” I wondered. There were lots of things I wanted to do in physics. But I felt limited by the computer tools I had. So—actually within a couple of weeks of getting my PhD—I resolved that I should spend the time just to build the tools I needed. And that’s how I came to start developing my first big computer system and language.

I approached it a bit like a problem in natural science, trying to develop a theory, find principles, etc. But it was different from anything I’d done before: it wasn’t constrained by the universe as the universe is. I just had to invent abstract structures that would fit together and be useful.

The system I built (that I called SMP, for “Symbolic Manipulation Program”) had all sorts of ideas, some good, some not so good. One of the most abstract—and, arguably, obscure—ideas had to do with controlling how recursive evaluation works. I thought it was neat, and perhaps powerful. But I don’t think anyone (including me) ever really understood how to use it, and in the end it was effectively relegated to a footnote.

But here’s the irony: that footnote is now a front-and-center issue in our models of fundamental physics. And there’s more. Around the time I was building SMP I was also thinking a lot about gauge theories in physics. So there I was thinking about recursion control and about gauge invariance. Two utterly unrelated things, or so I thought. Until just recently, when I realized that in some fundamental sense they’re actually the same thing!

“You Can’t Leave Physics”

It took a couple of years to build the first version of SMP. I continued to do particle physics, though I could already feel that the field was cooling, and my interests were beginning to run to more general, theoretical questions. SMP was my first large-scale “practical” project. And not only did it involve all sorts of software engineering, it also involved managing a team—and ultimately starting my first company.

Physicists I knew could already tell I was slipping away from physics. “You can’t leave physics”, they would say. “You’re really good at this.” I still liked physics, and I particularly liked its “let’s just figure this out” attitude. But now I wasn’t just applying that methodology in quantum field theory and cosmology; I was also using it in language design, in software development, in entrepreneurism, and in other things. And it was working great.

The process of starting my first company was fraught with ahead-of-my-time-in-the-interaction-between-companies-and-universities issues, which ultimately caused me to leave Caltech. And right in the middle of that, I decided I needed to take a break from my mainline “be a physicist” activities, and just spend some time doing “something fun”.

I had been thinking for a long time about how it is that complex things manage to happen in nature. My two favorite examples were neural networks (yes, back in 1981, though I never figured out how to make them do anything very useful back then) and self-gravitating gases. And in my “just have fun” approach I decided to try to make the most minimal model I could, even if it didn’t really have much to do with either of these examples, or officially with “physics”.

It probably helped that I’d spent all that time developing SMP, and was basically used to just inventing abstract things from scratch. But in any case, what I came up with were very simple rules for arrays of 0s and 1s. I was pretty sure that—as such—they wouldn’t do anything interesting. But it was basically trivial for me to just try running them on a computer. And so I did. And what I found was amazing, and gradually changed my whole outlook on science and really my whole worldview—and sowed the seeds that have now, I believe, brought us a path to the fundamental theory of physics.

What I was looking at were basically some of the very simplest programs one can imagine. And I assumed that programs that simple wouldn’t be able to behave in anything other than simple ways. But here’s what I actually saw in my first computer experiment (here rendered a bit more crisply than in my original printouts):

(* evolve elementary cellular automaton rules 0 through 63, each from a single black cell, for 40 steps *)
GraphicsGrid[
 Partition[
  Table[ArrayPlot[CellularAutomaton[n, {{1}, 0}, {40, All}]], {n, 0, 63}], 8],
 ImageSize -> Full]

Yes, some of the behavior is simple. And some of it involves nice, recognizable fractal patterns. But then there are other things going on, like my all-time favorite—what I called “rule 30”:

(* rule 30, started from a single black cell and run for 200 steps *)
ArrayPlot[CellularAutomaton[30, {{1}, 0}, 200]]

At first, I didn’t understand what I was seeing, and I was convinced that somehow the simplicity of the underlying rules must ultimately force the behavior to be simple. I tried using all sorts of methods from physics, mathematics, computer science, statistics, cryptography and so on to “crack” these systems. But I always failed. And gradually I began to realize that something fundamental was going on—that somehow in just running their rules, simple as they were, these systems were intrinsically creating some kind of irreducible complexity.

I started writing papers about what I’d discovered, at first couched in very physics-oriented terms:

Statistical Mechanics of Cellular Automata

The papers were well received—in physics, in mathematics, and in other fields too, like biology. (Where perhaps it helped that—in a nod to historical antecedents—I called my models “cellular automata”, though I meant abstract cells, not biological ones.)

Meanwhile, I had moved to the Institute for Advanced Study, in Princeton (where there were still people telling stories about their interactions with Kurt Gödel and “Johnny” von Neumann and his computer, and where, yes, my office was upstairs from where Einstein had once worked). I started building up a whole effort around studying “complexity” and how it could arise from simple rules. And gradually I started to realize that what I’d seen in that little computer experiment in 1981 was actually a first sign of something very big and very important.

Looking back, I see that experiment as my personal analog of turning a telescope to the sky and seeing the moons of Jupiter. But the challenge was really to understand the significance of what I’d seen—which in the end took me decades. But the first step was just to start thinking not in terms of the kinds of methods I’d used in physics, but instead fundamentally in terms of computation, treating computation not just as a methodology but as a paradigm.

The summer of 1984 was when I think I finally began to seriously understand computation as a paradigm. Early that summer I’d finally recognized rule 30 for what it was: a powerful computational system. Then—in writing an article for Scientific American (nominally on “Computer Software in Science and Mathematics”)—I came up with the term “computational irreducibility”, and began to understand its significance.

That fall I wrote a short paper that outlined the correspondence with physics, and the fundamental implications (which now loom large in our current project) of computational irreducibility for physics:

Undecidability and Intractability in Theoretical Physics

One of the nice things for me about the Institute for Advanced Study is that it was a small place, with not only physicists, but also lots of world-class mathematicians. (I had interacted a bit with Michael Atiyah and Roger Penrose about mathematics-for-physics when I was in Oxford, but at Caltech it was physics and nothing but.) Two top-of-the-line mathematicians, John Milnor and Bill Thurston, both got interested in my cellular automata. But try as they might, they could prove pretty much nothing; they basically hit a wall of computational irreducibility.

Yes, there is undecidability in mathematics, as we’ve known since Gödel’s theorem. But the mathematics that mathematicians usually work on is basically set up not to run into it. But just being “plucked from the computational universe”, my cellular automata don’t get to avoid it. And ultimately our physics project will run into the same issues. But one of the wonderful things that’s become clear in the last few months is that actually there’s quite a layer of computational reducibility in our models of physics—which is critical for our ability to perceive the world coherently, but also makes math able to be useful.

But back to the story. In addition to my life doing basic science, I had a “hobby” of doing consulting for tech companies. One of those companies was a certain ultimately-poorly-named Thinking Machines Corporation, which made massively parallel computers that happened to be ideally suited to running cellular automata. And in an effort to find uses for their computers, I decided to see whether one could model fluid flow with cellular automata. The idea was to start not with the standard physics equations for fluid flow, but instead just to have lots of computational particles with very simple rules, and then see whether on a large scale fluid flow could emerge.

As it turned out, with my interest in the Second Law of thermodynamics, I’d actually tried something quite similar back in 1973, as one of the very first programs I ever wrote. But I hadn’t seen anything interesting then, partly because of what one might think of as a piece of technical bad luck, but probably more importantly because I didn’t yet grasp the paradigm that would allow me to understand what was going on. But in 1985 I did understand, and it was neat: from tiny computational rules that didn’t immediately have physics in them was emerging a piece of physics that was normally described with the equations of physics. And, yes, now it looks like that’s how all of physics may work—but we’ll come to that.

By 1985 I was pretty clear on the notion that one could use the computational paradigm and the methods around it to explore a wide range of phenomena and questions. But for me the “killer app” was understanding the origins of complexity, and trying to build a general “theory of complexity”. It wasn’t physics, it wasn’t mathematics, it wasn’t computer science. It was something new. I called it “complex systems theory” (avoiding, at least for a while, a preexisting and completely different field of computer science called “complexity theory”).

I was 25 years old but already pretty established in science, with “mainstream cred” from my early work in physics, and a lot of momentum from my work in complexity and in practical computing. I very much liked doing complex systems research myself, but I thought that to really make progress more people needed to be involved. So I started organizing. I launched a journal (which is still thriving today). And then I talked to universities (and other places) to see where the best place to start a research center would be.

Eventually I picked the University of Illinois, and so in the fall of 1986 there I went, themed as a professor of physics, mathematics and computer science, and director of the Center for Complex Systems Research. It was a good setup, but I quickly realized it wasn’t a good fit for me. Yes, I can organize things (and, yes, I’ve been a CEO now for more than half my life). But I do best when I’m organizing my own things, rather than being inside another organization. And, most important, I like actually doing things—like science—myself.

So rather quickly, I went to Plan B: instead of trying to get lots of other people to help push forward the science I wanted to see done, I’d set myself up to be as efficient as possible, and then I’d try to just do what I thought should be done myself. But the first thing I needed was good computational tools. And so it was that I started to build Mathematica, and what’s now the Wolfram Language, and to start my company, Wolfram Research.

We launched the first version of Mathematica in June 1988, and I think it’s fair to say that it was an instant hit. Physicists were particularly keen on it, and rather quickly it induced an interesting transition. Before Mathematica, if a typical physicist needed to compute something on a computer, they’d delegate it to someone else to actually do. But Mathematica for the first time made computing “high level” enough that physicists themselves could do their own computations. And it’s been wonderful to see over the years immense amounts of physics research done with the tools we’ve built. (It’s very nice to have been told many times that, apart from the internet, Mathematica is the largest methodological advance in the doing of physics in this generation.)

For a few years, the rapid development of Mathematica and our company entirely consumed me. But by 1991 it was clear that if I concentrated full-time on it, I could generate far more ideas than our company—at the size it then was—could possibly absorb. And so I decided it was time for me to execute the next step in my plan—and start actually using the tools we’d developed, to do the science I wanted to do. And so in 1991 I became a remote CEO (as I still am) and started work on my “science project”.

Maybe It Could Apply to Physics

Pretty quickly I had a table of contents for a book I planned to write—that would work through the consequences of the computational paradigm for complexity and other things. Part of it was going to be exploration: going out into the computational universe and studying what programs do—and part of it was going to be applications: seeing how to apply what I’d learned to different areas of science, and beyond. I didn’t know what I’d end up discovering, but I figured the process of writing the book would take a year or two.

My first question was just how general the phenomena I’d discovered in cellular automata actually were. Did they depend on things updating in parallel? Did they depend on having discrete cells? And so on. I started doing computer experiments. Often I’d think “this is finally a kind of system that isn’t going to do anything interesting”. And I kept on being wrong. I developed a mantra, “The computational animals are always smarter than you are”. Even when you can give all sorts of arguments about why such-and-such a system can’t do anything interesting, it’ll find a way to surprise you, and do something you’d never predict.

What was going on? I realized it was something very general, and very fundamental to basically any system. I call it the Principle of Computational Equivalence, and it’s now the guiding principle for a lot of my thinking. It explains computational irreducibility. It gives us a way to organize the computational universe. It tells us about the power of minds. It shows us how to think about the possibilities of artificial intelligence. It gives us perspectives on alien intelligence. It gives us a way to think about free will. And now it seems to give us a way to understand some ultimate questions about our perception of possible physical universes.

I think it was in 1990, right before I began the book project, that I started wondering about applying my ideas to fundamental physics. There’d been a whole “digital physics” movement (particularly involving my friend Ed Fredkin) around using cellular automata to model fundamental physics. But frankly it had put me off. I’d hear “I’ve discovered an electron in my cellular automaton”, but it just sounded like nonsense to me. “For goodness’ sake, learn what’s already known in physics!”, I would say. Of course I loved cellular automata, but—particularly with their rigid built-in notions of space and time—I didn’t think they could ever be more than allegories or toy models for actual physics, and pushing them as more than that seemed damaging, and I didn’t like it.

But, OK, so not cellular automata. But what underlying computational structure might actually work? I was pretty sure it had to be something that didn’t have its own built-in notion of space. And immediately I started thinking about networks.

Things like cellular automata are very clean and easy to define, and program on a computer. Networks—at least in their most obvious form—aren’t. My first foray into studying network-based systems was in 1992 and wound up as part of “Chapter 5: Two Dimensions and Beyond”. And like every other kind of system I studied, I found that these network systems could generate complex behavior.

Network systems

By 1993 I’d studied lots of kinds of abstract systems. And I was working down the table of contents of my planned book, and starting to ask questions like: “What can all this tell us about biology?” “What about human perception?” “Mathematics?” And it was quite exciting, because every time I’d look at a new area I’d realize “Yes, the things I’ve found in the computational universe really tell us new and interesting things here!”

So finally in 1994 I decided to try and tackle fundamental physics. I’ve got this whole shelf of drafts of what became my book, and I just pulled down the versions from 1994. It’s already got “Chapter 9: Fundamental Physics”, but the contents are still embryonic. It gradually grows through 1995 and 1996. And then in 1997, there it is: “Space as a Network”, “Time and Causal Networks”, etc.

Space as a Network

Time and Causal Networks

I’d figured out the story of how space could be made as the limit of a discrete network and how different possible updating sequences for graphs led to different threads of time. And I’d come up with the idea of causal invariance, and realized that it implied special relativity. I’d also begun to understand how curvature in space worked, but I didn’t yet “have” general relativity.
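(A toy way to see the kind of order-independence involved, written for this account and much simpler than the actual models: take the string rule “BA” -> “AB”. At each step there may be several places the rule could apply, but every possible sequence of choices ends up at the same final string. Causal invariance proper is a statement about causal graphs rather than just about final states, but this conveys the basic flavor.)

(* toy illustration only: enumerate every possible updating order for the rule "BA" -> "AB" *)
oneStepResults[s_String] :=
  Union[StringReplacePart[s, "AB", #] & /@ StringPosition[s, "BA"]]

finalStates[s_String] :=
  If[StringFreeQ[s, "BA"], {s}, Union @@ (finalStates /@ oneStepResults[s])]

finalStates["BABAB"] (* -> {"AABBB"}: a single final state, whatever the update order *)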

I’ve got all my notebooks from those times (and they’re even now in our online archives). It’s a little weird to pull them up now, and realize how tiny screens were back then. But for the most part everything still runs, and I can see how I started to do searches for “the rule” that could build something like our universe.

By then I was in year 6 of my “one-year” book project. At the beginning I’d called my book A Science of Complexity. But even by 1994 I’d realized that it was a bigger story than that—and I’d renamed the book A New Kind of Science. There was a whole intellectual edifice to discover, and I was determined to work through all the “obvious questions” so I could coherently describe it.

From a personal point of view it’s certainly the hardest project I’ve ever done. I was still remote-CEOing my company, but every day from early in the evening until perhaps 6 am I’d work on science, painstakingly trying to figure out everything I could. On a good day, I’d write a whole page of the book. Sometimes I’d spend the whole day just computing one number that would end up in tiny print in the notes at the back.

When I first embarked on the book project I talked to people quite a bit about it. But they’d always be saying “What about this? What about that?” But no! I had a plan and if I was ever going to get the project done, I knew I had to stick to it, and not get distracted. And so I basically decided to become a hermit, focus intensely on doing the project, and not talk to anyone about it (except that I did have a sequence of research assistants, including some very talented individuals).

The years went by. I’d started the book not long after I turned 30. Now I was approaching 40. But, slowly, inexorably, I was working through the table of contents, and getting towards the end. It was 2001 when I returned to put the finishing touches on Chapter 9. By then I had a pretty good idea how general relativity could work in my model, and that year I finally got it: a derivation of the emergence of general relativity from networks that was kind of an analog of my derivation, 16 years earlier, of the emergence of fluid flow from simple cellular automata.

And finally, in 2002, after ten and a half years of daily work, my book was finished. And what I had imagined might be a short “booklet” of perhaps 150 pages had become a 1280-page tome, with nearly a third of a million words of detailed notes at the back. I intended the book to be a presentation (as its title said) of a new kind of science, based on the computational paradigm, and informed by studying the computational universe of simple programs.

But I had wanted to include some “use cases”, and physics was one of those, along with biology, mathematics and more. I thought what I had done in physics was a pretty interesting beginning, and gave great evidence that the computational paradigm would provide an important new way to think about fundamental physics. As I look back now, I realize that a whole 100 pages of A New Kind of Science are devoted to physics, but at the time I think I considered them mostly just a supporting argument for the value of the new kind of science I was developing.

“Please Don’t Do That Project”

A New Kind of Science launched on May 14, 2002, and quickly climbed onto bestseller lists. I don’t think there’s a perfect way to deliver big ideas to the world, but all the trouble I’d taken trying to “package” what I’d figured out, and trying to make my book as clear and accessible as possible, seemed to be paying off. And it was great: lots of people seemed to get the core ideas of the book. Looking back, though, it’s remarkable how often media coverage of the book talked about physics, and the idea that the universe might be described by a simple program (complete with headlines like “Is the Universe a Computer?” and “The Cosmic Code”).

But as someone who’d studied the history of science for a long time, I full well knew that if the new paradigm I was trying to introduce was as important as I believed, then inevitably it would run into detractors, and hostility. But what surprised me was that almost all the hostility came from just one field: physics. There were plenty of physicists who were very positive, but there were others for whom my book somehow seemed to have touched a nerve.

As an almost lifelong lover of physics, I didn’t see a conflict. But maybe from the outside it was more obvious—as a cartoon in a review of my book in the New York Times (with a remarkably prescient headline) perhaps captured:

“You Know That Space-Time Thing? Never Mind”

If social media had existed at the time, it would undoubtedly have been different. But as it was, it was a whole unchecked parade: from Nobel prizewinners with pitchforks, to a then-graduate-student launching their career by “proving” that my physics was “wrong”. Why did they feel so strongly? I think they thought (and some of them told me as much) that if I was right, then what they’d done with their traditional mathematical methods, and all the wonderful things they’d built, would get thrown away.

I never saw it that way (and, ironically, I made my living building a tool used to support those traditional mathematical methods). But at the time—without social media—I didn’t have a useful way to respond. (To be fair, it often wasn’t clear there was much to say beyond “I don’t share your convictions”, or “Read what the book actually says… and don’t forget the 300,000 words of notes at the back!”.)

But there was unfortunately a casualty from all this: physics. As it now turns out (and I’m very happy about it), far from my ideas being in conflict with what’s been done in physics, they are actually beautifully aligned. Yes, the foundations are different. But all those traditional mathematical methods now get extra power and extra relevance. But it’s taken an additional 18 years for us to find that out. And it almost didn’t happen at all.

It’s been interesting to watch the general progression of the ideas I discussed in A New Kind of Science. What’s been most dramatic (and I’m certainly not solely responsible) has been the quiet but rapid transition—after three centuries—of new models for things being based not on equations but instead on programs. It’s happened across almost every area. With one notable exception: fundamental physics.

Perhaps it’s partly because the tower of mathematical sophistication in models is highest there. Perhaps it’s because of the particular stage of development of fundamental physics as a field, and the fact that, for the most part, it’s in a “work out the existing models” phase rather than in a “new models” phase.

A few months after my book appeared, I did a big lecture tour of universities and the like. People would ask about all kinds of things. But pretty much everywhere, some people (quite often physicists) would ask about fundamental physics. But, somewhat to my disappointment, their questions tended to be more philosophical than technical. Somehow the notion of applying these ideas to fundamental physics was just a little too dangerous to discuss.

But I decided that whatever other people might think, I should see what it would take to make progress. So in 2004 I set about expanding what I’d figured out so far. I made my explorations more streamlined than before, and pretty soon I was beginning to write summaries of what was out there:

Network Substitution Systems

But there was something that bugged me. Somehow my model felt a bit fragile, a bit contrived. At least with the formalism I had, I couldn’t just “write down any rule”; it was a bit like writing down numbers, but they had to be prime. And there was another, more technical, problem as well. For my derivations of special and general relativity to work, I needed a model that was causal invariant, and my searches were having a hard time finding nontrivial examples.

And right in the middle of trying to figure out what to do about this, something else happened: I started working on Wolfram|Alpha. In a sense Wolfram|Alpha was an outgrowth of A New Kind of Science. Before the book I had assumed that to build a serious computational knowledge engine (which is something I had, in one form or another, been interested in since I was a kid) one would first have to solve the general problem of AI. But one of the implications of my Principle of Computational Equivalence is that there is no bright line between “intelligence” and “mere computation”. And that meant that with all our computational capabilities we should already be able to build a computational knowledge engine.

And so I decided to try it. Of course at the beginning we didn’t know if it would work. (Is there too much data in the world? Is it too hard to make it computable? Is it too hard to understand natural language? Etc.) But it did work. And in 2009 we launched Wolfram|Alpha.

But I was still enthusiastic about my physics project. And in February 2010 I made it a major part of a talk I gave at TED, which the TED team initially titled “Computing a Theory of Everything” (confusingly, there also now seems to be a version of the same talk with the alternate title “Computing a Theory of All Knowledge”). And—as I was recently reminded—I told the audience that I was committed to seeing the project done, “to see if, within this decade, we can finally hold in our hands the rule for our universe”.

Computing a Theory of Everything

OK, well, it’s now April 2020. So we didn’t make it “within the decade”. Though, almost exactly 10 years later, we’re now launching the Wolfram Physics Project and I think we’re finally on a path to it.

So why didn’t this happen sooner? Frankly, in retrospect, it should have. And if I’d known what I know now, I absolutely would have done it. Yes, our Wolfram Language technology has gotten better in the course of the decade, and that’s made the project considerably easier. But looking back at what I had done even in 2004, I can now see that I was absolutely on the right track, and I could have done then almost everything I’m doing now.

Most of the projects I’ve ever done in my life—from my “Concise Directory of Physics” onward—I’ve done first and foremost because I was interested in them, and because I thought I would find them intellectually fulfilling. But particularly as I’ve gotten older, there’s been another increasingly important factor: I find I get pleasure out of doing projects that I think other people will find useful—and will get their own fulfillment out of. And with the tools I’ve built—like Mathematica and Wolfram|Alpha and the Wolfram Language—as well as with A New Kind of Science and my other books and writings, that’s worked well, and it’s been a source of great satisfaction to me.

But with the physics project, there was a problem. Because after I effectively “tested the market” in 2002, it seemed as if my core “target customers” (i.e. physicists interested in fundamental physics) didn’t want the project. And in fact a few of them came right out and said it: “Please don’t do that project”.

I personally thought the project would be really interesting. But it wasn’t the only project I thought would be interesting. And basically I said “Nah, let me not put lots of effort into a project people basically don’t want”.

What did I do instead? The most important theme of the past decade for me has been the emergence of the Wolfram Language as a full-scale computational language, and my increasing realization of the significance of having such a language. I view it as being a key step in the development of the computational paradigm—and the crucial link between what computation makes possible, and the way we humans think about things.

It provides a way for us to express ourselves—and organize our thoughts—in computational terms. I view it in some ways as analogous to the creation of mathematical notation four centuries or so ago. And just as that launched the modern development of mathematical science and mathematical thinking, so now I believe that having a full-scale computational language will open up the development of all the “computational X” fields, and the full potential of computational thinking. And this is not something just limited to science. Through ideas like computational contracts I think it’s going to inform a lot of how our world operates in the years to come, and how we want to shape (through ethics, etc.) what AIs do, and how we define the future of the human condition.

It’s not yet nearly as obvious as it will become. But I think computational language is eventually going to be seen as a pivotal intellectual idea of our times. It also has the rare and interesting feature of being something that is both fundamental and creative. It’s about “drilling down” to find the essence both of our thinking and of what computation makes possible. But it’s also about the creative design of a language.

And for me personally it’s in many ways the ideal project. It involves developing deep understanding across as many areas as possible. It involves the continual exercise of creativity. And it’s also a big project, that benefits from organizational skills and resources. And I’m very happy indeed to have spent the past decade on it.

Sometimes I’ve thought about how it compares as a project to fundamental physics. At a practical level, building a computational language is like building a progressively taller tower—from which one can progressively see further, and occasionally reach major new kinds of applications and implications. Fundamental physics is much more of a one-shot project: you try an approach to fundamental physics and either it works, or it doesn’t; there’s not the same kind of feeling of progressively building something.

Computational language also began to feel to me like an ultimately more fundamental project—at least for us humans—than fundamental physics. Because it’s about the generality of computation and the generality of our ways of thinking, not the specifics of the physical universe in which we “happen to exist”. And as I thought about the distant future (complete with my “box of a trillion souls” image), the physical universe seemed less and less relevant to the essence of the human condition. As a kind of “disembodied digital soul”, it doesn’t matter what the underlying “machine code” of the universe is; you’re operating just at the level of abstract computation. So maybe the fundamental theory of physics is ultimately just an “implementation note”. (As I now realize from our recent discoveries, the actual situation is more nuanced, and much more philosophically fascinating.)

But even though my main focus has been computational language and its implications, I’ve been doing quite a few other things. Occasionally I’ve even written about physics. And I’ve kept thinking about the fundamental physics project. Is there a “positive” way, I wondered, to do the project, so as many people as possible will be pleased to see it done?

I wondered about offering a prize for finishing what I had started. I had a great experience with something like that in 2007, when Alex Smith won the prize I had set up for proving my conjecture that a particular Turing machine was universal, thereby establishing what the very simplest possible universal Turing machine is. And in fact last fall I put up some new prizes for longstanding questions about rule 30. But for fundamental physics, I didn’t think a prize could work. For the Turing machine problem or the rule 30 problems it’s realistic for someone to just “swoop in” and figure it out. For fundamental physics, there’s a big tower of ideas to learn just to get started.

From time to time I would talk to physicist friends of mine about the fundamental physics project. (I usually didn’t even try with physicists I didn’t know; they would just give me quizzical looks, and I could tell they were uncomfortably wondering if I had lost my marbles.) But even with my friends, when I started to describe the details of the project, I don’t think over the course of 18 years I managed to keep anyone’s attention for more than 15 minutes. And quite soon I would just ask “So, what’s new in physics as far as you are concerned?”, and off we would go talking about string theory or particle phenomenology or conformal field theory or whatever. (And sometimes they would say, surprised that I cared, “Wow, you still really know about this stuff!”)

Finally, though, a few years ago I had an idea about the fundamental physics project: why not just do the project as an educational project? Say, more or less, “We’re going to try to climb the Mount Everest of science. We don’t know if we’ll succeed, but you might enjoy seeing what we do in trying to make the climb.” After all, when I talked to non-physicists—or kids—about the project, they were often very excited and very curious. And with all the effort put into STEM education, and into encouraging people to learn about science, I thought this would be a good opportunity. But whenever I really thought about doing the project (and I was still assuming that we’d just be “starting the climb”; I had no idea we’d be able to get as far as we have now), I came back to the “problem of the physicists” (or “phyzzies” as I nicknamed them). And I didn’t have a solution.

And so it was that year after year, my project of trying to find the fundamental theory of physics languished.

Two Young Physicists and a Little Idea

Every year for the past 17 years—starting the year after A New Kind of Science was published—we’ve held an annual summer school. It always ends up with an outstanding group of students (mostly college, grad and postdoc). And for me (and also some of our R&D staff) it’s become a once-a-year three-week opportunity to explore all sorts of new ideas. In the early years, the Summer School concentrated specifically on what was in my book (it was originally designed to solve the problem of people asking us for guidance on how to do the kind of science in the book). In more recent years, it’s basically become about all aspects of the methodology that I and our company have developed.

But from the beginning until now, there’ve always been a few students each year who say they want to work on “Chapter 9”. Many interesting projects have come out of that, though few really used the full network models I’d developed, basically because those were too technically difficult to use in projects that could get done in three weeks.

In 2014, though, a young student just graduating with a degree in physics from Moscow State University (and with various competitive coding achievements to his name) came to the Summer School, determined to work on network-based models of fundamental physics. As the beginning of his project description put it: “The ultimate goal is to figure out the fundamental theory of physics.” His actual project was a nice study of the long-time behavior of networks with planarity-preserving rules. The next year, having now completed a master’s degree in physics in Moscow, the same student—whose name is Max Piskunov—came to the Summer School a second time (something we rarely allow), to continue his work on network-based models of fundamental physics.

After the Summer School, he was very keen to continue working on these models, and asked me if I could be a PhD advisor for him. I said that unfortunately I wasn’t in that business anymore, and that even more unfortunately I didn’t know any currently active physicists who’d be suitable. As it turned out, he succeeded in finding a university where there were physicists who were now working on “network science”—though eventually they apparently told him “It’s too risky for you to work on network models for physics; there isn’t a well-defined criterion for success”.

From time to time I would ask after Max, and was a little disappointed to hear that he was off doing a PhD on “traditional” cosmology-meets-particle-physics. But then, in 2018 Max showed up again as a visitor at our Summer School—still really wanting to work on network-based models of fundamental physics. I said I’d really like to work on them too, but just didn’t see a way to do it. He said he at least wanted to try his hand at writing more streamlined code for them.

Over the next couple of months I would occasionally talk to Max on the phone, and every time I felt more and more like I really should actually try to do something on the project; I’d been putting it off far too long.

But then I had a little idea. I’d always been saying that I wanted models that are as minimal and structureless as possible. And then I’d say that networks were the best way I knew to get these, but that there were probably others. But even though I thought about lots of abstract structures through my work on the Wolfram Language, I never really came up with anything I was happy with. Until September 9, 2018.

I was asking myself: what lies at the heart of abstract representations, in computation, in mathematics, and so on? Well, I realized, I should know! Because in a sense that’s what I’ve been trying to model all these years in the Wolfram Language, and in SMP before it. And, actually, for more than 40 years, everything I’ve done has basically been built on the same ultimate underlying answer: transformation rules for symbolic expressions.

It’s what the Wolfram Language is based on (and it’s what SMP was based on too). So why hadn’t I ever thought of using it for models of fundamental physics? The main reason was that somehow I never fully internalized that there’d be something useful left if one “took all the content out of it”. Most of the time we’re defining transformation rules for symbolic expressions that are somehow useful and meaningful to us—and that for example contain functions that we think of as “representing something”.

It’s a little shocking that after all these years I could basically make the same mistake again: of implicitly assuming that the setup for a system would be “too simple for it to do anything interesting”. I think I was very lucky all those years ago with cellular automata, that it was so easy to try an experiment that I did it, just “on a whim”.

But in September 2018 I think I was feeling more motivated by the abstract aesthetics than anything else. I realized there might be an elegant way to represent things—even things that were at least vaguely similar to the network-based models I had studied back in the 1990s. My personal analytics record that it took about 8 minutes to write down the basics:

Relation nets

There it was: a model defined by basically a single line of Wolfram Language code. It was very elegant, and it also nicely generalized the network models I had long thought about. And even though my description was written (for myself) in language-designer-ese, I also had the sense that this model had a certain almost-mathematical purity to it. But would it do anything interesting? Pretty soon I was doing what I basically always seem to end up doing: going out into the computational universe of possibilities and exploring. And immediately I was finding things like:

The computational universe of possibilities and exploring

When one looks at the array of squares produced, say, by cellular automata, our human visual system is pretty good at giving us an impression of how much complexity is involved. But that works much less well for things like graphs and networks, where in particular there is inevitably much more arbitrariness in their rendering.

I wanted to do more systematic studies, but I expected it was going to be somewhat complicated, and I was in the middle of working on the final stages of design for Version 12 of the Wolfram Language. Meanwhile, Max took it upon himself to create some optimized low-level code. But in the fall of 2018 he was taking a break from graduate school, working at Lyft in Silicon Valley on machine vision for autonomous driving. Still, by January 2019 he had code running, and within a few minutes of trying it out, I was finding things like:

More discoveries

This was going to be interesting. But I was still in the middle of other things, and Max was going to come to the Summer School again—so I put it aside again for a few months.

Then on May 24 Murray Gell-Mann, the inventor of quarks, and a physicist I had known at Caltech, died. And as has become something of a tradition for me, I spent some days writing an obituary piece about him. And in doing that, I began thinking about all those things I had liked so much so long ago in particle physics. But what had happened to them in the past 40 years?

I started looking around on the web. Some things had definitely advanced. The mass of the lambda, that I had always known as 1115 MeV, was now measured as 1115.683 MeV. Calculations that I’d done to a first order of approximation had now been done to three orders. But in general I was shocked, and saddened. Things that had generated so much excitement and had been the pride of particle physics were now barely making it as stubs on Wikipedia. What had happened to this beautiful field? It felt like I was seeing what had once been a bustling and glorious city, now lying almost abandoned, and in some ways in ruins.

Of course, this is often the rhythm of science: some methodological advance sparks a golden age, and once everything easily accessible with that methodology has been done, one is faced with a long, hard slog that can last a century before there is some new methodological advance.

But going to the Summer School in June, I was again thinking about how to do my fundamental physics project.

Max was there. And so—as an instructor—was Jonathan Gorard. Jonathan had first come to the Summer School in 2017, just before his last year as an undergraduate in mathematics (+ theoretical physics, computer science and philosophy) at King’s College London. He’d been publishing papers on various topics since he was 17, most recently on a new algorithm for graph isomorphism. He said that at the Summer School he wanted to work either on cosmology in the context of “Chapter 9”, or on something related to the foundations of mathematics.

I suggested that he try his hand at what I considered something of an old chestnut: finding a good symbolic way to represent and analyze automated proofs, like the one I had done back in 2000 of the simplest axiom system for logic. And though I had no idea at the time, this turned out to be a remarkably fortuitous choice. But as it was, Jonathan threw himself into the project, and produced the seeds of what would, through his later work, become the Wolfram Language function FindEquationalProof.

Jonathan had come back to the Summer School in 2018 as an instructor, supervising projects on things like infinite lists and algebraic cryptography. And now he was back again as an instructor in 2019, having now also become a graduate student at Cambridge, with a nice fellowship, and nominally in a group doing general relativity.

It had been planned that Jonathan, Max and I would “talk about physics” at the Summer School. I was hopeful, but after so many years a bit pessimistic. I thought my little idea defined a new, immediate path about what one might do. But I still wasn’t convinced there was a “good way to do the project”.

But then we started discussing things. And I started feeling a stronger and stronger sense of responsibility. These ideas needed to be explored. Max and Jonathan were enthusiastic about them. What excuse did I have not to pursue the ideas, and see where they could go? Wouldn’t it be terrible if we failed to find the fundamental theory of physics just because I somehow got put off working on it?

Of course, there were technical, physics issues too. One of the big ones—which had got me stuck back in 2004—was that I’d had difficulty finding examples of rules that both had nontrivial behavior, and showed the property of causal invariance needed to basically “generate a single thread of time”. Why did I care so much about causal invariance? First, because it gave me derivations of both special and general relativity. But philosophically even more important to me, because it avoided something I considered highly undesirable: a view of quantum mechanics in which there is a giant tree of possible histories, with no way to choose between them.

Jonathan had said a few times early in the Summer School that he didn’t see why I was so concerned about causal invariance. I kept on pushing back. Then one day we went on a long walk, and Jonathan explained an idea he had (which, knowing him, he may have just come up with right there). What if the underlying rules didn’t need to have causal invariance, because us observers would implicitly add it just by the way we analyze things?

What was this idea really? It was an application of things Jonathan knew from working on automated theorem proving, mixing in ideas from general relativity, and applying them to the foundations of quantum mechanics. (Basically, his concept was that we observers, because we’re branching just like the system we’re observing, effectively define “lemmas” to help us make sense of what we observe, and these lead to effective rules that have causal invariance.)

At first I was skeptical. But the issue with not finding enough causal invariance had been a blocker 16 years earlier. And it felt like a big weight lifted if that issue could be removed. So by the end of the walk I was convinced that, yes, it was worth looking at rules even if they were not explicitly causal invariant, because they could still be “saved” by the “Jonathan Interpretation of Quantum Mechanics” as I called it (Jonathan prefers the more formal term “completion interpretation”, referring to the process of creating lemmas, which is called “completion” in automated theorem proving). As it turns out, the jury is still out on whether causal invariance is intrinsic or “in the eye of the observer”. But Jonathan’s idea was crucial as far as I was concerned in clearing the way to exploring these models without first doing a giant search for causal invariance.

It took another month or so, but finally on August 10 I sent back to Jonathan and Max a picture we had taken, saying “The origin picture … and …. *I’m finally ready to get to work*!”

The origin picture

Oh My Gosh, It’s Actually Going to Work!

August 29, 2019, was a big birthday for me. Shockingly quickly I had gone from being “the youngest person in the room” to the oldest. But now I was turning 60. I did a “looking to the future” livestream that day, and a few days later I gave a speech at my birthday party. And both times I said that now, finally, I was going to make a serious effort on my project to find the fundamental theory of physics. And to myself I was saying “This is something I’ve been talking about doing for more than half my life; if I don’t do it now, it’s time to give up on the idea that I ever will.”

“Maybe it’ll work, maybe it won’t”, I was thinking to myself. “But this is sort of the last chance for me to find out, so let’s give it a try.” And so we started. My original plan was in a sense fairly modest. I wanted to take the things I’d already investigated, and “spruce them up” in the context of my new models—then get everything out there for other people to help in what I expected would be a long, hard grind towards a fundamental theory of physics.

The first step was to build tools. Nice, streamlined Wolfram Language tools. Max had already written some core simulation functions. But now it was a question of figuring out about visualizations, enumerations and various forms of analysis. How do you best display a hypergraph? What’s the right way to enumerate rules? And so on.

But by the middle of October we had the basics, and by the end of October I’d pretty much cleared my calendar of everything but the “bare CEO essentials”, and was ready to just “do physics” for a while. It felt a little like being back where I was in the 1970s. Except for one huge difference: now I had the whole technology tower I’d spent most of the intervening 40 years building. No scratch paper. No handwritten calculations. Just notebooks and Wolfram Language. A medium for thinking directly in computational terms.

And it was exhilarating. Everything went so fast. I was basically forming my thoughts directly in the language, typing as I went, then immediately having the computer show me the results. It felt as if the computer was providing about as direct an amplification of my cognitive abilities as I could imagine. And I even started to feel a bit better about the multi-decade delay in the project. Because I realized that even if my only goal from the beginning had been to just do this project, my best chance would pretty much have been to build the Wolfram Language first.

There was something that made me nervous, though. Back in 1991 when I started working on A New Kind of Science, I’d also had the experience of rapid discovery. But what had happened then was that I hadn’t been able to stop—and I’d just dug in and gone on and on discovering things, for a decade. Intellectually it had been very rewarding, but personally it was extremely grueling. And I didn’t want to go through anything like that again. So I resolved that instead of going on until we’d “answered all the obvious questions”, we’d just figure out the minimum needed to coherently explain the ideas, then turn it over to the world to share the fun of taking it further.

Pretty soon we started outlining the website. There’d be lots of technical information and exposition. There’d be a Registry of Notable Universes for candidate models we’d identified. To lighten the load of what I thought might be a project with glacially slow progress to report, there’d be “universe swag”. And on the front of the website I was planning to write, a little apologetically: “We’re going to try to find a fundamental theory of physics. It may be the wrong approach, or the wrong century, but we’re going to try anyway”.

But meanwhile I was spending almost every waking hour doing that “trying”. I was looking at thousands of rules, slowly building up intuition. And we were talking about how what I was seeing might relate to things in physics, like space and time and quantum mechanics and general relativity. And it got more and more interesting.

Things I’d thought vaguely about in the past we were now starting to see very explicitly in the rules I was running. We knew enough to know what to look for. But thinking abstractly about something is very different from seeing an actual example. And there were many surprises. So many “I never thought it might do that”s. But having seen examples one could then start to build up an abstract framework. Without the examples one wouldn’t ever have had the imagination to come up with it. But once one saw it, it often seemed maddeningly “obvious”.

Our first big target was to understand the nature of space. How could the mathematical structures that have been used to characterize space emerge from our simple rules? I thought I already knew the basic answer from what I did back in the 1990s. But now I had a more streamlined model, and more streamlined tools, and I wanted to tighten my understanding.

I generated thousands of screenfuls of visualizations:

Screenfuls of visualizations

I think if I had lived a century earlier I would have been a zoologist. And what I was doing here was a kind of zoology: trying to catalog the strange forms and habits of these rules, and identify their families and phyla. It was a glimpse into an unseen part of the computational universe; a view of something there was no particular reason that us humans would have a way to understand. But I was pretty sure that at least some of these rules would connect with things we already knew. And so I started to hunt for examples.

Most of what I do on a daily basis I can do on just one computer. But now I needed to search millions of cases. Conveniently, there’s pretty seamless support for parallel computation in the Wolfram Language. So soon I’d commandeered about 100 cores, and every computation I could immediately parallelize. (I was also set up to use external cloud services, but most of the time I was doing computations that with the 100X speedup were either taking only seconds, and were part of my “interactive thinking loop”, or were easy enough to run overnight on my own machines, with the minor thrill of seeing in the morning what they’d produced.)
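To give a sense of what that looks like, here’s a minimal sketch of the kind of parallel loop involved (just an illustration, not the actual search code; the two candidate rules are samples that appear later in this piece):

(*start the available parallel kernels*)
LaunchKernels[];
(*two sample rules, taken from the gallery shown later in this piece*)
candidates = {{{1, 2}, {1, 3}} -> {{1, 2}, {1, 4}, {2, 4}, {4, 3}}, 
   {{1, 2}, {2, 3}} -> {{1, 2}, {2, 1}, {4, 1}, {4, 3}}};
(*evaluate each candidate on a subkernel and collect the final-state plots*)
ParallelMap[
 ResourceFunction["WolframModel"][#, {{0, 0}, {0, 0}}, 10, 
   "FinalStatePlot"] &, candidates]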

Back when I was studying things like cellular automata in the 1980s and 1990s I used to print out endless arrays of little thumbnails, then look through them, and type in the identifiers for ones I thought were worth another look. Now that was all a lot more streamlined, with images in notebooks, selectable with a simple click. But how could I automate actually looking through all these rules?

One of the things I’ve learned from decades of studying the computational universe is to take seriously my mantra “The computational animals are always smarter than you are”. You think you’ve come up with a foolproof test for catching rules that have such-and-such a behavior. Well, some rule will turn out to have a way around it, doing something you never thought about. And what I’ve found is that in the end the best way to have a chance of catching the unexpected is to use the “broadest spectrum” tools one has, which typically means one’s own eyes.

Pretty soon one begins to have a mental classification of the kinds of forms one’s seeing. And if one verbalizes it, one ends up describing them in terms of objects we’re used to (“ball of wool”, “sea urchin”, etc.) And in modern times that suggests a way to get some help: use machine learning that’s been trained, like we have, to distinguish these different kinds of things. And so instead of just making simple arrays of pictures, I often made feature space plots, where forms that “seem similar” were grouped together.
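Here’s a minimal sketch of that kind of triage (my own illustration; the rules are a few of the samples shown later in this piece): render the final states, rasterize them, and let FeatureSpacePlot put the visually similar ones near each other:

(*a few sample rules from the gallery shown later in this piece*)
rules = {{{1, 2}, {1, 3}} -> {{1, 2}, {1, 4}, {2, 4}, {4, 3}}, 
   {{1, 2}, {1, 3}} -> {{2, 2}, {2, 4}, {1, 4}, {3, 4}}, 
   {{1, 2}, {1, 3}} -> {{2, 3}, {2, 4}, {3, 4}, {1, 4}}, 
   {{1, 2}, {2, 3}} -> {{1, 3}, {1, 4}, {3, 4}, {3, 2}}, 
   {{1, 2}, {2, 3}} -> {{2, 3}, {2, 4}, {3, 4}, {1, 2}}};
(*render and rasterize the final state of each rule*)
plots = Image[
     ResourceFunction["WolframModel"][#, {{0, 0}, {0, 0}}, 9, 
      "FinalStatePlot"]] & /@ rules;
(*lay out visually similar final states near each other*)
FeatureSpacePlot[plots]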

And that meant that in just a glance I could typically see what unexpected outliers there might be. I looked through a particular collection of 79 million rules this way (with just a little additional filtering). First I found this—something that might seem more in line with my childhood interest in space, as in spacecraft, than with space in fundamental physics:

ResourceFunction[
  "WolframModel"][{{1, 2, 3}, {1, 4, 5}} -> {{3, 3, 6}, {6, 6, 5}, {4,
     5, 6}}, {{0, 0, 0}, {0, 0, 0}}, 500, "FinalStatePlot"]

And pretty soon I was also finding things like these:

Manifolds

These are not things I could ever have guessed would be there. But having found them, they can be connected to existing mathematical ideas (in this case, about manifolds). And seeing these examples embedded among so many others that don’t immediately connect to anything we know makes one wonder whether perhaps our existing mathematical ideas can be generalized—and whether maybe this could be the key to understanding how space can emerge from our underlying rules.

Both in its early history, and in modern times, mathematics has been inspired by the natural world. Now we’re seeing it inspired by the computational world. How does one generalize curvature to fractional-dimensional space? What does it mean to have a space with smoothly varying dimension? And so on. They’re elegant and interesting mathematical questions raised by looking at the computational world.

It could have been that everything in the computational world of our models would immediately run into computational irreducibility, and that mathematical ideas would be essentially powerless—as they were when I was studying cellular automata in the 1980s. But by November of last year, it was beginning to become clear that things were different now, and that there was a chance of a bridge between the mathematical traditions of existing theoretical physics and the kinds of things we needed to know about our models.

Once there’s sophisticated mathematics, we can begin to rely on that. But to explore, we still have to use things like our eyes. And that makes visualization critical. But in our models, what’s ultimately there are graphs, or hypergraphs. Nowadays we’ve got good automated tools in the Wolfram Language for coming up with “good” ways to lay out graphs. But it’s always arbitrary. And it would be much better if we could just “intrinsically” understand the graph. But unfortunately I don’t think we humans are really built for that. Or at least I’m not. (Though years ago, before computers could do automated graph layout, I once looked for a human “graph detangler” and found a young student who was spectacularly better than everyone else. Interestingly, she later became a distinguished knitwear designer.)

But to try to help in “understanding” graphs I did have one plan—that actually I’d already hatched when I was first thinking about these things in the early 1990s: use VR to really “get inside” and experience graphs. So now—with VR back in vogue—I decided to give it a try. We’re still working on a fully interactive VR environment for manipulating graphs, but to start off I tried just using VR to explore static graphs. And, yes, it was somewhat useful. But there was a practical problem for me: rapid descent into motion sickness. An occupational hazard, I suppose. But not one I expected in studying fundamental physics.

Given a better understanding of space in our models, we started looking more carefully at things like my old derivation of the Einstein equations for gravity. Jonathan tightened up the formalism and the mathematics. And it began to become clear that it wasn’t just a question of connecting our models to existing mathematical physics: our models were actually clarifying the existing mathematical physics. What had been pure, abstract mathematics relying on potentially arbitrary collections of “axiomatic” assumptions could now be seen to arise from much more explicit structures. Oh, and one could check those assumptions by just explicitly running things.

Doing something like deriving the Einstein equations from our models isn’t at some level particularly easy. And inevitably it involves a chain of mathematical derivations. Pure mathematicians are often a little horrified by the way physicists tend to “hack through” subtle mathematical issues (“Do these limits really commute?” “Can one uniquely define that parameter?” Etc.). And this was in many ways an extreme example.

But of course we weren’t adrift with no idea whether things were correct—because at least in many cases we could just go and run a model and measure things, and explicitly check what was going on. But I did feel a little bad. Here we were coming up with beautiful mathematical ideas and questions. But all I could do was barbarically hack through them—and I just kept thinking “These things deserve a mathematician who’ll really appreciate them”. Which hopefully in time they’ll get.

As we went through November, we were starting to figure out more and more. And it seemed like every conversation we had, we were coming up with interesting things. I didn’t know where it would all go. But as a committed preserver of data I thought it was time to start recording our conversations, as well as my own experiments and other work on the project. And altogether we’ve so far accumulated 431 hours of recordings.

We’re going to make these recordings available online. And with Wolfram Language speech-to-text it’s easy to process their audio, and to get word clouds that indicate some of the flow of the project:

WordCloud
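As a minimal sketch of that pipeline (the file name here is purely hypothetical), a single recording could be processed with something like:

(*transcribe one recording (hypothetical file name) to a plain string*)
transcript = SpeechRecognize[Audio["physics-working-session.mp3"]];
(*drop common stopwords and weight the remaining words by frequency*)
WordCloud[DeleteStopwords[TextWords[transcript]]]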

What Terrible Timing!

So there we were in the middle of February. Things had gone better than I’d ever imagined they could. And we were working intensely to get everything ready to present to the world. We set a date: March 16, 2020. We were planning announcements, technical documents, an 800ish-page book, an extensive website, livestreams, outreach. All the kinds of things needed to launch this as a project, and explain it to people.

But meanwhile—like so many other people—we were watching the developing coronavirus epidemic. I’d asked the data science and biology teams at our company to start curating data and making it available. I’d been looking at some epidemic modeling—some of it even done with cellular automata. I’d noted that the spreading of an epidemic in a human network was bizarrely similar to the growth of geodesic balls in hypergraphs.

What should we do? We kept going, steadily checking off items on our project-management tracking list. But as March 16 approached, it was clear there was now a pandemic. The US began to shut down. I did an AMA on my experience as a 29-year work-from-home CEO. Meetings about physics were now interspersed with meetings about shutting down offices. Numerous people at our company pointed out to me that Isaac Newton had come up with the core ideas for both calculus and his theory of gravity in 1665, when Cambridge University had been closed because of the plague.

I oscillated between thinking that in the midst of such a worldwide crisis it was almost disrespectful to be talking about something like a fundamental theory of physics, and thinking that perhaps people might like an intellectual distraction. But in the end we decided to wait. We’d get everything ready, but then pause.

And after all, I thought, after waiting more than thirty years to do this project, what’s a few more weeks?

What Happens Now

If you’re reading this, it means our project is finally released. And we begin the next stage in the long journey I’ve described here. I can’t help echoing Isaac Newton’s words from the 1686 preface to his Principia: “I heartily beg that what I have done here may be read with forbearance; and that my labors in a subject so difficult may be examined, not so much with a view to censure, as to remedy their defects.”

But the world has changed since then, and now we can send out tweets and do livestreams. I’m thrilled about what we’ve been able to figure out, not least because I consider it so elegant and so intellectually satisfying. Sometimes back when I was doing particle physics, I’d think “That’s a bit hacky, but if that’s how our universe works, so be it”. Now I feel a certain pride that we seem to live in a universe that works in such an elegant way.

Forty years ago I thought I’d spend my life as a physicist. Things didn’t work out that way, and I’m very happy with what happened instead. But now after decades “in the wilderness” I’m back. Not just “doing physics”, but trying to attack the very center of it. I’m quite certain that if I’d spent the past 40 years as a physicist nothing like this would have been possible. It’s one of those cases where it’s almost inevitable that making progress will need that strange combination of having inside knowledge yet “being an outsider”.

Of course, we’re not finished. I think we finally have a path to a fundamental theory of physics. But we’re not there yet. And what I’m hoping now is that we can mount a project that will succeed in getting us there.

It’s going to take physicists, mathematicians, computer scientists and others. It’s going to take ideas and work, and perhaps quite a bit of time. I hope it will be a worldwide effort that can happen across a spectrum of academic and other environments. Most of it will be a decentralized effort.

I personally look forward to continuing to be deeply involved—and I’m hoping that we’ll be able to set up a substantial centralized effort to apply the decades of experience we’ve had in doing highly challenging R&D projects to make progress as rapidly as possible on this project.

It’s been a great privilege for me to be “in the right place at the right time” to discover what we’ve discovered. Physics did so much for me in my early years, and I’m thrilled to have the opportunity to “give something back” so many years later. I can’t wait to see what will develop as we home in on a fundamental theory of physics. But at this stage in my life perhaps my greatest pleasure is to see others get excitement and fulfillment from things I put into the world. And to provide something for the next generation of 12-year-old physics wannabes.

So let’s all go and try to find the fundamental theory of physics together! It’s going to be great!

Finally We May Have a Path to the Fundamental Theory of Physics… and It’s Beautiful


Visual summary of the Wolfram Physics Project

I Never Expected This

It’s unexpected, surprising—and for me incredibly exciting. To be fair, at some level I’ve been working towards this for nearly 50 years. But it’s just in the last few months that it’s finally come together. And it’s much more wonderful, and beautiful, than I’d ever imagined.

In many ways it’s the ultimate question in natural science: How does our universe work? Is there a fundamental theory? An incredible amount has been figured out about physics over the past few hundred years. But even with everything that’s been done—and it’s very impressive—we still, after all this time, don’t have a truly fundamental theory of physics.

Back when I used to do theoretical physics for a living, I must admit I didn’t think much about trying to find a fundamental theory; I was more concerned about what we could figure out based on the theories we had. And somehow I think I imagined that if there was a fundamental theory, it would inevitably be very complicated.

But in the early 1980s, when I started studying the computational universe of simple programs I made what was for me a very surprising and important discovery: that even when the underlying rules for a system are extremely simple, the behavior of the system as a whole can be essentially arbitrarily rich and complex.

And this got me thinking: Could the universe work this way? Could it in fact be that underneath all of this richness and complexity we see in physics there are just simple rules? I soon realized that if that was going to be the case, we’d in effect have to go underneath space and time and basically everything we know. Our rules would have to operate at some lower level, and all of physics would just have to emerge.

By the early 1990s I had a definite idea about how the rules might work, and by the end of the 1990s I had figured out quite a bit about their implications for space, time, gravity and other things in physics—and, basically as an example of what one might be able to do with science based on studying the computational universe, I devoted nearly 100 pages to this in my book A New Kind of Science.

I always wanted to mount a big project to take my ideas further. I tried to start around 2004. But pretty soon I got swept up in building Wolfram|Alpha, and the Wolfram Language and everything around it. From time to time I would see physicist friends of mine, and I’d talk about my physics project. There’d be polite interest, but basically the feeling was that finding a fundamental theory of physics was just too hard, and only kooks would attempt it.

It didn’t help that there was something that bothered me about my ideas. The particular way I’d set up my rules seemed a little too inflexible, too contrived. In my life as a computational language designer I was constantly thinking about abstract systems of rules. And every so often I’d wonder if they might be relevant for physics. But I never got anywhere. Until, suddenly, in the fall of 2018, I had a little idea.

It was in some ways simple and obvious, if very abstract. But what was most important about it to me was that it was so elegant and minimal. Finally I had something that felt right to me as a serious possibility for how physics might work. But wonderful things were happening with the Wolfram Language, and I was busy thinking about all the implications of finally having a full-scale computational language.

But then, at our annual Summer School in 2019, there were two young physicists (Jonathan Gorard and Max Piskunov) who were like, “You just have to pursue this!” Physics had been my great passion when I was young, and in August 2019 I had a big birthday and realized that, yes, after all these years I really should see if I can make something work.

So—along with the two young physicists who’d encouraged me—I began in earnest in October 2019. It helped that—after a lifetime of developing them—we now had great computational tools. And it wasn’t long before we started finding what I might call “very interesting things”. We reproduced, more elegantly, what I had done in the 1990s. And from tiny, structureless rules out were coming space, time, relativity, gravity and hints of quantum mechanics.

We were doing zillions of computer experiments, building intuition. And gradually things were becoming clearer. We started understanding how quantum mechanics works. Then we realized what energy is. We found an outline derivation of my late friend and mentor Richard Feynman’s path integral. We started seeing some deep structural connections between relativity and quantum mechanics. Everything just started falling into place. All those things I’d known about in physics for nearly 50 years—and finally we had a way to see not just what was true, but why.

I hadn’t ever imagined anything like this would happen. I expected that we’d start exploring simple rules and gradually, if we were lucky, we’d get hints here or there about connections to physics. I thought maybe we’d be able to have a possible model for the first seconds of the universe, but we’d spend years trying to see whether it might actually connect to the physics we see today.

In the end, if we’re going to have a complete fundamental theory of physics, we’re going to have to find the specific rule for our universe. And I don’t know how hard that’s going to be. I don’t know if it’s going to take a month, a year, a decade or a century. A few months ago I would also have said that I don’t even know if we’ve got the right framework for finding it.

But I wouldn’t say that anymore. Too much has worked. Too many things have fallen into place. We don’t know if the precise details of how our rules are set up are correct, or how simple or not the final rules may be. But at this point I am certain that the basic framework we have is telling us fundamentally how physics works.

It’s always a test for scientific models to compare how much you put in with how much you get out. And I’ve never seen anything that comes close. What we put in is about as tiny as it could be. But what we’re getting out are huge chunks of the most sophisticated things that are known about physics. And what’s most amazing to me is that at least so far we’ve not run across a single thing where we’ve had to say “oh, to explain that we have to add something to our model”. Sometimes it’s not easy to see how things work, but so far it’s always just been a question of understanding what the model already says, not adding something new.

At the lowest level, the rules we’ve got are about as minimal as anything could be. (Amusingly, their basic structure can be expressed in a fraction of a line of symbolic Wolfram Language code.) And in their raw form, they don’t really engage with all the rich ideas and structure that exist, for example, in mathematics. But as soon as we start looking at the consequences of the rules when they’re applied zillions of times, it becomes clear that they’re very elegantly connected to a lot of wonderful recent mathematics.

There’s something similar with physics, too. The basic structure of our models seems alien and bizarrely different from almost everything that’s been done in physics for at least the past century or so. But as we’ve gotten further in investigating our models something amazing has happened: we’ve found that not just one, but many of the popular theoretical frameworks that have been pursued in physics in the past few decades are actually directly relevant to our models.

I was worried this was going to be one of those “you’ve got to throw out the old” advances in science. It’s not. Yes, the underlying structure of our models is different. Yes, the initial approach and methods are different. And, yes, a bunch of new ideas are needed. But to make everything work we’re going to have to build on a lot of what my physicist friends have been working so hard on for the past few decades.

And then there’ll be the physics experiments. If you’d asked me even a couple of months ago when we’d get anything experimentally testable from our models I would have said it was far away. And that it probably wouldn’t happen until we’d pretty much found the final rule. But it looks like I was wrong. And in fact we’ve already got some good hints of bizarre new things that might be out there to look for.

OK, so what do we need to do now? I’m thrilled to say that I think we’ve found a path to the fundamental theory of physics. We’ve built a paradigm and a framework (and, yes, we’ve built lots of good, practical, computational tools too). But now we need to finish the job. We need to work through a lot of complicated computation, mathematics and physics. And see if we can finally deliver the answer to how our universe fundamentally works.

It’s an exciting moment, and I want to share it. I’m looking forward to being deeply involved. But this isn’t just a project for me or our small team. This is a project for the world. It’s going to be a great achievement when it’s done. And I’d like to see it shared as widely as possible. Yes, a lot of what has to be done requires top-of-the-line physics and math knowledge. But I want to expose everything as broadly as possible, so everyone can be involved in—and I hope inspired by—what I think is going to be a great and historic intellectual adventure.

Today we’re officially launching our Physics Project. From here on, we’ll be livestreaming what we’re doing—sharing whatever we discover in real time with the world. (We’ll also soon be releasing more than 400 hours of video that we’ve already accumulated.) I’m posting all my working materials going back to the 1990s, and we’re releasing all our software tools. We’ll be putting out bulletins about progress, and there’ll be educational programs around the project.

Oh, yes, and we’re putting up a Registry of Notable Universes. It’s already populated with nearly a thousand rules. I don’t think any of the ones in there yet are our own universe—though I’m not completely sure. But sometime—I hope soon—there might just be a rule entered in the Registry that has all the right properties, and that we’ll slowly discover that, yes, this is it—our universe finally decoded.

The Wolfram Physics Project

How It Works

OK, so how does it all work? I’ve written a 448-page technical exposition (yes, I’ve been busy the past few months!). Another member of our team (Jonathan Gorard) has written two 60-page technical papers. And there’s other material available at the project website. But here I’m going to give a fairly non-technical summary of some of the high points.

It all begins with something very simple and very structureless. We can think of it as a collection of abstract relations between abstract elements. Or we can think of it as a hypergraph—or, in simple cases, a graph.

We might have a collection of relations like

{{1, 2}, {2, 3}, {3, 4}, {2, 4}}

that can be represented by a graph like

ResourceFunction[
  "WolframModelPlot"][{{1, 2}, {2, 3}, {3, 4}, {2, 4}}, 
 VertexLabels -> Automatic]

All we’re specifying here are the relations between elements (like {2,3}). The order in which we state the relations doesn’t matter (although the order within each relation does matter). And when we draw the graph, all that matters is what’s connected to what; the actual layout on the page is just a choice made for visual presentation. It also doesn’t matter what the elements are called. Here I’ve used numbers, but all that matters is that the elements are distinct.
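We can check that last point directly: relabeling the elements gives a graph that is isomorphic to the one above:

(*the same relations with the elements renamed; only the pattern of connections matters*)
IsomorphicGraphQ[
 Graph[DirectedEdge @@@ {{1, 2}, {2, 3}, {3, 4}, {2, 4}}], 
 Graph[DirectedEdge @@@ {{"a", "b"}, {"b", "c"}, {"c", "d"}, {"b", "d"}}]]
(*True*)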

OK, so what do we do with these collections of relations, or graphs? We just apply a simple rule to them, over and over again. Here’s an example of a possible rule:

{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z, w}}

What this rule says is to pick up two relations—from anywhere in the collection—and see if the elements in them match the pattern {{x,y},{x,z}} (or, in the Wolfram Language, {{x_,y_},{x_,z_}}), where the two x’s can be anything, but both have to be the same, and the y and z can be anything. If there’s a match, then replace these two relations with the four relations on the right. The w that appears there is a new element that’s being created, and the only requirement is that it’s distinct from all other elements.
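Here’s a sketch of that description in ordinary Wolfram Language pattern-matching terms (just a single update, with Unique[] standing in for “create a new element”, and looking for the two relations only in the order they happen to appear; the actual WolframModel machinery is more general):

(*one rewrite step: find {x,y} and {x,z}, and replace them with the four new relations*)
step[rels_] := Replace[rels, 
   {pre___, {x_, y_}, mid___, {x_, z_}, post___} :> 
    With[{w = Unique["v"]}, {pre, mid, post, {x, z}, {x, w}, {y, w}, {z, w}}]];
step[{{1, 2}, {2, 3}, {3, 4}, {2, 4}}]
(*gives something like {{1, 2}, {3, 4}, {2, 4}, {2, v1}, {3, v1}, {4, v1}}, i.e. the same as the result shown below, with a fresh symbol in place of 5*)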

We can represent the rule as a transformation of graphs:

RulePlot[ResourceFunction[
   "WolframModel"][{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z, 
     w}}], VertexLabels -> Automatic, "RulePartsAspectRatio" -> 0.5]

Now let’s apply the rule once to:

{{1, 2}, {2, 3}, {3, 4}, {2, 4}}

The {2,3} and {2,4} relations get matched, and the rule replaces them with four new relations, so the result is:

{{1, 2}, {3, 4}, {2, 4}, {2, 5}, {3, 5}, {4, 5}}

We can represent this result as a graph (which happens to be rendered flipped relative to the graph above):

ResourceFunction[
   "WolframModel"][{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z, 
     w}}, {{1, 2}, {2, 3}, {3, 4}, {2, 4}}, 1]["FinalStatePlot", 
 VertexLabels -> Automatic]

OK, so what happens if we just keep applying the rule over and over? Here’s the result:

ResourceFunction[
  "WolframModel"][{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z, 
    w}}, {{1, 2}, {2, 3}, {3, 4}, {2, 4}}, 10, "StatesPlotsList"]

Let’s do it a few more times, and make a bigger picture:

ResourceFunction[
  "WolframModel"][{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z, 
    w}}, {{1, 2}, {2, 3}, {3, 4}, {2, 4}}, 14, "FinalStatePlot"]

What happened here? We have such a simple rule. Yet applying this rule over and over again produces something that looks really complicated. It’s not what our ordinary intuition tells us should happen. But actually—as I first discovered in the early 1980s—this kind of intrinsic, spontaneous generation of complexity turns out to be completely ubiquitous among simple rules and simple programs. And for example my book A New Kind of Science is about this whole phenomenon and why it’s so important for science and beyond.
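For comparison (purely as an illustration of that phenomenon, not part of these models), here is the classic cellular automaton example of complexity arising from a very simple rule:

(*rule 30: a minimal one-dimensional cellular automaton whose behavior looks elaborate and largely random*)
ArrayPlot[CellularAutomaton[30, {{1}, 0}, 200]]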

But here what’s important about it is that it’s what’s going to make our universe, and everything in it. Let’s review again what we’ve seen. We started off with a simple rule that just tells us how to transform collections of relations. But what we get out is this complicated-looking object that, among other things, seems to have some definite shape.

We didn’t put in anything about this shape. We just gave a simple rule. And using that simple rule a graph was made. And when we visualize that graph, it comes out looking like it has a definite shape.

If we ignore all matter in the universe, our universe is basically a big chunk of space. But what is that space? We’ve had mathematical idealizations and abstractions of it for two thousand years. But what really is it? Is it made of something, and if so, what?

Well, I think it’s very much like the picture above. A whole bunch of what are essentially abstract points, abstractly connected together. Except that in the picture there are 6704 of these points, whereas in our real universe there might be more like 10^400 of them, or even many more.

All Possible Rules

We don’t (yet) know an actual rule that represents our universe—and it’s almost certainly not the one we just talked about. So let’s discuss what possible rules there are, and what they typically do.

One feature of the rule we used above is that it’s based on collections of “binary relations”, containing pairs of elements (like {2,3}). But the same setup lets us also consider relations with more elements. For example, here’s a collection of two ternary relations:

{{1, 2, 3}, {3, 4, 5}}

We can’t use an ordinary graph to represent things like this, but we can use a hypergraph—a construct where we generalize edges in graphs that connect pairs of nodes to “hyperedges” that connect any number of nodes:

ResourceFunction["WolframModelPlot"][{{1, 2, 3}, {3, 4, 5}}, 
 VertexLabels -> Automatic]

(Notice that we’re dealing with directed hypergraphs, where the order in which nodes appear in a hyperedge matters. In the picture, the “membranes” are just indicating which nodes are connected to the same hyperedge.)

We can make rules for hypergraphs too:

{{x, y, z}} -> {{w, w, y}, {w, x, z}}

RulePlot[ResourceFunction[
   "WolframModel"][{{1, 2, 3}} -> {{4, 4, 2}, {4, 1, 3}}]]

And now here’s what happens if we run this rule starting from the simplest possible ternary hypergraph—the ternary self-loop {{0,0,0}}:

ResourceFunction[
   "WolframModel"][{{1, 2, 3}} -> {{4, 4, 2}, {4, 1, 3}}, {{0, 0, 0}},
   8]["StatesPlotsList", "MaxImageSize" -> 180]

Alright, so what happens if we just start picking simple rules at random? Here are some of the things they do:

urules24 = 
 Import["https://www.wolframcloud.com/obj/wolframphysics/Data/22-24-\
2x0-unioned-summary.wxf"]; SeedRandom[6783]; GraphicsGrid[
 Partition[
  ResourceFunction["WolframModelPlot"][List @@@ EdgeList[#]] & /@ 
   Take[Select[
     ParallelMap[
      UndirectedGraph[
        Rule @@@ 
         ResourceFunction["WolframModel"][#, {{0, 0}, {0, 0}}, 8, 
          "FinalState"], 
        GraphLayout -> "SpringElectricalEmbedding"] &, #Rule & /@ 
       RandomSample[urules24, 150]], 
     EdgeCount[#] > 10 && ConnectedGraphQ[#] &], 60], 10], 
 ImageSize -> Full]

Somehow this looks very zoological (and, yes, these models are definitely relevant for things other than fundamental physics—though probably particularly for molecular-scale construction). But basically what we see here is that there are various common forms of behavior, some simple, and some not.

Here are some samples of the kinds of things we see:

GraphicsGrid[
 Partition[
  ParallelMap[
   ResourceFunction["WolframModel"][#[[1]], #[[2]], #[[3]], 
     "FinalStatePlot"] &, {{{{1, 2}, {1, 3}} -> {{1, 2}, {1, 4}, {2, 
        4}, {4, 3}}, {{0, 0}, {0, 0}}, 
     12}, {{{1, 2}, {1, 3}} -> {{1, 4}, {1, 4}, {2, 4}, {3, 2}}, {{0, 
       0}, {0, 0}}, 
     10}, {{{1, 2}, {1, 3}} -> {{2, 2}, {2, 4}, {1, 4}, {3, 4}}, {{0, 
       0}, {0, 0}}, 
     10}, {{{1, 2}, {1, 3}} -> {{2, 3}, {2, 4}, {3, 4}, {1, 4}}, {{0, 
       0}, {0, 0}}, 
     10}, {{{1, 2}, {1, 3}} -> {{2, 3}, {2, 4}, {3, 4}, {4, 1}}, {{0, 
       0}, {0, 0}}, 
     12}, {{{1, 2}, {1, 3}} -> {{2, 4}, {2, 1}, {4, 1}, {4, 3}}, {{0, 
       0}, {0, 0}}, 
     9}, {{{1, 2}, {1, 3}} -> {{2, 4}, {2, 4}, {1, 4}, {3, 4}}, {{0, 
       0}, {0, 0}}, 
     10}, {{{1, 2}, {1, 3}} -> {{2, 4}, {2, 4}, {2, 1}, {3, 4}}, {{0, 
       0}, {0, 0}}, 
     10}, {{{1, 2}, {1, 3}} -> {{4, 1}, {1, 4}, {4, 2}, {4, 3}}, {{0, 
       0}, {0, 0}}, 
     12}, {{{1, 2}, {2, 3}} -> {{1, 2}, {2, 1}, {4, 1}, {4, 3}}, {{0, 
       0}, {0, 0}}, 
     10}, {{{1, 2}, {2, 3}} -> {{1, 3}, {1, 4}, {3, 4}, {3, 2}}, {{0, 
       0}, {0, 0}}, 
     10}, {{{1, 2}, {2, 3}} -> {{2, 3}, {2, 4}, {3, 4}, {1, 2}}, {{0, 
       0}, {0, 0}}, 9}}], 4], ImageSize -> Full]

And the big question is: if we were to run rules like these long enough, would they end up making something that reproduces our physical universe? Or, put another way, out in this computational universe of simple rules, can we find our physical universe?

A big question, though, is: How would we know? What we’re seeing here are the results of applying rules a few thousand times; in our actual universe they may have been applied 10^500 times so far, or even more. And it’s not easy to bridge that gap. And we have to work it from both sides. First, we have to use the best summary of the operation of our universe that what we’ve learned in physics over the past few centuries has given us. And second, we have to go as far as we can in figuring out what our rules actually do.

And here there’s potentially a fundamental problem: the phenomenon of computational irreducibility. One of the great achievements of the mathematical sciences, starting about three centuries ago, has been delivering equations and formulas that basically tell you how a system will behave without you having to trace each step in what the system does. But many years ago I realized that in the computational universe of possible rules, this very often isn’t possible. Instead, even if you know the exact rule that a system follows, you may still not be able to work out what the system will do except by essentially just tracing every step it takes.

One might imagine that—once we know the rule for some system—then with all our computers and brainpower we’d always be able to “jump ahead” and work out what the system would do. But actually there’s something I call the Principle of Computational Equivalence, which says that almost any time the behavior of a system isn’t obviously simple, it’s computationally as sophisticated as anything. So we won’t be able to “outcompute” it—and to work out what it does will take an irreducible amount of computational work.

Well, for our models of the universe this is potentially a big problem. Because we won’t be able to get even close to running those models for as long as the universe does. And at the outset it’s not clear that we’ll be able to tell enough from what we can do to see if it matches up with physics.

But the big recent surprise for me is that we seem to be lucking out. We do know that whenever there’s computational irreducibility in a system, there are also an infinite number of pockets of computational reducibility. But it’s completely unclear whether in our case those pockets will line up with things we know from physics. And the surprise is that it seems a bunch of them do.

What Is Space?

Let’s look at a particular, simple rule from our infinite collection:

{{x, y, y}, {z, x, u}} -> {{y, v, y}, {y, z, v}, {u, v, v}}

RulePlot[ResourceFunction[
   "WolframModel"][{{1, 2, 2}, {3, 1, 4}} -> {{2, 5, 2}, {2, 3, 
     5}, {4, 5, 5}}]]

Here’s what it does:

ResourceFunction["WolframModelPlot"][#, ImageSize -> 50] & /@ 
 ResourceFunction[
   "WolframModel"][{{{1, 2, 2}, {3, 1, 4}} -> {{2, 5, 2}, {2, 3, 
      5}, {4, 5, 5}}}, {{0, 0, 0}, {0, 0, 0}}, 20, "StatesList"]

And after a while this is what happens:

Row[Append[
  Riffle[ResourceFunction[
       "WolframModel"][{{1, 2, 2}, {3, 1, 4}} -> {{2, 5, 2}, {2, 3, 
         5}, {4, 5, 5}}, {{0, 0, 0}, {0, 0, 0}}, #, 
      "FinalStatePlot"] & /@ {200, 500}, " ... "], " ..."]]

It’s basically making us a very simple “piece of space”. If we keep on going longer and longer it’ll make a finer and finer mesh, to the point where what we have is almost indistinguishable from a piece of a continuous plane.

Here’s a different rule:

{{x, x, y}, {z, u, x}} -> {{u, u, z}, {v, u, v}, {v, y, x}}

RulePlot[ResourceFunction[
   "WolframModel"][{{x, x, y}, {z, u, x}} -> {{u, u, z}, {v, u, 
     v}, {v, y, x}}]]
ResourceFunction["WolframModelPlot"][#, ImageSize -> 50] & /@ 
 ResourceFunction[
   "WolframModel"][{{1, 1, 2}, {3, 4, 1}} -> {{4, 4, 3}, {5, 4, 
     5}, {5, 2, 1}}, {{0, 0, 0}, {0, 0, 0}}, 20, "StatesList"]
ResourceFunction[
  "WolframModel"][{{1, 1, 2}, {3, 4, 1}} -> {{4, 4, 3}, {5, 4, 5}, {5,
     2, 1}}, {{0, 0, 0}, {0, 0, 0}}, 2000, "FinalStatePlot"]

It looks like it’s “trying to make” something 3D. Here’s another rule:

{{x, y, z}, {u, y, v}} -> {{w, z, x}, {z, w, u}, {x, y, w}}

RulePlot[ResourceFunction[
   "WolframModel"][{{1, 2, 3}, {4, 2, 5}} -> {{6, 3, 1}, {3, 6, 
     4}, {1, 2, 6}}]]
ResourceFunction["WolframModelPlot"][#, ImageSize -> 50] & /@ 
 ResourceFunction[
   "WolframModel"][{{x, y, z}, {u, y, v}} -> {{w, z, x}, {z, w, 
     u}, {x, y, w}}, {{0, 0, 0}, {0, 0, 0}}, 20, "StatesList"]
ResourceFunction[
  "WolframModel"][{{1, 2, 3}, {4, 2, 5}} -> {{6, 3, 1}, {3, 6, 4}, {1,
     2, 6}}, {{0, 0, 0}, {0, 0, 0}}, 1000, "FinalStatePlot"]

Isn’t this strange? We have a rule that’s just specifying how to rewrite pieces of an abstract hypergraph, with no notion of geometry, or anything about 3D space. And yet it produces a hypergraph that’s naturally laid out as something that looks like a 3D surface.

Even though the only thing that’s really here is connections between points, we can “guess” where a surface might be, then we can show the result in 3D:

ResourceFunction["GraphReconstructedSurface"][
 ResourceFunction[
   "WolframModel"][ {{1, 2, 3}, {4, 2, 5}} -> {{6, 3, 1}, {3, 6, 
     4}, {1, 2, 6}}, {{0, 0, 0}, {0, 0, 0}}, 2000, "FinalState"]]

If we keep going, then like the example of the plane, the mesh will get finer and finer, until basically our rule has grown us—point by point, connection by connection—something that’s like a continuous 3D surface of the kind you might study in a calculus class. Of course, in some sense, it’s not “really” that surface: it’s just a hypergraph that represents a bunch of abstract relations—but somehow the pattern of those relations gives it a structure that’s a closer and closer approximation to the surface.
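One can watch that refinement happen just by counting the distinct elements at each step (a quick check, computed directly from the raw states):

(*number of distinct "points" in the hypergraph after each step of the surface-making rule above*)
Length[Union[Flatten[#]]] & /@ 
 ResourceFunction[
   "WolframModel"][{{1, 2, 3}, {4, 2, 5}} -> {{6, 3, 1}, {3, 6, 4}, {1, 2, 6}}, 
  {{0, 0, 0}, {0, 0, 0}}, 12, "StatesList"]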

And this is basically how I think space in the universe works. Underneath, it’s a bunch of discrete, abstract relations between abstract points. But at the scale we’re experiencing it, the pattern of relations it has makes it seem like continuous space of the kind we’re used to. It’s a bit like what happens with, say, water. Underneath, it’s a bunch of discrete molecules bouncing around. But to us it seems like a continuous fluid.

Needless to say, people have thought that space might ultimately be discrete ever since antiquity. But in modern physics there was never a way to make it work—and anyway it was much more convenient for it to be continuous, so one could use calculus. But now it’s looking like the idea of space being discrete is actually crucial to getting a fundamental theory of physics.

The Dimensionality of Space

A very fundamental fact about space as we experience it is that it is three-dimensional. So can our rules reproduce that? Two of the rules we just saw produce what we can easily recognize as two-dimensional surfaces—in one case flat, in the other case arranged in a certain shape. Of course, these are very bland examples of (two-dimensional) space: they are effectively just simple grids. And while this is what makes them easy to recognize, it also means that they’re not actually much like our universe, where there’s in a sense much more going on.

So, OK, take a case like:

ResourceFunction[
  "WolframModel"][{{1, 2, 3}, {4, 3, 5}} -> {{3, 5, 2}, {5, 2, 4}, {2,
     1, 6}}, {{0, 0, 0}, {0, 0, 0}}, 22, "FinalStatePlot"]

If we were to go on long enough, would this make something like space, and, if so, with how many dimensions? To know the answer, we have to have some robust way to measure dimension. But remember, the pictures we’re drawing are just visualizations; the underlying structure is a bunch of discrete relations defining a hypergraph—with no information about coordinates, or geometry, or even topology. And, by the way, to emphasize that point, here is the same graph—with exactly the same connectivity structure—rendered four different ways:

GridGraph[{10, 10}, GraphLayout -> #, 
   VertexStyle -> 
    ResourceFunction["WolframPhysicsProjectStyleData"]["SpatialGraph",
      "VertexStyle"], 
   EdgeStyle -> 
    ResourceFunction["WolframPhysicsProjectStyleData"]["SpatialGraph",
      "EdgeLineStyle"] ] & /@ {"SpringElectricalEmbedding", 
  "TutteEmbedding", "RadialEmbedding", "DiscreteSpiralEmbedding"}

But getting back to the question of dimension, recall that the area of a circle is πr^2; the volume of a sphere is (4/3)πr^3. In general, the “volume” of the d-dimensional analog of a sphere is a constant multiplied by r^d. But now think about our hypergraph. Start at some point in the hypergraph. Then follow r hyperedges in all possible ways. You’ve effectively made the analog of a “spherical ball” in the hypergraph. Here are examples for graphs corresponding to 2D and 3D lattices:

MakeBallPicture[g_, rmax_] := 
  Module[{gg = UndirectedGraph[g], cg}, cg = GraphCenter[gg]; 
   Table[HighlightGraph[gg, NeighborhoodGraph[gg, cg, r]], {r, 0, 
     rmax}]];
Graph[#, ImageSize -> 60, 
   VertexStyle -> 
    ResourceFunction["WolframPhysicsProjectStyleData"]["SpatialGraph",
      "VertexStyle"], 
   EdgeStyle -> 
    ResourceFunction["WolframPhysicsProjectStyleData"]["SpatialGraph",
      "EdgeLineStyle"] ] & /@ MakeBallPicture[GridGraph[{11, 11}], 7]
MakeBallPicture[g_, rmax_] := 
 Module[{gg = UndirectedGraph[g], cg}, cg = GraphCenter[gg]; 
  Table[HighlightGraph[gg, NeighborhoodGraph[gg, cg, r]], {r, 0, 
    rmax}]]; 
Graph[#, ImageSize -> 80, 
   VertexStyle -> 
    ResourceFunction["WolframPhysicsProjectStyleData"]["SpatialGraph",
      "VertexStyle"], 
   EdgeStyle -> 
    ResourceFunction["WolframPhysicsProjectStyleData"]["SpatialGraph",
      "EdgeLineStyle"] ] & /@ MakeBallPicture[GridGraph[{7, 7, 7}], 5]

And if you now count the number of points reached by going “graph distance r” (i.e. by following r connections in the graph) you’ll find in these two cases that they indeed grow like r² and r³.

So this gives us a way to measure the effective dimension of our hypergraphs. Just start at a particular point and see how many points you reach by going r steps:

gg = UndirectedGraph[
   ResourceFunction["HypergraphToGraph"][
    ResourceFunction[
      "WolframModel"][{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, 
        w}, {z, w}}, {{1, 2}, {1, 3}}, 11, "FinalState"]]];
With[{cg = GraphCenter[gg]}, 
 Table[HighlightGraph[gg, NeighborhoodGraph[gg, cg, r], 
   ImageSize -> 90], {r, 6}]]

Now to work out effective dimension, we in principle just have to fit the results to rd. It’s a bit complicated, though, because we need to avoid small r (where every detail of the hypergraph is going to matter) and large r (where we’re hitting the edge of the hypergraph)—and we also need to think about how our “space” is refining as the underlying system evolves. But in the end we can generate a series of fits for the effective dimension—and in this case these say that the effective dimension is about 2.7:

HypergraphDimensionEstimateList[hg_] := 
  ResourceFunction["LogDifferences"][
   MeanAround /@ 
    Transpose[
     Values[ResourceFunction["HypergraphNeighborhoodVolumes"][hg, All,
        Automatic]]]];
ListLinePlot[
 Select[Length[#] > 3 &][
  HypergraphDimensionEstimateList /@ 
   Drop[ResourceFunction[
      "WolframModel"][{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z,
         w}}, {{1, 2}, {1, 3}}, 16, "StatesList"], 4]], Frame -> True,
  PlotStyle -> {Hue[0.9849884156577183, 0.844661839156126, 0.63801], 
   Hue[0.05, 0.9493847125498949, 0.954757], Hue[
   0.0889039442504032, 0.7504362741954692, 0.873304], Hue[
   0.06, 1., 0.8], Hue[0.12, 1., 0.9], Hue[0.08, 1., 1.], Hue[
   0.98654716551403, 0.6728487861309527, 0.733028], Hue[
   0.04, 0.68, 0.9400000000000001], Hue[
   0.9945149844324427, 0.9892162267509705, 0.823529], Hue[
   0.9908289627180552, 0.4, 0.9]}]
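Here, by the way, is a minimal sketch of this kind of log-difference dimension estimate, using only built-in graph functions, with plain grid graphs standing in for the spatial hypergraph (the particular vertex numbers are just roughly central example vertices):

(* number of vertices within graph distance r of vertex v *)
ballVolume[g_Graph, v_, r_Integer] := VertexCount[NeighborhoodGraph[g, v, r]];

(* crude effective-dimension estimate from the growth V(r) ~ r^d *)
dimensionEstimate[g_Graph, v_, r_Integer] :=
  N[Log[ballVolume[g, v, r + 1]/ballVolume[g, v, r]]/Log[(r + 1)/r]];

(* for a 2D grid the estimates drift toward 2; for a 3D grid, toward 3 *)
dimensionEstimate[GridGraph[{30, 30}], 435, #] & /@ Range[3, 8]
dimensionEstimate[GridGraph[{15, 15, 15}], 1688, #] & /@ Range[2, 5]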

If we do the same thing for

ResourceFunction[
  "WolframModel"][{{1, 2, 2}, {3, 1, 4}} -> {{2, 5, 2}, {2, 3, 
    5}, {4, 5, 5}}, {{0, 0, 0}, {0, 0, 0}}, 200, "FinalStatePlot"]

it’s limiting to dimension 2, as it should:

CenteredDimensionEstimateList[g_Graph] := 
  ResourceFunction["LogDifferences"][
   N[First[Values[
      ResourceFunction["GraphNeighborhoodVolumes"][g, 
       GraphCenter[g]]]]]];
Show[ListLinePlot[
  Table[CenteredDimensionEstimateList[
    UndirectedGraph[
     ResourceFunction["HypergraphToGraph"][
      ResourceFunction[
        "WolframModel"][{{1, 2, 2}, {3, 1, 4}} -> {{2, 5, 2}, {2, 3, 
          5}, {4, 5, 5}}, {{0, 0, 0}, {0, 0, 0}}, t, 
       "FinalState"]]]], {t, 500, 2500, 500}], Frame -> True, 
  PlotStyle -> {Hue[0.9849884156577183, 0.844661839156126, 0.63801], 
    Hue[0.05, 0.9493847125498949, 0.954757], Hue[
    0.0889039442504032, 0.7504362741954692, 0.873304], Hue[
    0.06, 1., 0.8], Hue[0.12, 1., 0.9], Hue[0.08, 1., 1.], Hue[
    0.98654716551403, 0.6728487861309527, 0.733028], Hue[
    0.04, 0.68, 0.9400000000000001], Hue[
    0.9945149844324427, 0.9892162267509705, 0.823529], Hue[
    0.9908289627180552, 0.4, 0.9]}], 
 Plot[2, {r, 0, 50}, PlotStyle -> Dotted]]

What does the fractional dimension mean? Well, consider fractals, which our rules can easily make:

{{x, y, z}} → {{x, u, w}, {y, v, u}, {z, w, v}}

RulePlot[ResourceFunction[
   "WolframModel"][{{1, 2, 3}} -> {{1, 4, 6}, {2, 5, 4}, {3, 6, 5}}]]
ResourceFunction["WolframModelPlot"][#, "MaxImageSize" -> 100] & /@ 
 ResourceFunction[
   "WolframModel"][{{1, 2, 3}} -> {{1, 4, 6}, {2, 5, 4}, {3, 6, 
     5}}, {{0, 0, 0}}, 6, "StatesList"]

If we measure the dimension here we get 1.58—the usual fractal dimension for a Sierpiński structure:

HypergraphDimensionEstimateList[hg_] := 
 ResourceFunction["LogDifferences"][
  MeanAround /@ 
   Transpose[
    Values[ResourceFunction["HypergraphNeighborhoodVolumes"][hg, All, 
      Automatic]]]]; Show[
 ListLinePlot[
  Drop[HypergraphDimensionEstimateList /@ 
    ResourceFunction[
      "WolframModel"][{{1, 2, 3}} -> {{1, 4, 6}, {2, 5, 4}, {3, 6, 
        5}}, {{0, 0, 0}}, 8, "StatesList"], 2], 
  PlotStyle -> {Hue[0.9849884156577183, 0.844661839156126, 0.63801], 
    Hue[0.05, 0.9493847125498949, 0.954757], Hue[
    0.0889039442504032, 0.7504362741954692, 0.873304], Hue[
    0.06, 1., 0.8], Hue[0.12, 1., 0.9], Hue[0.08, 1., 1.], Hue[
    0.98654716551403, 0.6728487861309527, 0.733028], Hue[
    0.04, 0.68, 0.9400000000000001], Hue[
    0.9945149844324427, 0.9892162267509705, 0.823529], Hue[
    0.9908289627180552, 0.4, 0.9]}, Frame -> True, 
  PlotRange -> {0, Automatic}], 
 Plot[Log[2, 3], {r, 0, 150}, PlotStyle -> {Dotted}]]

Our rule above doesn’t create a structure that’s as regular as this. In fact, even though the rule itself is completely deterministic, the structure it makes looks quite random. But what our measurements suggest is that when we keep running the rule it produces something that’s like 2.7-dimensional space.

Of course, 2.7 is not 3, and presumably this particular rule isn’t the one for our particular universe (though it’s not clear what effective dimension it’d have if we ran it 10¹⁰⁰ steps). But the process of measuring dimension shows an example of how we can start making “physics-connectable” statements about the behavior of our rules.

By the way, we’ve been talking about “making space” with our models. But actually, we’re not just trying to make space; we’re trying to make everything in the universe. In standard current physics, there’s space—described mathematically as a manifold—and serving as a kind of backdrop, and then there’s everything that’s in space, all the matter and particles and planets and so on.

But in our models there’s in a sense nothing but space—and in a sense everything in the universe must be “made of space”. Or, put another way, it’s the exact same hypergraph that’s giving us the structure of space, and everything that exists in space.

So what this means is that, for example, a particle like an electron or a photon must correspond to some local feature of the hypergraph, a bit like in this toy example:

Graph[EdgeAdd[
  EdgeDelete[
   NeighborhoodGraph[
    IndexGraph@ResourceFunction["HexagonalGridGraph"][{6, 5}], {42, 
     48, 54, 53, 47, 41}, 4], {30 <-> 29, 42 <-> 41}], {30 <-> 41, 
   42 <-> 29}], 
 VertexSize -> {Small, 
   Alternatives @@ {30, 36, 42, 41, 35, 29} -> Large}, 
 EdgeStyle -> {ResourceFunction["WolframPhysicsProjectStyleData"][
    "SpatialGraph", "EdgeLineStyle"], 
   Alternatives @@ {30 \[UndirectedEdge] 24, 24 \[UndirectedEdge] 18, 
      18 \[UndirectedEdge] 17, 17 \[UndirectedEdge] 23, 
      23 \[UndirectedEdge] 29, 29 \[UndirectedEdge] 35, 
      35 \[UndirectedEdge] 34, 34 \[UndirectedEdge] 40, 
      40 \[UndirectedEdge] 46, 46 \[UndirectedEdge] 52, 
      52 \[UndirectedEdge] 58, 58 \[UndirectedEdge] 59, 
      59 \[UndirectedEdge] 65, 65 \[UndirectedEdge] 66, 
      66 \[UndirectedEdge] 60, 60 \[UndirectedEdge] 61, 
      61 \[UndirectedEdge] 55, 55 \[UndirectedEdge] 49, 
      49 \[UndirectedEdge] 54, 49 \[UndirectedEdge] 43, 
      43 \[UndirectedEdge] 37, 37 \[UndirectedEdge] 36, 
      36 \[UndirectedEdge] 30, 30 \[UndirectedEdge] 41, 
      42 \[UndirectedEdge] 29, 36 \[UndirectedEdge] 42, 
      35 \[UndirectedEdge] 41, 41 \[UndirectedEdge] 47, 
      47 \[UndirectedEdge] 53, 53 \[UndirectedEdge] 54, 
      54 \[UndirectedEdge] 48, 48 \[UndirectedEdge] 42} -> 
    Directive[AbsoluteThickness[2.5], Darker[Red, .2]]}, 
 VertexStyle -> 
  ResourceFunction["WolframPhysicsProjectStyleData"]["SpatialGraph", 
   "VertexStyle"]]

To give a sense of scale, though, I have an estimate that says that 10²⁰⁰ times more “activity” in the hypergraph that represents our universe is going into “maintaining the structure of space” than is going into maintaining all the matter we know exists in the universe.

Curvature in Space & Einstein’s Equations

Here are a few structures that simple examples of our rules make:

GraphicsRow[{ResourceFunction[
    "WolframModel"][{{1, 2, 2}, {1, 3, 4}} -> {{4, 5, 5}, {5, 3, 
      2}, {1, 2, 5}}, {{0, 0, 0}, {0, 0, 0}}, 1000, "FinalStatePlot"],
   ResourceFunction[
    "WolframModel"][{{1, 1, 2}, {1, 3, 4}} -> {{4, 4, 5}, {5, 4, 
      2}, {3, 2, 5}}, {{0, 0, 0}, {0, 0, 0}}, 1000, "FinalStatePlot"],
   ResourceFunction[
    "WolframModel"][{{1, 1, 2}, {3, 4, 1}} -> {{3, 3, 5}, {2, 5, 
      1}, {2, 6, 5}}, {{0, 0, 0}, {0, 0, 0}}, 2000, 
   "FinalStatePlot"]}, ImageSize -> Full]

But while all of these look like surfaces, they’re all obviously different. And one way to characterize them is by their local curvature. Well, it turns out that in our models, curvature is a concept closely related to dimension—and this fact will actually be critical in understanding, for example, how gravity arises.

But for now, let’s talk about how one would measure curvature on a hypergraph. Normally the area of a circle is πr2. But let’s imagine that we’ve drawn a circle on the surface of a sphere, and now we’re measuring the area on the sphere that’s inside the circle:

cappedSphere[angle_] := 
  Module[{u, v}, 
   With[{spherePoint = {Cos[u] Sin[v], Sin[u] Sin[v], Cos[v]}}, 
    Graphics3D[{First@
         ParametricPlot3D[spherePoint, {v, #1, #2}, {u, 0, 2 \[Pi]}, 
          Mesh -> None, ##3] & @@@ {{angle, \[Pi], 
         PlotStyle -> Lighter[Yellow, .5]}, {0, angle, 
         PlotStyle -> Lighter[Red, .3]}}, 
      First@ParametricPlot3D[
        spherePoint /. v -> angle, {u, 0, 2 \[Pi]}, 
        PlotStyle -> Darker@Red]}, Boxed -> False, 
     SphericalRegion -> False, Method -> {"ShrinkWrap" -> True}]]];
Show[GraphicsRow[Riffle[cappedSphere /@ {0.3, Pi/6, .8}, Spacer[30]]],
  ImageSize -> 250]

This area is no longer πr². Instead it’s 2πa²(1 − cos(r/a)), where a is the radius of the sphere. In other words, as the radius of the circle gets bigger, the effect of being on the sphere is ever more important. (On the surface of the Earth, imagine a circle drawn around the North Pole; once it gets to the equator, it can never get any bigger.)

If we generalize to d dimensions, it turns out the formula for the growth rate of the volume is roughly r^d (1 − R r²/(6(d+2)) + …), where R is a mathematical object known as the Ricci scalar curvature.
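As a quick check of that statement in two dimensions (where the Ricci scalar of a sphere of radius a is R = 2/a²), one can just expand the spherical-cap area from above:

(* area inside a geodesic circle of radius r on a sphere of radius a *)
Series[2 Pi a^2 (1 - Cos[r/a]), {r, 0, 4}]
(* πr² - πr⁴/(12a²) + ..., i.e. the flat-space πr² with a correction proportional to R r² *)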

So what this all means is that if we look at the growth rates of spherical balls in our hypergraphs, we can expect two contributions: a leading one of order r^d that corresponds to effective dimension, and a “correction” of order r² that represents curvature.

Here’s an example. Instead of giving a flat estimate of dimension (here equal to 2), we have something that dips down, reflecting the positive (“sphere-like”) curvature of the surface:

res = CloudGet["https://wolfr.am/L1ylk12R"];
GraphicsRow[{ResourceFunction["WolframModelPlot"][
   ResourceFunction[
     "WolframModel"][{{1, 2, 3}, {4, 2, 5}} -> {{6, 3, 1}, {3, 6, 
       4}, {1, 2, 6}}, {{0, 0, 0}, {0, 0, 0}}, 800, "FinalState"]], 
  ListLinePlot[res, Frame -> True, 
   PlotStyle -> {Hue[0.9849884156577183, 0.844661839156126, 0.63801], 
     Hue[0.05, 0.9493847125498949, 0.954757], Hue[
     0.0889039442504032, 0.7504362741954692, 0.873304], Hue[
     0.06, 1., 0.8], Hue[0.12, 1., 0.9], Hue[0.08, 1., 1.], Hue[
     0.98654716551403, 0.6728487861309527, 0.733028], Hue[
     0.04, 0.68, 0.9400000000000001], Hue[
     0.9945149844324427, 0.9892162267509705, 0.823529], Hue[
     0.9908289627180552, 0.4, 0.9]}]}]

What is the significance of curvature? One thing is that it has implications for geodesics. A geodesic is the shortest distance between two points. In ordinary flat space, geodesics are just lines. But when there’s curvature, the geodesics are curved:

(*https://www.wolframcloud.com/obj/wolframphysics/TechPaper-Programs/\
Section-04/Geodesics-01.wl*)
CloudGet["https://wolfr.am/L1PH6Rne"];
hyperboloidGeodesics = Table[
Part[
NDSolve[{Sinh[
         2 u[t]] ((2 Derivative[1][u][t]^2 - Derivative[1][v][t]^2)/(
         2 Cosh[2 u[t]])) + Derivative[2][u][t] == 0, ((2 Tanh[
u[t]]) Derivative[1][u][t]) Derivative[1][v][t] + Derivative[2][v][
        t] == 0, u[0] == -0.9, v[0] == v0, u[1] == 0.9, v[1] == v0}, {
     
u[t], 
v[t]}, {t, 0, 1}, MaxSteps -> Infinity], 1], {v0, 
Range[-0.1, 0.1, 0.025]}];
{SphereGeodesics[Range[-.1, .1, .025]], 
 PlaneGeodesics[Range[-.1, .1, .025]], 
 Show[ParametricPlot3D[{Sinh[u], Cosh[u] Sin[v], 
    Cos[v] Cosh[u]}, {u, -1, 1}, {v, -\[Pi]/3, \[Pi]/3}, 
   Mesh -> False, Boxed -> False, Axes -> False, PlotStyle -> color], 
  ParametricPlot3D[{Sinh[u[t]], Cosh[u[t]] Sin[v[t]], 
       Cos[v[t]] Cosh[u[t]]} /. #, {t, 0, 1}, PlotStyle -> Red] & /@ 
   hyperboloidGeodesics, ViewAngle -> 0.3391233203265557`, 
  ViewCenter -> {{0.5`, 0.5`, 0.5`}, {0.5265689095305934`, 
     0.5477310383268459`}}, 
  ViewPoint -> {1.7628482856617167`, 0.21653966523483362`, 
    2.8801868854502355`}, 
  ViewVertical -> {-0.1654573174671554`, 0.1564093539158781`, 
    0.9737350718261054`}]}

In the case of positive curvature, bundles of geodesics converge; for negative curvature they diverge. But, OK, even though geodesics were originally defined for continuous space (actually, as the name suggests, for paths on the surface of the Earth), one can also have them in graphs (and hypergraphs). And it’s the same story: the geodesic is the shortest path between two points in the graph (or hypergraph).
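In Wolfram Language terms a graph geodesic is just what FindShortestPath gives; here is a small stand-in example on a plain grid graph (the endpoint vertices are arbitrary):

(* highlight a geodesic between two opposite corners of a grid graph *)
g = GridGraph[{10, 10}];
HighlightGraph[g, PathGraph[FindShortestPath[g, 1, 100]]]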

Here are geodesics on the “positive-curvature surface” created by one of our rules:

findShortestPath[edges_, endpoints : {{_, _} ...}] := 
  FindShortestPath[
     Catenate[Partition[#, 2, 1, 1] & /@ edges], #, #2] & @@@ 
   endpoints;
pathEdges[edges_, path_] := 
  Select[Count[Alternatives @@ path]@# >= 2 &]@edges;
plotGeodesic[edges_, endpoints : {{_, _} ...}, o : OptionsPattern[]] := 
  With[{vertexPaths = findShortestPath[edges, endpoints]}, 
   ResourceFunction["WolframModelPlot"][edges, o, 
    GraphHighlight -> Catenate[vertexPaths], 
    EdgeStyle -> <|
      Alternatives @@ Catenate[pathEdges[edges, #] & /@ vertexPaths] -> 
       Directive[AbsoluteThickness[4], Red]|>]];
plotGeodesic[edges_, endpoints : {__ : Except@List}, 
   o : OptionsPattern[]] := plotGeodesic[edges, {endpoints}, o];
plotGeodesic[
 ResourceFunction[
   "WolframModel"][{{1, 2, 3}, {4, 2, 5}} -> {{6, 3, 1}, {3, 6, 
     4}, {1, 2, 6}}, Automatic, 1000, 
  "FinalState"], {{123, 721}, {24, 552}, {55, 671}}, 
 VertexSize -> 0.12]

And here they are for a more complicated structure:

(*https://www.wolframcloud.com/obj/wolframphysics/TechPaper-Programs/\
Section-04/Geodesics-01.wl*)

CloudGet["https://wolfr.am/L1PH6Rne"];(*Geodesics*)

gtest = UndirectedGraph[
   Rule @@@ 
    ResourceFunction[
      "WolframModel"][{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z,
         w}}, {{1, 2}, {1, 3}}, 10, "FinalState"], Sequence[
   VertexStyle -> ResourceFunction["WolframPhysicsProjectStyleData"][
     "SpatialGraph", "VertexStyle"], 
    EdgeStyle -> ResourceFunction["WolframPhysicsProjectStyleData"][
     "SpatialGraph", "EdgeLineStyle"]] ];
Geodesics[gtest, #] & /@ {{{79, 207}}, {{143, 258}}}

Why are geodesics important? One reason is that in Einstein’s general relativity they’re the paths that light (or objects in “free fall”) follows in space. And in that theory gravity is associated with curvature in space. So when something is deflected going around the Sun, that happens because space around the Sun is curved, so the geodesic the object follows is also curved.

General relativity’s description of curvature in space turns out to be based on the Ricci scalar curvature R that we encountered above (as well as the slightly more sophisticated Ricci tensor). So if we want to find out whether our models reproduce Einstein’s equations for gravity, we basically have to find out whether the Ricci curvatures that arise from our hypergraphs behave as the theory implies.

There’s quite a bit of mathematical sophistication involved (for example, we have to consider curvature in space+time, not just space), but the bottom line is that, yes, in various limits, and subject to various assumptions, our models do indeed reproduce Einstein’s equations. (At first, we’re just reproducing the vacuum Einstein equations, appropriate when there’s no matter involved; when we discuss matter, we’ll see that we actually get the full Einstein equations.)

It’s a big deal to reproduce Einstein’s equations. Normally in physics, Einstein’s equations are what you start from (or sometimes they arise as a consistency condition for a theory): here they’re what comes out as an emergent feature of the model.

It’s worth saying a little about how the derivation works. It’s actually somewhat analogous to the derivation of the equations of fluid flow from the limit of the underlying dynamics of lots of discrete molecules. But in this case, it’s the structure of space rather than the velocity of a fluid that we’re computing. It involves some of the same kinds of mathematical approximations and assumptions, though. One has to assume, for example, that there’s enough effective randomness generated in the system that statistical averages work. There is also a whole host of subtle mathematical limits to take. Distances have to be large compared to individual hypergraph connections, but small compared to the whole size of the hypergraph, etc.

It’s pretty common for physicists to “hack through” the mathematical niceties. That’s actually happened for nearly a century in the case of deriving fluid equations from molecular dynamics. And we’re definitely guilty of the same thing here. Which in a sense is another way of saying that there’s lots of nice mathematics to do in actually making the derivation rigorous, and understanding exactly when it’ll apply, and so on.

By the way, when it comes to mathematics, even the setup that we have is interesting. Calculus has been built to work in ordinary continuous spaces (manifolds that locally approximate Euclidean space). But what we have here is something different: in the limit of an infinitely large hypergraph, it’s like a continuous space, but ordinary calculus doesn’t work on it (not least because it isn’t necessarily integer-dimensional). So to really talk about it well, we have to invent something that’s kind of a generalization of calculus, that’s for example capable of dealing with curvature in fractional-dimensional space. (Probably the closest current mathematics to this is what’s been coming out of the very active field of geometric group theory.)

It’s worth noting, by the way, that there’s a lot of subtlety in the precise tradeoff between changing the dimension of space, and having curvature in it. And while we think our universe is three-dimensional, it’s quite possible according to our models that there are at least local deviations—and most likely there were actually large deviations in the early universe.

Time

In our models, space is defined by the large-scale structure of the hypergraph that represents our collection of abstract relations. But what then is time?

For the past century or so, it’s been pretty universally assumed in fundamental physics that time is in a sense “just like space”—and that one should for example lump space and time together and talk about the “spacetime continuum”. And certainly the theory of relativity points in this direction. But if there’s been one “wrong turn” in the history of physics in the past century, I think it’s the assumption that space and time are the same kind of thing. And in our models they’re not—even though, as we’ll see, relativity comes out just fine.

So what then is time? In effect it’s much as we experience it: the inexorable process of things happening and leading to other things. But in our models it’s something much more precise: it’s the progressive application of rules, that continually modify the abstract structure that defines the contents of the universe.

The version of time in our models is in a sense very computational. As time progresses we are in effect seeing the results of more and more steps in a computation. And indeed the phenomenon of computational irreducibility implies that there is something definite and irreducible “achieved” by this process. (And, for example, this irreducibility is what I believe is responsible for the “encrypting” of initial conditions that is associated with the law of entropy increase, and the thermodynamic arrow of time.) Needless to say, of course, our modern computational paradigm did not exist a century ago when “spacetime” was introduced, and perhaps if it had, the history of physics might have been very different.

But, OK, so in our models time is just the progressive application of rules. But there is a subtlety in exactly how this works that might at first seem like a detail, but that actually turns out to be huge, and in fact turns out to be the key to both relativity and quantum mechanics.

At the beginning of this piece, I talked about the rule

{{x, y}, {x, z}} → {{x, z}, {x, w}, {y, w}, {z, w}}

RulePlot[ResourceFunction[
   "WolframModel"][{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z, 
     w}}], VertexLabels -> Automatic, "RulePartsAspectRatio" -> 0.55]

and showed the “first few steps” in applying it

ResourceFunction["WolframModelPlot"] /@ 
 ResourceFunction[
   "WolframModel"][{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z, 
     w}}, {{1, 2}, {2, 3}, {3, 4}, {2, 4}}, 4, "StatesList"]

But how exactly did the rule get applied? What is “inside” these steps? The rule defines how to take two connections in the hypergraph (which in this case is actually just a graph) and transform them into four new connections, creating a new element in the process. So each “step” that we showed before actually consists of several individual “updating events” (where here newly added connections are highlighted, and ones that are about to be removed are dashed):

With[{eo = 
   ResourceFunction[
     "WolframModel"][{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z, 
       w}}, {{1, 2}, {2, 3}, {3, 4}, {2, 4}}, 4]}, 
 TakeList[eo["EventsStatesPlotsList", ImageSize -> 130], 
  eo["GenerationEventsCountList", 
   "IncludeBoundaryEvents" -> "Initial"]]]

But now, here is the crucial point: this is not the only sequence of updating events consistent with the rule. The rule just says to find two adjacent connections, and if there are several possible choices, it says nothing about which one. And a crucial idea in our model is in a sense just to do all of them.

We can represent this with a graph that shows all possible paths:

CloudGet["https://wolfr.am/LmHho8Tr"]; (*newgraph*)newgraph[
 Graph[ResourceFunction["MultiwaySystem"][
   "WolframModel" -> {{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z,
         w}}}, {{{1, 2}, {2, 3}, {3, 4}, {2, 4}}}, 3, "StatesGraph", 
   VertexSize -> 3, PerformanceGoal -> "Quality"], 
  AspectRatio -> 1/2], {3, 0.7}]

For the very first update, there are two possibilities. Then for each of the results of these, there are four additional possibilities. But at the next update, something important happens: two of the branches merge. In other words, even though we have done a different sequence of updates, the outcome is the same.

Things rapidly get complicated. Here is the graph after one more update, now no longer trying to show a progression down the page:

Graph[ResourceFunction["MultiwaySystem"][
  "WolframModel" -> {{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z, 
       w}}}, {{{1, 2}, {2, 3}, {3, 4}, {2, 4}}}, 4, "StatesGraph", 
  VertexSize -> 3, PerformanceGoal -> "Quality"]]

So how does this relate to time? What it says is that in the basic statement of the model there is not just one path of time; there are many paths, and many “histories”. But the model—and the rule that is used—determines all of them. And we have seen a hint of something else: that even if we might think we are following an “independent” path of history, it may actually merge with another path.

It will take some more discussion to explain how this all works. But for now let me say that what will emerge is that time is about causal relationships between things, and that in fact, even when the paths of history that are followed are different, these causal relationships can end up being the same—and that in effect, to an observer embedded in the system, there is still just a single thread of time.

The Graph of Causal Relationships

In the end it’s wonderfully elegant. But to get to the point where we can understand the elegant bigger picture we need to go through some detailed things. (It isn’t terribly surprising that a fundamental theory of physics—inevitably built on very abstract ideas—is somewhat complicated to explain, but so it goes.)

To keep things tolerably simple, I’m not going to talk directly about rules that operate on hypergraphs. Instead I’m going to talk about rules that operate on strings of characters. (To clarify: these are not the strings of string theory—although in a bizarre twist of “pun-becomes-science” I suspect that the continuum limit of the operations I discuss on character strings is actually related to string theory in the modern physics sense.)

OK, so let’s say we have the rule:

{A → BBB, BB → A}

This rule says that anywhere we see an A, we can replace it with BBB, and anywhere we see BB we can replace it with A. So now we can generate what we call the multiway system for this rule, and draw a “multiway graph” that shows everything that can happen:

ResourceFunction["MultiwaySystem"][{"A" -> "BBB", 
  "BB" -> "A"}, {"A"}, 8, "StatesGraph"]

At the first step, the only possibility is to use A → BBB to replace the A with BBB. But then there are two possibilities: replace either the first BB or the second BB—and these choices give different results. On the next step, though, all that can be done is to replace the A—in both cases giving BBBB.

So in other words, even though we in a sense had two paths of history that diverged in the multiway system, it took only one step for them to converge again. And if you trace through the picture above you’ll find out that’s what always happens with this rule: every pair of branches that is produced always merges, in this case after just one more step.
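A minimal way to reproduce this branching and merging with nothing but built-in string functions (rather than the MultiwaySystem function used to draw the pictures here) is to apply the rules at every possible position at each step:

(* one multiway step: all possible single replacements, applied to every current state *)
multiwayStep[rules_][states_] := Union @@ (StringReplaceList[#, rules] & /@ states);

NestList[multiwayStep[{"A" -> "BBB", "BB" -> "A"}], {"A"}, 4]
(* {{"A"}, {"BBB"}, {"AB", "BA"}, {"BBBB"}, {"ABB", "BAB", "BBA"}} *)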

This kind of balance between branching and merging is a phenomenon I call “causal invariance”. And while it might seem like a detail here, it actually turns out that it’s at the core of why relativity works, why there’s a meaningful objective reality in quantum mechanics, and a host of other core features of fundamental physics.

But let’s explain why I call the property causal invariance. The picture above just shows what “state” (i.e. what string) leads to what other one. But at the risk of making the picture more complicated (and note that this is incredibly simple compared to the full hypergraph case), we can annotate the multiway graph by including the updating events that lead to each transition between states:

LayeredGraphPlot[
 ResourceFunction["MultiwaySystem"][{"A" -> "BBB", 
   "BB" -> "A"}, {"A"}, 8, "EvolutionEventsGraph"], AspectRatio -> 1]

But now we can ask the question: what are the causal relationships between these events? In other words, what event needs to happen before some other event can happen? Or, said another way, what events must have happened in order to create the input that’s needed for some other event?

Let us go even further, and annotate the graph above by showing all the causal dependencies between events:

LayeredGraphPlot[
 ResourceFunction["MultiwaySystem"][{"A" -> "BBB", 
   "BB" -> "A"}, {"A"}, 7, "EvolutionCausalGraph"], AspectRatio -> 1]

The orange lines in effect show which event has to happen before which other event—or what all the causal relationships in the multiway system are. And, yes, it’s complicated. But note that this picture shows the whole multiway system—with all possible paths of history—as well as the whole network of causal relationships within and between these paths.

But here’s the crucial thing about causal invariance: it implies that actually the graph of causal relationships is the same regardless of which path of history is followed. And that’s why I originally called this property “causal invariance”—because it says that with a rule like this, the causal properties are invariant with respect to different choices of the sequence in which updating is done.

And if one traced through the picture above (and went quite a few more steps), one would find that for every path of history, the causal graph representing causal relationships between events would always be:

ResourceFunction["SubstitutionSystemCausalGraph"][{"A" -> "BBB", 
   "BB" -> "A"}, "A", 10] // LayeredGraphPlot

or, drawn differently,

ResourceFunction["SubstitutionSystemCausalGraph"][{"A" -> "BBB", 
  "BB" -> "A"}, "A", 12]

The Importance of Causal Invariance

To understand more about causal invariance, it’s useful to look at an even simpler example: the case of the rule BA → AB. This rule says that any time there’s a B followed by an A in a string, swap these characters around. In other words, this is a rule that tries to sort a string into alphabetical order, two characters at a time.

Let’s say we start with BBBAAA. Then here’s the multiway graph that shows all the things that can happen according to the rule:

Graph[ResourceFunction["MultiwaySystem"][{"BA" -> "AB"}, "BBBAAA", 12,
    "EvolutionEventsGraph"], AspectRatio -> 1.5] // LayeredGraphPlot

There are lots of different paths that can be followed, depending on which BA in the string the rule is applied to at each step. But the important thing we see is that at the end all the paths merge, and we get a single final result: the sorted string AAABBB. And the fact that we get this single final result is a consequence of the causal invariance of the rule. In a case like this where there’s a final result (as opposed to just evolving forever), causal invariance basically says: it doesn’t matter what order you do all the updates in; the result you’ll get will always be the same.
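Here is a minimal sketch of that statement: apply the rule at a randomly chosen position until nothing more can be done, and check that the final string never depends on the random choices (the starting string is just an arbitrary example):

(* apply BA -> AB at one randomly chosen applicable position *)
sortStep[s_String] := With[{pos = StringPosition[s, "BA"]},
  If[pos === {}, s, StringReplacePart[s, "AB", RandomChoice[pos]]]];

(* 20 different random update orders, always the same final result *)
Union[Table[FixedPoint[sortStep, "BBAABABA"], 20]]
(* {"AAAABBBB"} *)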

I’ve introduced causal invariance in the context of trying to find a model of fundamental physics—and I’ve said that it’s going to be critical to both relativity and quantum mechanics. But actually what amounts to causal invariance has been seen before in various different guises in mathematics, mathematical logic and computer science. (Its most common name is “confluence”, though there are some technical differences between this and what I call causal invariance.)

Think about expanding out an algebraic expression, like (x + (1 + x)²)(x + 2)². You could expand one of the powers first, then multiply things out. Or you could multiply the terms first. It doesn’t matter what order you do the steps in; you’ll always get the same canonical form (which in this case Mathematica tells me is 4 + 16x + 17x² + 7x³ + x⁴). And this independence of orders is essentially causal invariance.
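The corresponding one-line check:

Expand[(x + (1 + x)^2) (x + 2)^2]
(* 4 + 16 x + 17 x^2 + 7 x^3 + x^4 *)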

Here’s one more example. Imagine you’ve got some recursive definition, say f[n_]:=f[n-1]+f[n-2] (with f[1]=f[2]=1). Now evaluate f[10]. First you get f[9]+f[8]. But what do you do next? Do you evaluate f[9], or f[8]? And then what? In the end, it doesn’t matter; you’ll always get 55. And this is another example of causal invariance.
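And again the check is immediate; however the sub-evaluations get interleaved, the answer is the same:

f[1] = f[2] = 1;
f[n_] := f[n - 1] + f[n - 2];
f[10]
(* 55 *)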

When one thinks about parallel or asynchronous algorithms, it matters a great deal whether one has causal invariance. Because it means one can do things in any order—say, depth-first, breadth-first, or whatever—and one will always get the same answer. And that’s what’s happening in our little sorting algorithm above.

OK, but now let’s come back to causal relationships. Here’s the multiway system for the sorting process annotated with all causal relationships for all paths:

Magnify[LayeredGraphPlot[
  ResourceFunction["MultiwaySystem"][{"BA" -> "AB"}, "BBBAAA", 12, 
   "EvolutionCausalGraph"], AspectRatio -> 1.5], .6]

And, yes, it’s a mess. But because there’s causal invariance, we know something very important: this is basically just a lot of copies of the same causal graph—a simple grid:

centeredRange[n_] := # - Mean@# &@Range@n;
centeredLayer[n_] := {#, n} & /@ centeredRange@n;
diamondLayerSizes[layers_?OddQ] := 
  Join[#, Reverse@Most@#] &@Range[(layers + 1)/2];
diamondCoordinates[layers_?OddQ] := 
  Catenate@MapIndexed[
    Thread@{centeredRange@#, (layers - First@#2)/2} &, 
    diamondLayerSizes[layers]];
diamondGraphLayersCount[graph_] := 2 Sqrt[VertexCount@graph] - 1;
With[{graph = 
   ResourceFunction["SubstitutionSystemCausalGraph"][{"BA" -> "AB"}, 
    "BBBBAAAA", 12]}, 
 Graph[graph, 
  VertexCoordinates -> 
   diamondCoordinates@diamondGraphLayersCount@graph, VertexSize -> .2]]

(By the way—as the picture suggests—the cross-connections between these copies aren’t trivial, and later on we’ll see they’re associated with deep relations between relativity and quantum mechanics, that probably manifest themselves in the physics of black holes. But we’ll get to that later…)

OK, so every different way of applying the sorting rule is supposed to give the same causal graph. So here’s one example of how we might apply the rule starting with a particular initial string:

evo = (SeedRandom[2424]; 
   ResourceFunction[
     "SubstitutionSystemCausalEvolution"][{"BA" -> "AB"}, 
    "BBAAAABAABBABBBBBAAA", 15, {"Random", 4}]);
ResourceFunction["SubstitutionSystemCausalPlot"][evo, 
 EventLabels -> False, CellLabels -> True, CausalGraph -> False]

But now let’s show the graph of causal connections. And we see it’s just a grid:

evo = (SeedRandom[2424]; 
   ResourceFunction[
     "SubstitutionSystemCausalEvolution"][{"BA" -> "AB"}, 
    "BBAAAABAABBABBBBBAAA", 15, {"Random", 4}]);
ResourceFunction["SubstitutionSystemCausalPlot"][evo, 
 EventLabels -> False, CellLabels -> False, CausalGraph -> True]

Here are three other possible sequences of updates:

SeedRandom[242444]; GraphicsRow[
 Table[ResourceFunction["SubstitutionSystemCausalPlot"][
   ResourceFunction[
     "SubstitutionSystemCausalEvolution"][{"BA" -> "AB"}, 
    "BBAAAABAABBABBBBBAAA", 15, {"Random", 4}], EventLabels -> False, 
   CellLabels -> False, CausalGraph -> True], 3], ImageSize -> Full]

But now we see causal invariance in action: even though different updates occur at different times, the graph of causal relationships between updating events is always the same. And having seen this—in the context of a very simple example—we’re ready to talk about special relativity.

Deriving Special Relativity

It’s a typical first instinct in thinking about doing science: you imagine doing an experiment on a system, but you—as the “observer”—are outside the system. Of course if you’re thinking about modeling the whole universe and everything in it, this isn’t ultimately a reasonable way to think about things. Because the “observer” is inevitably part of the universe, and so has to be modeled just like everything else.

In our models what this means is that the “mind of the observer”, just like everything else in the universe, has to get updated through a series of updating events. There’s no absolute way for the observer to “know what’s going on in the universe”; all they ever experience is a series of updating events, that may happen to be affected by updating events occurring elsewhere in the universe. Or, said differently, all the observer can ever observe is the network of causal relationships between events—or the causal graph that we’ve been talking about.

So as a toy model let’s look at our BA → AB rule for strings. We might imagine that the string is laid out in space. But to our observer the only thing they know is the causal graph that represents causal relationships between events. And for the BA → AB system here’s one way we can draw that:

CloudGet["https://wolfr.am/KVkTxvC5"]; (*regularCausalGraphPlot*)

CloudGet["https://wolfr.am/KVl97Tf4"];(*lorentz*)
\
regularCausalGraphPlot[10, {0, 0}, {0.0, 0.0}, lorentz[0]]

But now let’s think about how observers might “experience” this causal graph. Underneath, an observer is getting updated by some sequence of updating events. But even though that’s “really what’s going on”, to make sense of it, we can imagine our observers setting up internal “mental” models for what they see. And a pretty natural thing for observers like us to do is just to say “one set of things happens all across the universe, then another, and so on”. And we can translate this into saying that we imagine a series of “moments” in time, where things happen “simultaneously” across the universe—at least with some convention for defining what we mean by simultaneously. (And, yes, this part of what we’re doing is basically following what Einstein did when he originally proposed special relativity.)

Here’s a possible way of doing it:

CloudGet["https://wolfr.am/KVkTxvC5"]; (*regularCausalGraphPlot*)

CloudGet["https://wolfr.am/KVl97Tf4"];(*lorentz*)
\
regularCausalGraphPlot[10, {1, 0}, {0.0, 0.0}, lorentz[0]]

One can describe this as a “foliation” of the causal graph. We’re dividing the causal graph into leaves or slices. And each slice our observers can consider to be a “successive moment in time”.

It’s important to note that there are some constraints on the foliation we can pick. The causal graph defines what event has to happen before what. And if our observers are going to have a chance of making sense of the world, it had better be the case that their notion of the progress of time aligns with what the causal graph says. So for example this foliation wouldn’t work—because basically it says that the time we assign to events is going to disagree with the order in which the causal graph says they have to happen:

CloudGet["https://wolfr.am/KVkTxvC5"]; (*regularCausalGraphPlot*)

CloudGet["https://wolfr.am/KVl97Tf4"];(*lorentz*)
\
regularCausalGraphPlot[6, {.2, 0}, {5, 0.0}, lorentz[0]]

But, so given the foliation above, what actual order of updating events does it imply? It basically just says: as many events as possible happen at the same time (i.e. in the same slice of the foliation), as in this picture:

ResourceFunction["SubstitutionSystemCausalPlot"]
&#10005
(*https://www.wolframcloud.com/obj/wolframphysics/TechPaper-Programs/\
Section-08/BoostedEvolution.wl*)
CloudGet["https://wolfr.am/LbaDFVSn"]; (*boostedEvolution*) \
ResourceFunction["SubstitutionSystemCausalPlot"][
  boostedEvolution[
   ResourceFunction[
     "SubstitutionSystemCausalEvolution"][{"BA" -> "AB"}, 
    StringRepeat["BA", 10], 10], 0], EventLabels -> False, 
  CellLabels -> True, CausalGraph -> False]

OK, now let’s connect this to physics. The foliation we had above is relevant to observers who are somehow “stationary with respect to the universe” (the “cosmological rest frame”). One can imagine that as time progresses, the events a particular observer experiences are ones in a column going vertically down the page:

CloudGet["https://wolfr.am/KVkTxvC5"]; (*regularCausalGraphPlot*)

CloudGet["https://wolfr.am/KVl97Tf4"];(*lorentz*)
\
regularCausalGraphPlot[5, {1, 0.01}, {0.0, 0.0}, {1.5, 0}, {Red, 
  Directive[Dotted, Thick, Red]}, lorentz[0]]

But now let’s think about an observer who is uniformly moving in space. They’ll experience a different sequence of events, say:

CloudGet["https://wolfr.am/KVkTxvC5"]; (*regularCausalGraphPlot*)

CloudGet["https://wolfr.am/KVl97Tf4"];(*lorentz*)
\
regularCausalGraphPlot[5, {1, 0.01}, {0.0, 0.3}, {0.6, 0}, {Red, 
  Directive[Dotted, Thick, Red]}, lorentz[0]]

And that means that the foliation they’ll naturally construct will be different. From the “outside” we can draw it on the causal graph like this:

CloudGet["https://wolfr.am/KVkTxvC5"]; (*regularCausalGraphPlot*)

CloudGet["https://wolfr.am/KVl97Tf4"];(*lorentz*)
\
regularCausalGraphPlot[10, {1, 0.01}, {0.3, 0.3}, {0, 0}, {Red, 
  Directive[Dotted, Thick, Red]}, lorentz[0.]]

But to the observer each slice just represents a successive moment of time. And they don’t have any way to know how the causal graph was drawn. So they’ll construct their own version, where the slices are horizontal:

CloudGet["https://wolfr.am/KVkTxvC5"]; (*regularCausalGraphPlot*)

CloudGet["https://wolfr.am/KVl97Tf4"];(*lorentz*)
\
regularCausalGraphPlot[10, {1, 0.01}, {0.3, 0.3}, {0, 0}, {Red, 
  Directive[Dotted, Thick, Red]}, lorentz[0.3]]

But now there’s a purely geometrical fact: to make this rearrangement, while preserving the basic structure (and here, angles) of the causal graph, each moment of time has to sample fewer events in the causal graph, by a factor of √(1 − β²), where β is the angle that represents the velocity of the observer.

If you know about special relativity, you’ll recognize a lot of this. What we’ve been calling foliations correspond directly to relativity’s “reference frames”. And our foliations that represent motion are the standard inertial reference frames of special relativity.

But here’s the special thing that’s going on here: we can interpret all this discussion of foliations and reference frames in terms of the actual rules and evolution of our underlying system. So here now is the evolution of our string-sorting system in the “boosted reference frame” corresponding to an observer going at a certain speed:

(*https://www.wolframcloud.com/obj/wolframphysics/TechPaper-Programs/\
Section-08/BoostedEvolution.wl*)
CloudGet["https://wolfr.am/LbaDFVSn"]; (*boostedEvolution*) \
ResourceFunction["SubstitutionSystemCausalPlot"][
  boostedEvolution[
   ResourceFunction[
     "SubstitutionSystemCausalEvolution"][{"BA" -> "AB"}, 
    StringRepeat["BA", 10], 10], 0.3], EventLabels -> False, 
  CellLabels -> True, CausalGraph -> False]

And here’s the crucial point: because of causal invariance it doesn’t matter that we’re in a different reference frame—the causal graph for the system (and the way it eventually sorts the string) is exactly the same.

In special relativity, the key idea is that the “laws of physics” work the same in all inertial reference frames. But why should that be true? Well, in our systems, there’s an answer: it’s a consequence of causal invariance in the underlying rules. In other words, from the property of causal invariance, we’re able to derive relativity.

Normally in physics one puts in relativity by the way one sets up the mathematical structure of spacetime. But in our models we don’t start from anything like this, and in fact space and time are not even at all the same kind of thing. But what we can now see is that—because of causal invariance—relativity emerges in our models, with all the relationships between space and time that that implies.

So, for example, if we look at the picture of our string-sorting system above, we can see relativistic time dilation. In effect, because of the foliation we picked, time operates slower. Or, said another way, in the effort to sample space faster, our observer experiences slower updating of the system in time.

The speed of light c in our toy system is defined by the maximum rate at which information can propagate, which is determined by the rule, and in the case of this rule is one character per step. And in terms of this, we can then say that our foliation corresponds to a speed 0.3 c. But now we can look at the amount of time dilation, and it’s exactly the amount that relativity says it should be.
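As a quick numerical check (using the standard special-relativistic formula, which is what the foliation argument reproduces): at 0.3 c each slice samples a factor √(1 − 0.3²) ≈ 0.95 as many events, so the observer sees the system updated about 5% more slowly:

(* fraction of events sampled per slice at 0.3 c, and the corresponding time-dilation factor *)
{Sqrt[1 - 0.3^2], 1/Sqrt[1 - 0.3^2]}
(* {0.953939, 1.04828} *)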

By the way, if we imagine trying to make our observer go “faster than light”, we can see that can’t work. Because there’s no way to tip the foliation at more than 45° in our picture, and still maintain the causal relationships implied by the causal graph.

OK, so in our toy model we can derive special relativity. But here’s the thing: this derivation isn’t specific to the toy model; it applies to any rule that has causal invariance. So even though we may be dealing with hypergraphs, not strings, and we may have a rule that shows all kinds of complicated behavior, if it ultimately has causal invariance, then (with various technical caveats, mostly about possible wildness in the causal graph) it will exhibit relativistic invariance, and a physics based on it will follow special relativity.

What Is Energy? What Is Mass?

In our model, everything in the universe—space, matter, whatever—is supposed to be represented by features of our evolving hypergraph. So within that hypergraph, is there a way to identify things that are familiar from current physics, like mass, or energy?

I have to say that although it’s a widespread concept in current physics, I’d never thought of energy as something fundamental. I’d just thought of it as an attribute that things (atoms, photons, whatever) can have. I never really thought of it as something that one could identify abstractly in the very structure of the universe.

So it came as a big surprise when we recently realized that actually in our model, there is something we can point to, and say “that’s energy!”, independent of what it’s the energy of. The technical statement is: energy corresponds to the flux of causal edges through spacelike hypersurfaces. And, by the way, momentum corresponds to the flux of causal edges through timelike hypersurfaces.

OK, so what does this mean? First, what’s a spacelike hypersurface? It’s actually a standard concept in general relativity, for which there’s a direct analogy in our models. Basically it’s what forms a slice in our foliation. Why is it called what it’s called? We can identify two kinds of directions: spacelike and timelike.

A spacelike direction is one that involves just moving in space—and it’s a direction where one can always reverse and go back. A timelike direction is one that involves also progressing through time—where one can’t go back. We can mark spacelike (solid red lines) and timelike (dashed red lines) hypersurfaces in the causal graph for our toy model:

CloudGet["https://wolfr.am/KVkTxvC5"]; (*regularCausalGraphPlot*)

CloudGet["https://wolfr.am/KVl97Tf4"];(*lorentz*)
\
regularCausalGraphPlot[10, {1, 0.5}, {0., 0.}, {-0.5, 0}, {Red, 
  Directive[Dashed, Red]}, lorentz[0.]]

(They might be called “surfaces”, except that “surfaces” are usually thought of as 2-dimensional, and in our 3-space + 1-time dimensional universe these foliation slices are 3-dimensional: hence the term “hypersurfaces”.)

OK, now let’s look at the picture. The “causal edges” are the causal connections between events, shown in the picture as lines joining the events. So when we talk about a “flux of causal edges through spacelike hypersurfaces”, what we’re talking about is the net number of causal edges that go down through the horizontal slices in the pictures.

In the toy model that’s trivial to see. But here’s a causal graph from a simple hypergraph model, where it’s already considerably more complicated:

Graph[ResourceFunction[
   "WolframModel"][  {{x, y}, {z, y}} -> {{x, z}, {y, z}, {w, 
     z}}, {{0, 0}, {0, 0}}, 15, "LayeredCausalGraph"], 
 AspectRatio -> 1/2]

(Our toy-model causal graph starts from a line of events because we set up a long string as the initial condition; this starts from a single event because it’s starting from a minimal initial condition.)

But when we put a foliation on this causal graph (thereby effectively defining our reference frame) we can start counting how many causal edges go down through successive (“spacelike”) slices:

foliationLines[{lineDensityHorizontal_ : 1, 
    lineDensityVertical_ : 1}, {tanHorizontal_ : 0.0, 
    tanVertical_ : 0.0}, offset : {_, _} : {0, 0}, 
   lineStyles : {_, _} : {Red, Red}, 
   transform_ : (# &)] := {If[lineDensityHorizontal != 0, 
    Style[Table[
      Line[transform /@ {{-100 + First@offset, 
          k - 100 tanHorizontal + Last@offset}, {100 + First@offset, 
          k + 100 tanHorizontal + Last@offset}}], {k, -100.5, 100.5, 
       1/lineDensityHorizontal}], First@lineStyles], {}], 
   If[lineDensityVertical != 0, 
    Style[Table[
      Line[transform /@ {{k - 100 tanVertical + First@offset, -100 + 
           Last@offset}, {k + 100 tanVertical + First@offset, 
          100 + Last@offset}}], {k, -100.5, 100.5, 
       1/lineDensityVertical}], Last@lineStyles], {}]};
ResourceFunction[
   "WolframModel"][{{x, y}, {z, y}} -> {{x, z}, {y, z}, {w, z}}, {{0, 
    0}, {0, 0}}, 15]["LayeredCausalGraph", AspectRatio -> 1/2, 
 Epilog -> 
  foliationLines[{0.44, 0}, {0, 
    0}, {0, -0.5}, {Directive[Red, Opacity[0.2]], Red}]]

We can also ask how many causal edges go “sideways”, through timelike hypersurfaces:

foliationLines[{lineDensityHorizontal_ : 1, 
    lineDensityVertical_ : 1}, {tanHorizontal_ : 0.0, 
    tanVertical_ : 0.0}, offset : {_, _} : {0, 0}, 
   lineStyles : {_, _} : {Red, Red}, 
   transform_ : (# &)] := {If[lineDensityHorizontal != 0, 
    Style[Table[
      Line[transform /@ {{-100 + First@offset, 
          k - 100 tanHorizontal + Last@offset}, {100 + First@offset, 
          k + 100 tanHorizontal + Last@offset}}], {k, -100.5, 100.5, 
       1/lineDensityHorizontal}], First@lineStyles], {}], 
   If[lineDensityVertical != 0, 
    Style[Table[
      Line[transform /@ {{k - 100 tanVertical + First@offset, -100 + 
           Last@offset}, {k + 100 tanVertical + First@offset, 
          100 + Last@offset}}], {k, -100.5, 100.5, 
       1/lineDensityVertical}], Last@lineStyles], {}]};
ResourceFunction[
   "WolframModel"][{{x, y}, {z, y}} -> {{x, z}, {y, z}, {w, z}}, {{0, 
    0}, {0, 0}}, 15]["LayeredCausalGraph", AspectRatio -> 1/2, 
 Epilog -> 
  foliationLines[{0, 1/3}, {0, 0}, {2.1, 
    0}, {Directive[Red, Opacity[0.5]], 
    Directive[Dotted, Opacity[0.7], Red]}]]
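Here, as a minimal sketch with hypothetical events e[t, x] (each causally affecting two events on the next step), is what such counting amounts to; the flux through the slice between steps t and t + 1 is just the number of causal edges that cross it:

(* toy causal edges: event e[t, x] causally affects e[t + 1, x] and e[t + 1, x + 1] *)
causalEdges = Flatten[
   Table[{e[t, x] -> e[t + 1, x], e[t, x] -> e[t + 1, x + 1]}, {t, 0, 4}, {x, 0, t}]];

(* flux of causal edges through the "spacelike" slice between steps t and t + 1 *)
flux[t_] := Count[causalEdges, e[t, _] -> e[t + 1, _]];
flux /@ Range[0, 4]
(* {2, 4, 6, 8, 10} *)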

OK, so why do we think these fluxes of edges correspond to energy and momentum? Imagine what happens if we change our foliation, say tipping it to correspond to motion at some velocity, as we did in the previous section. It takes a little bit of math, but what we find out is that our fluxes of causal edges transform with velocity basically just like we saw distance and time transform in the previous section.

In the standard derivation of relativistic mechanics, there’s a consistency argument that energy has to transform with velocity like time does, and momentum like distance. But now we actually have a structural reason for this to be the case. It’s a fundamental consequence of our whole setup, and of causal invariance. In traditional physics, one often says that position is the conjugate variable to momentum, and energy to time. And that’s something that’s burnt into the mathematical structure of the theory. But here it’s not something we’re burning in; it’s something we’re deriving from the underlying structure of our model.

And that means there’s ultimately a lot more we can say about it. For example, we might wonder what the “zero of energy” is. After all, if we look at one of our causal graphs, a lot of the causal edges are really just going into “maintaining the structure of space”. So if in a sense space is uniform, there’s inevitably a uniform “background flux” of causal edges associated with that. And whatever we consider to be “energy” corresponds to the fluctuations of that flux around its background value.

By the way, it’s worth mentioning what a “flux of causal edges” corresponds to. Each causal edge represents a causal connection between events, that is in a sense “carried” by some element in the underlying hypergraph (the “spatial hypergraph”). So a “flux of causal edges” is in effect the communication of activity (i.e. events), either in time (i.e. through spacelike hypersurfaces) or in space (i.e. through timelike hypersurfaces). And at least in some approximation we can then say that energy is associated with activity in the hypergraph that propagates information through time, while momentum is associated with activity that propagates information in space.

There’s a fundamental feature of our causal graphs that we haven’t mentioned yet—that’s related to information propagation. Start at any point (i.e. any event) in a causal graph. Then trace the causal connections from that event. You’ll get some kind of cone (here just in 2D):

CloudGet["https://wolfr.am/KVl97Tf4"];(*lorentz*)

foliationLines[{lineDensityHorizontal_ : 1, 
    lineDensityVertical_ : 1}, {tanHorizontal_ : 0.0, 
    tanVertical_ : 0.0}, offset : {_, _} : {0, 0}, 
   lineStyles : {_, _} : {Red, Red}, 
   transform_ : (# &)] := {If[lineDensityHorizontal != 0, 
    Style[Table[
      Line[transform /@ {{-100 + First@offset, 
          k - 100 tanHorizontal + Last@offset}, {100 + First@offset, 
          k + 100 tanHorizontal + Last@offset}}], {k, -100.5, 100.5, 
       1/lineDensityHorizontal}], First@lineStyles], {}], 
   If[lineDensityVertical != 0, 
    Style[Table[
      Line[transform /@ {{k - 100 tanVertical + First@offset, -100 + 
           Last@offset}, {k + 100 tanVertical + First@offset, 
          100 + Last@offset}}], {k, -100.5, 100.5, 
       1/lineDensityVertical}], Last@lineStyles], {}]};
squareCausalGraphPlot[
   layerCount_ : 9, {lineDensityHorizontal_ : 1, 
    lineDensityVertical_ : 1}, {tanHorizontal_ : 0.0, 
    tanVertical_ : 0.0}, offset : {_, _} : {0, 0}, 
   lineStyles : {_, _} : {Red, Red}, transform_ : (# &)] := 
  NeighborhoodGraph[
   DirectedGraph[
    Flatten[Table[{v[{i + 1, j}] -> v[{i, j}], 
       v[{i + 1, j + 1}] -> v[{i, j}]}, {i, layerCount - 1}, {j, 
       1 + Round[-layerCount/2 + i/2], (layerCount + i)/2}]], 
    VertexCoordinates -> 
     Catenate[
      Table[v[{i, j}] -> 
        transform[{2 (#2 - #1/2), #1} & @@ {i, j}], {i, 
        layerCount + 1}, {j, 
        1 + Round[-layerCount/2 + i/2] - 1, (layerCount + i)/2 + 1}]],
     VertexSize -> .33, 
    VertexStyle -> 
     Directive[Directive[Opacity[.7], Hue[0.14, 0.34, 1.]], 
      EdgeForm[Directive[Opacity[0.4], Hue[0.09, 1., 0.91]]]], 
    VertexShapeFunction -> "Rectangle", 
    Epilog -> 
     foliationLines[{lineDensityHorizontal, 
       lineDensityVertical}, {tanHorizontal, tanVertical}, offset, 
      lineStyles, transform]], v[{1, 1}], 9];
With[{graph = 
   squareCausalGraphPlot[
    10, {0, 0}, {0., 0.}, {-0.5, 0}, {Red, Directive[Dotted, Red]}, 
    lorentz[0.]]}, 
 Graph[graph, 
  VertexStyle -> {Directive[
     Directive[Opacity[.7], Hue[0.14, 0.34, 1.]], 
     EdgeForm[Directive[Opacity[0.4], Hue[0.09, 1., 0.91]]]], 
    Alternatives @@ VertexOutComponent[graph, v[{9, 5}]] -> 
     Directive[Directive[Opacity[.6], Hue[0, 0.45, 0.87]], EdgeForm[
Hue[0, 1, 0.48]]]}]]

The cone is more complicated in a more complicated causal graph. But you’ll always have something like it. And what it corresponds to physically is what’s normally called a light cone (or “forward light cone”). Assuming we’ve drawn our causal network so that events are somehow laid out in space across the page, then the light cone will show how information (as transmitted by light) can spread in space with time.

When the causal graph gets complicated, the whole setup with light cones gets complicated, as we’ll discuss for example in connection with black holes later. But for now, we can just say there are cones in our causal graph, and in effect the angle of these cones represents the maximum rate of information propagation in the system, which we can identify with the physical speed of light.
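In graph terms the light cone of an event is nothing more than its out-component in the causal graph; here is a minimal sketch, reusing the same kind of toy events e[t, x] as above:

(* toy causal graph: e[t, x] causally affects e[t + 1, x] and e[t + 1, x + 1] *)
g = Graph[Flatten[
    Table[{e[t, x] -> e[t + 1, x], e[t, x] -> e[t + 1, x + 1]}, {t, 0, 6}, {x, 0, t}]]];

(* the future light cone of the event e[2, 1]: everything it can causally affect *)
HighlightGraph[g, VertexOutComponent[g, e[2, 1]]]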

And in fact, not only can we identify light cones in our causal graph: in some sense we can think of our whole causal graph as just being a large number of “elementary light cones” all knitted together. And, as we mentioned, much of the structure that’s built necessarily goes into, in effect, “maintaining the structure of space”.

But let’s look more closely at our light cones. There are causal edges on their boundaries that in effect correspond to propagation at the speed of light—and that, in terms of the underlying hypergraph, correspond to events that “reach out” in the hypergraph, and “entrain” new elements as quickly as possible. But what about causal edges that are “more vertical”? These causal edges are associated with events that in a sense reuse elements in the hypergraph, without involving new ones.

And it looks like these causal edges have an important interpretation: they are associated with mass (or, more specifically, rest mass). OK, so the total flux of causal edges through spacelike hypersurfaces corresponds to energy. And now we’re saying that the flux of causal edges specifically in the timelike direction corresponds to rest mass. We can see what happens if we “tip our reference frames” just a bit, say corresponding to a velocity v ≪ c. Again, there’s a small amount of math, but it’s pretty easy to derive formulas for momentum (p) and energy (E). The speed of light c comes into the formulas because it defines the ratio of “horizontal” (i.e. spacelike) to “vertical” (i.e. timelike) distances on the causal graph. And for v small compared to c we get:

p ≈ m v  E ≈ m c^2 + ½ m v^2

So from these formulas we can see that just by thinking about causal graphs (and, yes, with a backdrop of causal invariance, and a whole host of detailed mathematical limit questions that we’re not discussing here), we’ve managed to derive a basic (and famous) fact about the relation between energy and mass:

E = m c^2

Sometimes in the standard formalism of physics, this relation by now seems more like a definition than something to derive. But in our model, it’s not just a definition, and in fact we can successfully derive it.
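
As a quick sanity check (not a derivation within the models themselves), here’s a minimal Wolfram Language sketch showing that the standard relativistic expressions for energy and momentum reduce to the formulas above when expanded for small v; the symbols m, v and c are left purely symbolic:

(* expand relativistic energy and momentum to second order in v *)
Series[m c^2/Sqrt[1 - v^2/c^2], {v, 0, 2}]
Series[m v/Sqrt[1 - v^2/c^2], {v, 0, 2}]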

General Relativity & Gravity

Earlier on, we talked about how curvature of space can arise in our models. But at that point we were just talking about “empty space”. Now we can go back and also talk about how curvature interacts with mass and energy in space.

In our earlier discussion, we talked about constructing spherical balls by starting at some point in the hypergraph, and then following all possible sequences of r connections. But now we can do something directly analogous in the causal graph: start at some point, and follow possible sequences of t connections. There’s quite a bit of mathematical trickiness, but essentially this gets us “volumes of light cones”.

If space is effectively d-dimensional, then to a first approximation this volume will grow like t^(d+1). But as in the spatial case, there’s a correction term, this time proportional to the so-called Ricci tensor R_μν. (The actual expression roughly involves a term R_μν t^μ t^ν, where the t^μ are timelike vectors, etc.)
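
Here’s a hedged sketch of what this kind of measurement looks like in practice: for one of the rules used earlier, we generate the causal graph and count how many events lie within t causal steps of the first event. How these counts grow with t is what probes the effective dimension (the specific rule and step counts here are just illustrative choices):

causal = ResourceFunction["WolframModel"][{{x, y}, {x, z}} -> {{y, z}, {y, w}, {z, w}, {x, w}},
   {{0, 0}, {0, 0}}, 10, "CausalGraph"];
(* number of events reachable within t causal steps of event 1: a discrete "light-cone volume" *)
Table[Length[VertexOutComponent[causal, 1, t]], {t, 0, 8}]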

OK, but we also know something else about what is supposed to be inside our light cones: not only are there “background connections” that maintain the structure of space, there are also “additional” causal edges that are associated with energy, momentum and mass. And in the limit of a large causal graph, we can identify the density of these with the so-called energy-momentum tensor T_μν. So in the end we have two contributions to the “volumes” of our light cones: one from “pure curvature” and one from energy-momentum.

Again, there’s some math involved. But the main thing is to think about the limit when we’re looking at a very large causal graph. What needs to be true for us to have d-dimensional space, as opposed to something much wilder? This puts a constraint on the growth rates of our light cone volumes, and when one works everything out, it implies that the following equation must hold:

R_μν − (1/2) R g_μν = (8 π G/c^4) T_μν

But this is exactly Einstein’s equation for the curvature of space when matter with a certain energy-momentum is present. We’re glossing over lots of details here. But it’s still, in my view, quite spectacular: from the basic structure of our very simple models, we’re able to derive a fundamental result in physics: the equation that for more than a hundred years has passed every test in describing the operation of gravity.

There’s a footnote here. The equation we’ve just given is without a so-called cosmological term. And how that works is bound up with the question of what the zero of energy is, which in our model relates to what features of the evolving hypergraph just have to do with the “maintenance of space”, and what have to do with “things in space” (like matter).

In existing physics, there’s an expectation that even in the “vacuum” there’s actually a formally infinite density of pairs of virtual particles associated with quantum mechanics. Essentially what’s happening is that there are always pairs of particles and antiparticles being created, that annihilate quickly, but that in aggregate contribute a huge effective energy density. We’ll discuss how this relates to quantum mechanics in our models later. But for now let’s just recall that particles (like electrons) in our models basically correspond to locally stable structures in the hypergraph.

But when we think about how “space is maintained” it’s basically through all sorts of seemingly random updating events in the hypergraph. But in existing physics (or, specifically, quantum field theory) we’re basically expected to analyze everything in terms of (virtual) particles. So if we try to do that with all these random updating events, it’s not surprising that we end up saying that there are these infinite collections of things going on. (Yes, this can be made much more precise; I’m just giving an outline here.)

But as soon as we say this, there is an immediate problem: we’re saying that there’s a formally infinite—or at least huge—energy density that must exist everywhere in the universe. But if we then apply Einstein’s equation, we’ll conclude that this must produce enough curvature to basically curl the universe up into a tiny ball.

One way to get out of this is to introduce a so-called cosmological term, that’s just an extra term in the Einstein equations, and then posit that this term is sized so as to exactly cancel (yes, to perhaps one part in 10^60 or more) the energy density from virtual particles. It’s certainly not a pretty solution.

But in our models, the situation is quite different. It’s not that we have virtual particles “in space”, that are having an effect on space. It’s that the same stuff that corresponds to the virtual particles is actually “making the space”, and maintaining its structure. Of course, there are lots of details about this—which no doubt depend on the particular underlying rule. But the point is that there’s no longer a huge mystery about why “vacuum energy” doesn’t basically destroy our universe: in effect, it’s because it’s what’s making our universe.

Black Holes, Singularities, etc.

One of the big predictions of general relativity is the existence of black holes. So how do things like that work in our models? Actually, it’s rather straightforward. The defining feature of a black hole is the existence of an event horizon: a boundary that light signals can’t cross, and where in effect causal connection is broken.

In our models, we can explicitly see that happen in the causal graph. Here’s an example:

ResourceFunction[
   "WolframModel"][{{0, 1}, {0, 2}, {0, 3}} -> {{1, 2}, {3, 2}, {3, 
     4}, {4, 3}, {4, 4}}, {{0, 0}, {0, 0}, {0, 0}}, 20, 
  "CausalGraph"] // LayeredGraphPlot

At the beginning, everything is causally connected. But at some point the causal graph splits—and there’s an event horizon. Events happening on one side can’t influence ones on the other, and so on. And that’s how a region of the universe can “causally break off” to form something like a black hole.

But actually, in our models, the “breaking off” can be even more extreme. Not only can the causal graph split; the spatial hypergraph can actually throw off disconnected pieces—each of which in effect forms a whole “separate universe”:

Framed[ResourceFunction["WolframModelPlot"][#, 
    ImageSize -> {UpTo[100], UpTo[60]}], FrameStyle -> LightGray] & /@ 
 ResourceFunction[
   "WolframModel"][{{1, 2, 3}, {4, 5, 3}} -> {{2, 6, 4}, {6, 1, 
     2}, {4, 2, 1}}, {{0, 0, 0}, {0, 0, 0}}, 20, "StatesList"]
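
One rough way to quantify the “breaking off” shown above is just to count the connected pieces of the spatial hypergraph at each step. Here’s a minimal sketch (flattening each hyperedge into ordinary edges purely for the purposes of the count):

states = ResourceFunction["WolframModel"][{{1, 2, 3}, {4, 5, 3}} -> {{2, 6, 4}, {6, 1, 2}, {4, 2, 1}},
   {{0, 0, 0}, {0, 0, 0}}, 20, "StatesList"];
(* flatten each hyperedge into a chain of ordinary edges, then count connected pieces *)
toGraph[state_] := Graph[UndirectedEdge @@@ Flatten[Partition[#, 2, 1] & /@ state, 1]];
Length[ConnectedGraphComponents[toGraph[#]]] & /@ states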

By the way, it’s interesting to look at what happens to the foliations observers make when there’s an event horizon. Causal invariance says that paths in the causal graph that diverge should always eventually merge. But if the paths go into different disconnected pieces of the causal graph, that can’t ever happen. So how does an observer deal with that? Well, basically they have to “freeze time”. They have to have a foliation where successive time slices just pile up, and never enter the disconnected pieces.

It’s just like what happens in general relativity. To an observer far from the black hole, it’ll seem to take an infinite time for anything to fall into the black hole. For now, this is just a phenomenon associated with the structure of space. But later we’ll see that it’s also the direct analog of something completely different: the process of measurement in quantum mechanics.

Coming back to gravity: we can ask questions not only about event horizons, but also about actual singularities in spacetime. In our models, these are places where lots of paths in a causal graph converge to a single point. And in our models, we can immediately study questions like whether there’s always an event horizon associated with any singularity (the “cosmic censorship hypothesis”).

We can ask about other strange phenomena from general relativity. For example, there are closed timelike curves, sometimes viewed as allowing time travel. In our models, closed timelike curves are inconsistent with causal invariance. But we can certainly invent rules that produce them. Here’s an example:

Graph[ResourceFunction["MultiwaySystem"][{"AB" -> "BAB", 
   "BA" -> "A"}, "ABA", 4, "StatesGraph"], 
 GraphLayout -> {"LayeredDigraphEmbedding", "RootVertex" -> "ABA"}]

We start from one “initial” state in this multiway system. But as we go forward we can enter a loop where we repeatedly visit the same state. And this loop also occurs in the causal graph. We think we’re “going forward in time”. But actually we’re just in a loop, repeatedly returning to the same state. And if we tried to make a foliation where we could describe time as always advancing, we just wouldn’t be able to do it.
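
One way to exhibit the loop explicitly is just to ask for a cycle in the states graph above; here’s a minimal sketch using the same toy rule:

(* find one directed cycle of states in the multiway states graph *)
FindCycle[ResourceFunction["MultiwaySystem"][{"AB" -> "BAB", "BA" -> "A"}, "ABA", 4,
  "StatesGraph"], Infinity, 1]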

Cosmology

In our model, the universe can start as a tiny hypergraph—perhaps a single self-loop. But then—as the rule gets applied—it progressively expands. With some particularly simple rules, the total size of the hypergraph has to just uniformly increase; with others it can fluctuate.
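
Here’s a minimal sketch of what this kind of growth looks like, using one of the rules from earlier (an illustrative choice, not a statement about the actual universe): we just plot the number of hyperedges in the spatial hypergraph at each step:

(* total number of hyperedges in the spatial hypergraph at each step *)
ListLinePlot[
 Length /@ ResourceFunction["WolframModel"][{{x, y}, {x, z}} -> {{y, z}, {y, w}, {z, w}, {x, w}},
   {{0, 0}, {0, 0}}, 12, "StatesList"],
 AxesLabel -> {"step", "hyperedges"}]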

But even if the size of the hypergraph is always increasing, that doesn’t mean we’d necessarily notice. It could be that essentially everything we can see just expands too—so in effect the granularity of space is just getting finer and finer. This would be an interesting resolution to the age-old debate about whether the universe is discrete or continuous. Yes, it’s structurally discrete, but the scale of discreteness relative to our scale is always getting smaller and smaller. And if this happens fast enough, we’d never be able to “see the discreteness”—because every time we tried to measure it, the universe would effectively have subdivided before we got the result. (Somehow it’d be like the ultimate calculus epsilon-delta proof: you challenge the universe with an epsilon, and before you can get the result, the universe has made a smaller delta.)

There are some other strange possibilities too. Like that the whole hypergraph for the universe is always expanding, but pieces are continually “breaking off”, effectively forming black holes of different sizes, and allowing the “main component” of the universe to vary in size.

But regardless of how this kind of expansion works in our universe today, it’s clear that if the universe started with a single self-loop, it had to do a lot of expanding, at least early on. And here there’s an interesting possibility that’s relevant for understanding cosmology.

Just because our current universe exhibits three-dimensional space, in our models there’s no reason to think that the early universe necessarily also did. There are very different things that can happen in our models:

ResourceFunction["WolframModel"][#1, #2, #3, 
   "FinalStatePlot"] & @@@ {{{{1, 2, 3}, {4, 5, 6}, {2, 6}} -> {{7, 
      7, 2}, {6, 2, 8}, {8, 5, 7}, {8, 9, 3}, {1, 6}, {10, 6}, {5, 
      3}, {7, 11}}, {{0, 0, 0}, {0, 0, 0}, {0, 0}}, 
   16}, {{{1, 2, 3}, {1, 4, 5}, {3, 6}} -> {{7, 8, 7}, {7, 5, 6}, {9, 
      5, 5}, {1, 7, 4}, {7, 5}, {5, 10}, {11, 6}, {6, 9}}, {{0, 0, 
     0}, {0, 0, 0}, {0, 0}}, 
   100}, {{{1, 2, 3}, {3, 4}} -> {{5, 5, 5}, {5, 6, 4}, {3, 1}, {1, 
      5}}, {{0, 0, 0}, {0, 0}}, 16}}

In the first example here, different parts of space effectively separate into non-communicating “black hole” tree branches. In the second example, we have something like ordinary—in this case 2-dimensional—space. But in the third example, space is in a sense very connected. If we work out the volume of a spherical ball, it won’t grow like r^d; it’ll grow exponentially with r (e.g. like 2^r).
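
Here’s a hedged sketch of how one might probe that difference: take the final hypergraph from the third rule above, flatten its hyperedges into ordinary edges, and count how many vertices lie within graph distance r of a starting vertex. Growth like r^d suggests finite-dimensional space; much faster growth suggests the “very connected”, effectively infinite-dimensional case (the step count is reduced here just to keep things small):

state = ResourceFunction["WolframModel"][{{1, 2, 3}, {3, 4}} -> {{5, 5, 5}, {5, 6, 4}, {3, 1}, {1, 5}},
   {{0, 0, 0}, {0, 0}}, 12, "FinalState"];
g = Graph[UndirectedEdge @@@ Flatten[Partition[#, 2, 1] & /@ state, 1]];
(* "ball volumes": number of vertices within graph distance r of a chosen vertex *)
Table[Length[VertexOutComponent[g, First[VertexList[g]], r]], {r, 0, 6}]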

If we look at the causal graph, we’ll see that you can effectively “go everywhere in space”, or affect every event, very quickly. It’d be as if the speed of light is infinite. But really it’s because space is effectively infinite dimensional.

In typical cosmology, it’s been quite mysterious how different parts of the early universe managed to “communicate” with each other, for example, to smooth out perturbations. But if the universe starts effectively infinite-dimensional, and only later “relaxes” to being finite-dimensional, that’s no longer a mystery.

So, OK, what might we see in the universe today that would reflect what happened extremely early in its history? The fact that our models deterministically generate behavior that seems for all practical purposes random means that we can expect that most features of the initial conditions or very early stages of the universe will quickly be “encrypted”, and effectively not reconstructable.

But it’s just conceivable that something like a breaking of symmetry associated with the first few hypergraphs might somehow survive. And that suggests the bizarre possibility that—just maybe—something like the angular structure of the cosmic microwave background or the very large-scale distribution of galaxies might reflect the discrete structure of the very early universe. Or, in other words, it’s just conceivable that what amounts to the rule for the universe is, in effect, painted across the whole sky. I think this is extremely unlikely, but it’d certainly be an amazing thing if the universe were “self-documenting” that way.

Elementary Particles—Old and New

We’ve talked several times about particles like electrons. In current physics theories, the various (truly) elementary particles—the quarks, the leptons (electron, muon, neutrinos, etc.), the gauge bosons, the Higgs—are all assumed to intrinsically be point particles, of zero size. In our models, that’s not how it works. The particles are all effectively “little lumps of space” that have various special properties.

My guess is that the precise list of what particles exist will be something that’s specific to a particular underlying rule. In cellular automata, for example, we’re used to seeing complicated sets of possible localized structures arise:

SeedRandom[2525]; ArrayPlot[
 CellularAutomaton[110, RandomInteger[1, 700], 500], 
 ImageSize -> Full, Frame -> None]

In our hypergraphs, the picture will inevitably be somewhat different. The “core feature” of each particle will be some kind of locally stable structure in the hypergraph (a simple analogy might be that it’s a lump of nonplanarity in an otherwise planar graph). But then there’ll be lots of causal edges associated with the particle, defining its particular energy and momentum.

Still, the “core feature” of the particles will presumably define things like their charge, quantum numbers, and perhaps spin—and the fact that these things are observed to occur in discrete units may reflect the fact that it’s a small piece of hypergraph that’s involved in defining them.

It’s not easy to know what the actual scale of discreteness in space might be in our models. But a possible (though potentially unreliable) estimate might be that the “elementary length” is around 10^-93 meters. (Note that that’s very small compared to the Planck length ~10^-35 meters that arises essentially from dimensional analysis.) And with this elementary length, the radius of the electron might be 10^-81 meters. Tiny, but not zero. (Note that current experiments only tell us that the size of the electron is less than about 10^-22 meters.)

One feature of our models is that there should be a “quantum of mass”—a discrete amount that all masses, for example of particles, are multiples of. With our estimate for the elementary length, this quantum of mass would be small, perhaps 10^-30 eV—or some 10^36 times smaller than the mass of the electron.

And this raises an intriguing possibility. Perhaps the particles—like electrons—that we currently know about are the “big ones”. (With our estimates, an electron would contain an enormous number of hypergraph elements.) And maybe there are some much smaller, and much lighter ones. At least relative to the particles we currently know, such particles would have few hypergraph elements in them—so I’m referring to them as “oligons” (after the Greek word ὀλιγος for “few”).

What properties would these oligons have? They’d probably interact very very weakly with other things in the universe. Most likely lots of oligons would have been produced in the very early universe, but with their very weak interactions, they’d soon “drop out of thermal equilibrium”, and be left in large numbers as relics—with energies that become progressively lower as the universe expands around them.

So where might oligons be now? Even though their other interactions would likely be exceptionally weak, they’d still be subject to gravity. And if their energies end up being low enough, they’d basically collect in gravity wells around the universe—which means in and around galaxies.

And that’s interesting—because right now there’s quite a mystery about the amount of mass seen in galaxies. There appears to be a lot of “dark matter” that we can’t see but that has gravitational effects. Well, maybe it’s oligons. Maybe even lots of different kinds of oligons: a whole shadow physics of much lighter particles.

The Inevitability of Quantum Mechanics

“But how will you ever get quantum mechanics?”, physicists would always ask me when I would describe earlier versions of my models. In many ways, quantum mechanics is the pinnacle of existing physics. It’s always had a certain “you-are-not-expected-to-understand-this” air, though, coupled with “just-trust-the-mathematical-formalism”. And, yes, the mathematical formalism has worked well—really well—in letting us calculate things. (And it almost seems more satisfying because the calculations are often so hard; indeed, hard enough that they’re what first made me start using computers to do mathematics 45 years ago.)

Our usual impression of the world is that definite things happen. And before quantum mechanics, classical physics typically captured this in laws—usually equations—that would tell one what specifically a system would do. But in quantum mechanics the formalism involves any particular system doing lots of different things “in parallel”, with us just seeing samples—ultimately with certain probabilities—of these possibilities.

And as soon as one hears of a model in which there are definite rules, one might assume that it could never reproduce quantum mechanics. But, actually, in our models, quantum mechanics is not just possible; it’s absolutely inevitable. And, as we’ll see, in something I consider quite beautiful, the core of what leads to it turns out to be the same as what leads to relativity.

OK, so how does this work? Let’s go back to what we discussed when we first started talking about time. In our models there’s a definite rule for updates to make in our hypergraphs, say:

RulePlot[ResourceFunction[
   "WolframModel"][{{x, y}, {x, z}} -> {{y, z}, {y, w}, {z, w}, {x, 
     w}}], VertexLabels -> Automatic, "RulePartsAspectRatio" -> 0.6]

But if we’ve got a hypergraph like this:

ResourceFunction[
  "WolframModel"][{{x, y}, {x, z}} -> {{y, z}, {y, w}, {z, w}, {x, 
    w}}, {{0, 0}, {0, 0}}, 6, "FinalStatePlot"]

there will usually be many places where this rule can be applied. So which update should we do first? The model doesn’t tell us. But let’s just imagine all the possibilities. The rule tells us what they all are—and we can represent them (as we discussed above) as a multiway system—here illustrated using the simpler case of strings rather than hypergraphs:

ResourceFunction["MultiwaySystem"][{"A" -> "AB", 
  "B" -> "A"}, {"A"}, 6, "StatesGraph"]

Each node in this graph now represents a complete state of our system (a hypergraph in our actual models). And each node is joined by arrows to the state or states that one gets by applying a single update to it.

If our model had been operating “like classical physics” we would expect it to progress in time from one state to another, say like this:

ResourceFunction["GenerationalMultiwaySystem"][{"A" -> "AB", 
  "B" -> "A"}, {"A"}, 5, "StatesGraph"]

But the crucial point is that the structure of our models leaves us no choice but to consider multiway systems. The form of the whole multiway system is completely determined by the rules. But—in a way that is already quite reminiscent of the standard formalism of quantum mechanics—the multiway system defines many different possible paths of history.

But now there is a mystery. If there are always all these different possible paths of history, how is it that we ever think that definite things happen in the world? This has been a core mystery of quantum mechanics for a century. It turns out that if one’s just using quantum mechanics to do calculations, the answer basically doesn’t matter. But if one wants to “really understand what’s going on” in quantum mechanics, it’s something that definitely does matter.

And the exciting thing is that in our models, there’s an obvious resolution. And actually it’s based on the exact same phenomenon—causal invariance—that gives us relativity.

Here’s roughly how this works. The key point is to think about what an observer who is themselves part of the multiway system will conclude about the world. Yes, there are different possible paths of history. But—just as in our discussion of relativity—the only aspect of them that an observer will ever be aware of is the causal relationships between the events they involve. But the point is that—even though when looked at from “outside” the paths are different—causal invariance implies that the network of relationships between causal events (which is all that’s relevant when one’s inside the system) will always be exactly the same.

In other words—much as in the case of relativity—even though from outside the system there may seem to be many possible “threads of time”, from inside the system causal invariance implies that there’s in a sense ultimately just one thread of time, or, in effect, one objective reality.

How does this all relate to the detailed standard formalism of quantum mechanics? It’s a little complicated. But let me make at least a few comments here. (There’s some more detail in my technical document; Jonathan Gorard has given even more.)

The states in the multiway system can be thought of as possible states of the quantum system. But how do we characterize how observers experience them? In particular, which states is the observer aware of when? Just like in the relativity case, the observer can in a sense make a choice of how they define time. One possibility might be through a foliation of the multiway system like this:

Graph[ResourceFunction["MultiwaySystem"][{"A" -> "AB", 
   "B" -> "A"}, {"A"}, 6, "StatesGraph"], AspectRatio -> 1/2, 
 Epilog -> {ResourceFunction["WolframPhysicsProjectStyleData"][
    "BranchialGraph", "EdgeStyle"], AbsoluteThickness[1.5], 
   Table[Line[{{-8, i}, {10, i}}], {i, 1/2, 6 + 1/2}]}]

In the formalism of quantum mechanics, one can then say that at each time, the observer experiences a superposition of possible states of the system. But now there’s a critical point. In direct analogy to the case of relativity, there are many different possible choices the observer can make about how to define time—and each of them corresponds to a different foliation of the multiway graph.

Again by analogy to relativity, we can then think of these choices as what we can call different “quantum observation frames”. Causal invariance implies that as long as they respect the causal relationships in the graph, these frames can basically be set up in any way we want. In talking about relativity, it was useful to just have “tipped parallel lines” (“inertial frames”) representing observers who are moving uniformly in space.

In talking about quantum mechanics, other frames are useful. In particular, in the standard formalism of quantum mechanics, it’s common to talk about “quantum measurement”: essentially the act of taking a quantum system and determining some definite (essentially classical) outcome from it. Well, in our setup, a quantum measurement basically corresponds to a particular quantum observation frame.

Here’s an example:

(*https://www.wolframcloud.com/obj/wolframphysics/TechPaper-Programs/\
Section-08/QM-foliations-01.wl*)

CloudGet["https://wolfr.am/LbdPPaXZ"]; Magnify[
  With[{graph = 
     Graph[ResourceFunction["MultiwaySystem"][{"A" -> "AB"}, {"AA"}, 
       7, "StatesGraph"], 
      VertexShapeFunction -> {Alternatives @@ 
          VertexList[
           ResourceFunction[
             "GenerationalMultiwaySystem"][{"A" -> "AB"}, {"AA"}, 5, 
            "StatesGraph"]] -> (Text[
            Framed[Style[stripMetadata[#2] , Hue[0, 1, 0.48]], 
             Background -> 
              Directive[Opacity[.6], Hue[0, 0.45, 0.87]],	
             FrameMargins -> {{2, 2}, {0, 0}}, RoundingRadius -> 0, 
             FrameStyle -> 
              Directive[Opacity[0.5], 
               Hue[0, 0.52, 0.8200000000000001]]], #1, {0, 0}] &)}, 
      VertexCoordinates -> (Thread[
           VertexList[#] -> GraphEmbedding[#, Automatic, 2]] &[
         ResourceFunction["MultiwaySystem"][{"A" -> "AB"}, {"AA"}, 8, 
          "StatesGraph"]])]}, 
   Show[graph, 
    foliationGraphics[graph, #, {0.1, 0.05}, 
       Directive[Hue[0.89, 0.97, 0.71], 
        AbsoluteThickness[1.5]]] & /@ {{{"AA"}}, {{
       "AA", "AAB", "ABA"}}, {{
       "AA", "AAB", "ABA", "AABB", "ABAB", "ABBA"}}, {{
       "AA", "AAB", "ABA", "AABB", "ABAB", "ABBA", "AABBB", "ABABB", 
        "ABBAB", "ABBBA"}}, {{
       "AA", "AAB", "ABA", "AABB", "ABAB", "ABBA", "AABBB", "ABABB", 
        "ABBAB", "ABBBA", "AABBBB", "ABABBB", "ABBABB", "ABBBAB", 
        "ABBBBA"}, {
       "AA", "AAB", "ABA", "AABB", "ABAB", "ABBA", "AABBB", "ABABB", 
        "ABBAB", "ABBBA", "AABBBB", "ABABBB", "ABBABB", "ABBBAB", 
        "ABBBBA", "AABBBBB", "ABABBBB", "ABBBBAB", "ABBBBBA"}, {
       "AA", "AAB", "ABA", "AABB", "ABAB", "ABBA", "AABBB", "ABABB", 
        "ABBAB", "ABBBA", "AABBBB", "ABABBB", "ABBABB", "ABBBAB", 
        "ABBBBA", "AABBBBB", "ABABBBB", "ABBBBAB", "ABBBBBA", 
        "AABBBBBB", "ABABBBBB", "ABBBBBAB", "ABBBBBBA"}, {
       "AA", "AAB", "ABA", "AABB", "ABAB", "ABBA", "AABBB", "ABABB", 
        "ABBAB", "ABBBA", "AABBBB", "ABABBB", "ABBABB", "ABBBAB", 
        "ABBBBA", "AABBBBB", "ABABBBB", "ABBBBAB", "ABBBBBA", 
        "AABBBBBB", "ABABBBBB", "ABBBBBAB", "ABBBBBBA", "AABBBBBBB", 
        "ABABBBBBB", "ABBBBBBAB", "ABBBBBBBA"}}}]], 0.9]

The successive pink lines effectively mark off what the observer is considering to be successive moments in time. So when all the lines bunch up below the state ABBABB what it means is that the observer is effectively choosing to “freeze time” for that state. In other words, the observer is saying “that’s the state I consider the system to be in, and I’m sticking to it”. Or, put another way, even though in the full multiway graph there’s all sorts of other “quantum mechanical” evolution of states going on, the observer has set up their quantum observation frame so that they pick out just a particular, definite, classical-like outcome.

OK, but can they consistently do that? Well, that depends on the actual underlying structure of the multiway graph, which ultimately depends on the actual underlying rule. In the example above, we’ve set up a foliation (i.e. a quantum observation frame) that does the best possible job in this rule at “freezing time” for the ABBABB state. But just how long can this “reality distortion field” be maintained?

The only way to keep the foliation consistent in the multiway graph above is to have it progressively expand over time. In other words, to keep time frozen, more and more quantum states have to be pulled into the “reality distortion field”, and so there’s less and less coherence in the system.

The picture above is for a very trivial rule. Here’s a corresponding picture for a slightly more realistic case:

(*https://www.wolframcloud.com/obj/wolframphysics/TechPaper-Programs/\
Section-08/QM-foliations-01.wl*)
CloudGet["https://wolfr.am/LbdPPaXZ"];
Show[drawFoliation[
  Graph[ResourceFunction["MultiwaySystem"][{"A" -> "AB", 
     "B" -> "A"}, {"A"}, 6, "StatesGraph"], 
   VertexShapeFunction -> {Alternatives @@ 
       VertexList[
        ResourceFunction["GenerationalMultiwaySystem"][{"A" -> "AB", 
          "B" -> "A"}, {"A"}, 5, "StatesGraph"]] -> (Text[
         Framed[Style[stripMetadata[#2] , Hue[0, 1, 0.48]], 
          Background -> Directive[Opacity[.2], Hue[0, 0.45, 0.87]],	
          FrameMargins -> {{2, 2}, {0, 0}}, RoundingRadius -> 0, 
          FrameStyle -> 
           Directive[Opacity[0.5], 
            Hue[0, 0.52, 0.8200000000000001]]], #1, {0, 
          0}] &)}], {{"A", "AB", "AA", "ABB", "ABA"}, {"A", "AB", 
    "AA", "ABB", "ABA", "AAB", "ABBB"}, {"A", "AB", "AA", "ABB", 
    "ABA", "AAB", "ABBB", "AABB", "ABBBB"}}, {0.1, 0}, 
  Directive[Hue[0.89, 0.97, 0.71], AbsoluteThickness[1.5]]], 
 Graphics[{Directive[Hue[0.89, 0.97, 0.71], AbsoluteThickness[1.5]], 
   AbsoluteThickness[1.6`], 
   Line[{{-3.35, 4.05}, {-1.85, 3.3}, {-0.93, 2.35}, {-0.93, 
      1.32}, {0.23, 1.32}, {0.23, 2.32}, {2.05, 2.32}, {2.05, 
      1.51}, {1.15, 1.41}, {1.15, 0.5}, {2.15, 0.5}, {2.25, 
      1.3}, {4.3, 1.3}, {4.6, 0.5}, {8.6, 0.5}}]}]]

And what we see here is that—even in this still incredibly simplified case—the structure of the multiway system will force the observer to construct a more and more elaborate foliation if they are to successfully freeze time. Measurement in quantum mechanics has always involved a slightly uncomfortable mathematical idealization—and this now gives us a sense of what’s really going on. (The situation is ultimately very similar to the problem of decoding “encrypted” thermodynamic initial conditions that I mentioned above.)

Quantum measurement is really about what an observer perceives. But if you are for example trying to construct a quantum computer, it’s not just a question of having a qubit be perceived as being maintained in a particular state; it actually has to be maintained in that state. And for this to be the case we actually have to freeze time for that qubit. But here’s a very simplified example of how that can happen in a multiway graph:

(*https://www.wolframcloud.com/obj/wolframphysics/TechPaper-Programs/\
Section-08/QM-foliations-01.wl*)
\
CloudGet["https://wolfr.am/LbdPPaXZ"]; Magnify[
 Show[With[{graph = 
     Graph[ResourceFunction["MultiwaySystem"][{"A" -> "AB", 
        "XABABX" -> "XXXX"}, {"XAAX"}, 6, "StatesGraph"], 
      VertexCoordinates -> 
       Append[(Thread[
            VertexList[#] -> GraphEmbedding[#, Automatic, 2]] &[
          ResourceFunction["MultiwaySystem"][{"A" -> "AB", 
            "XABABX" -> "XXXX"}, {"XAAX"}, 8, "StatesGraph"]]), 
        "XXXX" -> {0, 5.5}]]}, 
   Show[graph, 
    foliationGraphics[graph, #, {0.1, 0.05}, 
       Directive[Hue[0.89, 0.97, 0.71], AbsoluteThickness[1.5]]] & /@ {
Sequence[{{"XAAX"}}, {{"XAAX", "XAABX", "XABAX"}}, {{
        "XAAX", "XAABX", "XABAX", "XAABBX", "XABABX", "XABBAX"}}, {{
        "XAAX", "XAABX", "XABAX", "XAABBX", "XABABX", "XABBAX", 
         "XAABBBX", "XABABBX", "XABBABX", "XABBBAX"}}, {{
        "XAAX", "XAABX", "XABAX", "XAABBX", "XABABX", "XABBAX", 
         "XAABBBX", "XABABBX", "XABBABX", "XABBBAX", "XAABBBBX", 
         "XABABBBX", "XABBABBX", "XABBBABX", "XABBBBAX"}, {
        "XAAX", "XAABX", "XABAX", "XAABBX", "XABABX", "XABBAX", 
         "XAABBBX", "XABABBX", "XABBABX", "XABBBAX", "XAABBBBX", 
         "XABABBBX", "XABBABBX", "XABBBABX", "XABBBBAX", "XAABBBBBX", 
         "XABABBBBX", "XABBBBABX", "XABBBBBAX", "XABBABBBX", 
         "XABBBABBX"}}, {}, {}]}]]], .6]

All this discussion of “freezing time” might seem weird, and not like anything one usually talks about in physics. But actually, there’s a wonderful connection: the freezing of time we’re talking about here can be thought of as happening because we’ve got the analog in the space of quantum states of a black hole in physical space.

The picture above makes it plausible that we’ve got something where things can go in, but if they do, they always get stuck. But there’s more to it. If you’re an observer far from a black hole, then you’ll never actually see anything fall into the black hole in finite time (that’s why black holes are called “frozen stars” in Russian). And the reason for this is precisely because (according to the mathematics) time is frozen at the event horizon of the black hole. In other words, to successfully make a qubit, you effectively have to isolate it in quantum space like things get isolated in physical space by the presence of the event horizon of a black hole.

General Relativity and Quantum Mechanics Are the Same Idea!

General relativity and quantum mechanics are the two great foundational theories of current physics. And in the past it’s often been a struggle to reconcile them. But one of the beautiful outcomes of our project so far has been the realization that at some deep level general relativity and quantum mechanics are actually the same idea. It’s something that (at least so far) is only clear in the context of our models. But the basic point is that both theories are consequences of causal invariance—just applied in different situations.

Recall our discussion of causal graphs in the context of relativity above. We drew foliations and said that if we looked at a particular slice, it would tell us the arrangement of the system in space at what we consider to be a particular time. So now let’s look at multiway graphs. We saw in the previous section that in quantum mechanics we’re interested in foliations of these. But if we look at a particular slice in one of these foliations, what does it represent? The foliation has got a bunch of states in it. And it turns out that we can think of them as being laid out in an abstract kind of space that we’re calling “branchial space”.

To make sense of this space, we have to have a way to say what’s near what. But actually the multiway graph gives us that. Take a look at this multiway graph:

foliationLines[{lineDensityHorizontal_ : 1, 
    lineDensityVertical_ : 1}, {tanHorizontal_ : 0.0, 
    tanVertical_ : 0.0}, offset : {_, _} : {0, 0}, 
   lineStyles : {_, _} : {Red, Red}, 
   transform_ : (# &)] := {If[lineDensityHorizontal != 0, 
    Style[Table[
      Line[transform /@ {{-100 + First@offset, 
          k - 100 tanHorizontal + Last@offset}, {100 + First@offset, 
          k + 100 tanHorizontal + Last@offset}}], {k, -100.5, 100.5, 
       1/lineDensityHorizontal}], First@lineStyles], {}], 
   If[lineDensityVertical != 0, 
    Style[Table[
      Line[transform /@ {{k - 100 tanVertical + First@offset, -100 + 
           Last@offset}, {k + 100 tanVertical + First@offset, 
          100 + Last@offset}}], {k, -100.5, 100.5, 
       1/lineDensityVertical}], Last@lineStyles], {}]};
LayeredGraphPlot[
 ResourceFunction["MultiwaySystem"][{"A" -> "AB", "B" -> "A"}, "A", 5,
   "EvolutionGraph"], 
 Epilog -> 
  foliationLines[{1, 0}, {0, 0}, {0, 
    0}, {ResourceFunction["WolframPhysicsProjectStyleData"][
     "BranchialGraph", "EdgeStyle"], 
    ResourceFunction["WolframPhysicsProjectStyleData"][
     "BranchialGraph", "EdgeStyle"]}]]

At each slice in the foliation, let’s draw a graph where we connect two states whenever they’re both part of the same “branch pair”, so that—like AA and ABB here—they both come from the same state on the slice before. Here are the graphs we get by doing this for successive slices:

Table[ResourceFunction["MultiwaySystem"][{"A" -> "AB", "B" -> "A"}, 
  "A", t, If[t <= 5, "BranchialGraph", 
   "BranchialGraphStructure"]], {t, 2, 8}]

We call these branchial graphs. And we can think of them as representing the correlation—or entanglement—of quantum states. Two states that are nearby in the graph are highly entangled; those further away, less so. And we can imagine that as our system evolves, we’ll get larger and larger branchial graphs, until eventually, just like for our original hypergraphs, we can think of these graphs as limiting to something like a continuous space.
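
As a very crude sketch of using branchial distance as an “entanglement distance”, one can just compute graph distances in one of the branchial graphs above (which pair of states gets picked here is arbitrary):

bg = ResourceFunction["MultiwaySystem"][{"A" -> "AB", "B" -> "A"}, "A", 6, "BranchialGraph"];
states = VertexList[bg];
(* graph distance between two particular states in the branchial graph *)
GraphDistance[bg, First[states], Last[states]]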

But what is this space like? For our original hypergraphs, we imagined that we’d get something like ordinary physical space (say close to three-dimensional Euclidean space). But branchial space is something more abstract—and much wilder. And typically it won’t even be finite-dimensional. (It might approximate a projective Hilbert space.) But we can still think of it mathematically as some kind of space.

OK, things are getting fairly complicated here. But let me try to give at least a flavor of how things work. Here’s an example of a wonderful correspondence: curvature in physical space is like the uncertainty principle of quantum mechanics. Why do these have anything to do with each other?

The uncertainty principle says that if you measure, say, the position of something, then its momentum, you’ll get a different answer than if you do it in the opposite order. But now think about what happens when you try to make a rectangle in physical space by going in direction x first, then y, and then you do these in the opposite order. In a flat space, you’ll get to the same place. But in a curved space, you won’t:

parallelTransportOnASphere[size_] := 
  Module[{\[Phi], \[Theta]}, 
   With[{spherePoint = {Cos[\[Phi]] Sin[\[Theta]], 
       Sin[\[Phi]] Sin[\[Theta]], Cos[\[Theta]]}}, 
    Graphics3D[{{Lighter[Yellow, .2], Sphere[]}, 
      First@ParametricPlot3D[
        spherePoint /. \[Phi] -> 0, {\[Theta], \[Pi]/2, \[Pi]/2 - 
          size}, PlotStyle -> Darker@Red], 
      Rotate[First@
        ParametricPlot3D[
         spherePoint /. \[Phi] -> 0, {\[Theta], \[Pi]/2, \[Pi]/2 - 
           size}, PlotStyle -> Darker@Red], \[Pi]/2, {-1, 0, 0}], 
      Rotate[First@
        ParametricPlot3D[
         spherePoint /. \[Phi] -> 0, {\[Theta], \[Pi]/2, \[Pi]/2 - 
           size}, PlotStyle -> Darker@Red], size, {0, 0, 1}], 
      Rotate[Rotate[
        First@ParametricPlot3D[
          spherePoint /. \[Phi] -> 0, {\[Theta], \[Pi]/2, \[Pi]/2 - 
            size}, PlotStyle -> Darker@Red], \[Pi]/2, {-1, 0, 0}], 
       size, {0, -1, 0}]}, Boxed -> False, SphericalRegion -> False, 
     Method -> {"ShrinkWrap" -> True}, ViewPoint -> {2, size, size}]]];
parallelTransportOnASphere[0 | 0.] := 
  parallelTransportOnASphere[1.*^-10];
parallelTransportOnASphere[0.7]

And essentially what’s happening in the uncertainty principle is that you’re doing exactly this, but in branchial space, rather than physical space. And it’s because branchial space is wild—and effectively very curved—that you get the uncertainty principle.
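
To make the “order matters” point concrete on the quantum side, here’s a minimal, purely illustrative example: two small matrices standing in for “position” and a crude discretized “momentum” don’t commute, which is the algebraic statement behind the uncertainty principle. (These particular matrices are just stand-ins, not anything derived from our models.)

x = DiagonalMatrix[{0, 1, 2}];
p = {{0, 1, 0}, {-1, 0, 1}, {0, -1, 0}}/2;  (* a central-difference stand-in for a momentum operator *)
x . p - p . x  (* nonzero: applying the operations in different orders gives different results *)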

Alright, so the next question might be: what’s the analog of the Einstein equations in branchial space? And again, it’s quite wonderful: at least in some sense, the answer is that it’s the path integral—the fundamental mathematical construct of modern quantum mechanics and quantum field theory.

This is again somewhat complicated. But let me try to give a flavor of it. Just as we discussed geodesics as describing paths traversed through physical space in the course of time, so also we can discuss geodesics as describing paths traversed through branchial space in the course of time. In both cases these geodesics are determined by curvature in the corresponding space. In the case of physical space, we argued (roughly) that the presence of excess causal edges—corresponding to energy—would lead to what amounts to curvature in the spatial hypergraph, as described by Einstein’s equations.

OK, so what about branchial space? Just like for the spatial hypergraph, we can think about the causal connections between the updating events that define the branchial graph. And we can once again imagine identifying the flux of causal edges—now not through spacelike hypersurfaces, but through branchlike ones—as corresponding to energy. And—much like in the spatial hypergraph case—an excess of these causal edges will have the effect of producing what amounts to curvature in branchial space (or, more strictly, in branchtime—the analog of spacetime). But this curvature will then affect the geodesics that traverse branchial space.

In general relativity, the presence of mass (or energy) causes curvature in space which causes the paths of geodesics to turn—which is what is normally interpreted as the action of the force of gravity. But now we have an analog in quantum mechanics, in our branchial space. The presence of energy effectively causes curvature in branchial space which causes the paths of geodesics through branchial space to turn.

What does turning correspond to? Basically it’s exactly what the path integral talks about. The path integral (and the usual formalism of quantum mechanics) is set up in terms of complex numbers. But it can just as well be thought of in terms of turning through an angle. And that’s exactly what’s happening with our geodesics in branchial space. In the path integral there’s a quantity called the action—which is a kind of relativistic analog of energy—and when one works things out more carefully, our fluxes of causal edges correspond to the action, but are also exactly what determine the rate of turning of geodesics.

It all fits together beautifully. In physical space we have Einstein’s equations—the core of general relativity. And in branchial space (or, more accurately, multiway space) we have Feynman’s path integral—the core of modern quantum mechanics. And in the context of our models they’re just different facets of the same idea. It’s an amazing unification that I have to say I didn’t see coming; it’s something that just emerged as an inevitable consequence of our simple models of applying rules to collections of relations, or hypergraphs.

Branchial Motion and the Entanglement Horizon

We can think of motion in physical space as like the process of exploring new elements in the spatial hypergraph, and potentially becoming affected by them. But now that we’re talking about branchial space, it’s natural to ask whether there’s something like motion there too. And the answer is that there is. And it’s basically exactly the same kind of thing: but instead of exploring new elements in the spatial hypergraph, we’re exploring new elements in the branchial graph, and potentially becoming affected by them.

There’s a way of talking about it in the standard language of quantum mechanics: as we move in branchial space, we’re effectively getting “entangled” with more and more quantum states.

OK, so let’s take the analogy further. In physical space, there’s a maximum speed of motion—the speed of light, c. So what about in branchial space? Well, in our models we can see that there’s also got to be a maximum speed of motion in branchial space. Or, in other words, there’s a maximum rate at which we can entangle with new quantum states.

In physical space we talk about light cones as being the regions that can be causally affected by some event at a particular location in space. In the same way, we can talk about entanglement cones that define regions in branchial space that can be affected by events at some position in branchial space. And just as there’s a causal graph that effectively knits together elementary light cones, there’s something similar that knits together entanglement cones.

That something similar is the multiway causal graph: a graph that represents causal relationships between all events that can happen anywhere in a multiway system. Here’s an example of a multiway causal graph for just a few steps of a very simple string substitution system—and it’s already pretty complicated:

LayeredGraphPlot[
 Graph[ResourceFunction["MultiwaySystem"][
   "WolframModel" ->  { {{x, y}, {x, z}} -> {{y, w}, {y, z}, {w, 
        x}}}, {{{0, 0}, {0, 0}}}, 6, "CausalGraphStructure"]]]

But in a sense the multiway causal graph is the most complete description of everything that can affect the experience of observers. Some of the causal relationships it describes represent spacelike connections; some represent branchlike connections. But all of them are there. And so in a sense the multiway causal graph is where relativity and quantum mechanics come together. Slice one way and you’ll see relationships in physical space; slice another way and you’ll see relationships in branchial space, between quantum states.

To help see how this works here’s a very toy version of a multiway causal graph:

Graph3D[ResourceFunction["GeneralizedGridGraph"][{4 -> "Directed", 4, 
   4}, EdgeStyle -> {Darker[Blue], Darker[Blue], Purple}]]

Each point is an event that happens in some hypergraph on some branch of a multiway system. And now the graph records the causal relationship of that event to other ones. In this toy example, there are purely timelike relationships—indicated by arrows pointing down—in which basically some element of the hypergraph is affecting its future self. But then there are both spacelike and branchlike relationships, where the event affects elements that are either “spatially” separated in the hypergraph, or “branchially” separated in the multiway system.

But in all this complexity, there’s something wonderful that happens. As soon as the underlying rule has causal invariance, this implies all sorts of regularities in the multiway causal graph. And for example it tells us that all those causal graphs we get by taking different branchtime slices are actually the same when we project them into spacetime—and this is what leads to relativity.

But causal invariance has other consequences too. One of them is that there should be an analog of special relativity that applies not in spacetime but in branchtime. The reference frames of special relativity are now our quantum observation frames. And the analog of speed in physical space is the rate of entangling new quantum states.

So what about a phenomenon like relativistic time dilation? Is there an analog of that for motion in branchial space? Well, actually, yes there is. And it turns out to be what’s sometimes called the quantum Zeno effect: if you repeatedly measure a quantum system fast enough it won’t change. It’s a phenomenon that’s implied by the add-ons to the standard formalism of quantum mechanics that describe measurement. But in our models it just comes directly from the analogy between branchial and physical space.

Doing new measurements is equivalent to getting entangled with new quantum states—or to moving in branchial space. And in direct analogy to what happens in special relativity, as you get closer to moving at the maximum speed you inevitably sample things more slowly in time—and so you get time dilation, which means that your “quantum evolution” slows down.

OK, so there are relativistic phenomena in physical space, and quantum analogs in branchial space. But in our models these are all effectively facets of one thing: the multiway causal graph. So are there situations in which the two kinds of phenomena can mix? Normally there aren’t: relativistic phenomena involve large physical scales; quantum phenomena tend to involve small ones.

But one example of an extreme situation where they can mix is black holes. I’ve mentioned several times that the formation of an event horizon around a black hole is associated with disconnection in the causal graph. But it’s more than that. It’s actually disconnection not only in the spacetime causal graph, but in the full multiway causal graph. And that means that there’s not only an ordinary causal event horizon—in physical space—but also an “entanglement horizon” in branchial space. And just as a piece of the spatial hypergraph can get disconnected when there’s a black hole, so can a piece of the branchial graph.

What does this mean? There are a variety of consequences. One of them is that quantum information can be trapped inside the entanglement horizon even when it hasn’t crossed the causal event horizon—so that in effect the black hole is freezing quantum information “at its surface” (at least its surface in branchial space). It’s a weird phenomenon implied by our models, but what’s perhaps particularly interesting about it is that it’s very much aligned with conclusions about black holes that have emerged in some of the latest work in physics on the so-called holographic principle in quantum field theory and general relativity.

Here’s another related, weird phenomenon. If you pass the causal event horizon of a black hole, it’s an inevitable fact that you’ll eventually get infinitely physically elongated (or “spaghettified”) by tidal forces. Well, something similar happens if you pass the entanglement horizon—except now you’ll get elongated in branchial space rather than physical space. And in our models, this eventually means you won’t be able to make a quantum measurement—so in a sense as an observer you won’t be able to “form a classical thought”, or, in other words, beyond the entanglement horizon you’ll never be able to “come to a definite conclusion” about, for example, whether something fell into the black hole or didn’t.

The speed of light c is a fundamental physical constant that relates distance in physical space to time. In our models, there’s now a new fundamental physical constant: the maximum entanglement speed, which relates distance in branchial space to time. I call this maximum entanglement speed ζ (zeta—which looks a bit like a “tangled c”). I’m not sure what its value is, but a possible estimate is that it corresponds to entangling about 10^102 new quantum states per second. And in a sense the fact that this is so big is why we’re normally able to “form classical thoughts”.

Because of the relation between (multiway) causal edges and energy, it’s possible to convert ζ to units of energy per second, and our estimate then implies that ζ is about 10^5 solar masses per second. It’s a big value, although conceivably not irrelevant to something like a merger of galactic black holes. (And, yes, this would mean that it would take maybe six months for an intelligence to “quantum grok” our galaxy.)
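
Here’s the rough unit conversion behind that statement, done as a hedged one-liner (the 10^5 solar masses per second figure is just the estimate quoted above, converted to power via E = m c^2):

(* convert a mass rate of 10^5 solar masses per second into energy per second *)
UnitConvert[
 Quantity[10.^5, "SolarMass"] Quantity[1, "SpeedOfLight"]^2/Quantity[1, "Seconds"], "Watts"]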

Finding the Ultimate Rule

I’m frankly amazed at how much we’ve been able to figure out just from the general structure of our models. But to get a final fundamental theory of physics we’ve still got to find a specific rule. A rule that gives us 3 (or so) dimensions of space, the particular expansion rate of the universe, the particular masses and properties of elementary particles, and so on. But how should we set about finding this rule?

And actually even before that, we need to ask: if we had the right rule, would we even know it? As I mentioned earlier, there’s potentially a big problem here with computational irreducibility. Because whatever the underlying rule is, our actual universe has applied it an almost unimaginably large number of times. And if there’s computational irreducibility—as there inevitably will be—then there won’t be a way to fundamentally reduce the amount of computational effort that’s needed to determine the outcome of all these rule applications.

But what we have to hope is that somehow—even though the complete evolution of the universe is computationally irreducible—there are still enough “tunnels of computational reducibility” that we’ll be able to figure out at least what’s needed to be able to compare with what we know in physics, without having to do all that computational work. And I have to say that our recent success in getting conclusions just from the general structure of our models makes me much more optimistic about this possibility.

But, OK, so what rules should we consider? The traditional approach in natural science (at least over the past few centuries) has tended to be: start from what you know about whatever system you’re studying, then try to “reverse engineer” what its rules are. But in our models there’s in a sense too much emergence for this to work. Look at something like this:

ResourceFunction[
  "WolframModel"][{{1, 2, 2}, {2, 3, 4}} -> {{4, 3, 3}, {4, 1, 
    5}, {2, 4, 5}}, {{0, 0, 0}, {0, 0, 0}}, 500, "FinalStatePlot"]

Given the overall form of this structure, would you ever figure that it could be produced just by the rule:

{{x, y, y}, {y, z, u}} -> {{u, z, z}, {u, x, v}, {y, u, v}}

RulePlot[ResourceFunction[
   "WolframModel"][{{x, y, y}, {y, z, u}} -> {{u, z, z}, {u, x, 
     v}, {y, u, v}}]]

Having myself explored the computational universe of simple programs for some forty years, I have to say that even now it’s amazing how often I’m humbled by the ability of extremely simple rules to give behavior I never expected. And this is particularly common with the very structureless models we’re using here. So in the end the only real way to find out what can happen in these models is just to enumerate possible rules, and then run them and see what they do.
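
Here’s a minimal sketch of that “just run them and see” workflow, using a few ad hoc rules with the same left-hand-side signature. (This is not the systematic enumeration used in the project, just an illustration of the loop of trying rules and looking at what they produce.)

(* run several simple rules from the same initial condition and look at the final states *)
ResourceFunction["WolframModel"][#, {{0, 0}, {0, 0}}, 10, "FinalStatePlot"] & /@ {
  {{x, y}, {x, z}} -> {{y, z}, {y, w}, {z, w}, {x, w}},
  {{x, y}, {x, z}} -> {{x, w}, {y, w}, {z, w}},
  {{x, y}, {x, z}} -> {{y, z}, {z, w}, {x, w}}}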

But now there’s a crucial question. If we just start enumerating very simple rules, how far are we going to have to go before we find our universe? Or, put another way, just how simple is the rule for our universe going to end up being?

It could have been that in a sense the rule for the universe would have a special case in it for every element of the universe—every particle, every position in space, etc. But the very fact that we’ve been able to find definite scientific laws—and that systematic physics has even been possible—suggests that the rule at least doesn’t have that level of complexity. But how simple might it be? We don’t know. And I have to say that I don’t think our recent discoveries shed any particular light on this—because they basically say that lots of things in physics are generic, and independent of the specifics of the underlying rule, however simple or complex it may be.

Why This Universe? The Relativity of Rules

But, OK, let’s say we find that our universe can be described by some particular rule. Then the obvious immediate question would be: why that rule, and not another? The history of science—certainly since Copernicus—has shown us over and over again evidence that we’re “not special”. But if the rule we find to describe our universe is simple, wouldn’t that simplicity be a sign of “specialness”?

I have long wondered about this. Could it for example be that the rule is only simple because of the way that we, as entities existing in our particular universe, choose to set up our ways of describing things? And that in some other universe, with some other rule, the entities that exist there would set up their ways of describing things so that the rule for their universe is simple to them, even though it might be very complex to us?

Or could it be that in some fundamental sense it doesn’t matter what the rules for the universe are: that to observers embedded in a universe, operating according to the same rules as that universe, the conclusions about how the universe works will always be the same?

Or could it be that this is a kind of question that’s just outside the realm of science?

To my considerable surprise, the paradigm that’s emerging from our recent discoveries potentially seems to suggest a definite—though at first seemingly bizarre—scientific answer.

In what we’ve discussed so far we’re imagining that there’s a particular, single rule for our universe, that gets applied over and over again, effectively in all possible ways. But what if there wasn’t just one rule that could be used? What if all conceivable rules could be used? What if every updating event could just use any possible rule? (Notice that in a finite universe, only finitely many rules can ever actually apply.)

At first it might not seem as if this setup would ever lead to anything definite. But imagine making a multiway graph of absolutely everything that can happen—including all events for all possible rules. This is a big, complicated object. But far from being structureless, it’s full of all kinds of structure.
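
In miniature, one can already get a feel for this by letting a multiway system apply several different rules at once, a tiny (and purely illustrative) stand-in for "all possible rules":

ResourceFunction["MultiwaySystem"][
  {"A" -> "AB", "A" -> "BA", "B" -> "A", "AB" -> "BAB"},
  {"A"}, 5, "StatesGraph"]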

And there’s one very important thing about it: it’s basically guaranteed to have causal invariance (basically because if there’s a rule that does something, there’s always another rule somewhere that can undo it).

So now we can make a rule-space multiway causal graph—which will show a rule-space analog of relativity. And what this means is that in the rule-space multiway graph, we can expect to make different foliations, but have them all give consistent results.

It’s a remarkable conceptual unification. We’ve got physical space, branchial space, and now also what we can call rulial space (or just rule space). And the same overall ideas and principles apply to all of them. And just as we defined reference frames in physical space and branchial space, so also we can define reference frames in rulial space.

But what kinds of reference frames might observers set up in rulial space? In a typical case we can think of different reference frames in rulial space as corresponding to different description languages in which an observer can describe their experience of the universe.

In the abstract, it’s a familiar idea that given any particular description language, we can always explicitly program any universal computer to translate it to another description language. But what we’re saying here is that in rulial space it just takes choosing a different reference frame to have our representation of the universe use a different description language.

And roughly the reason this works is that different foliations of rulial space correspond to different choices of sequences of rules in the rule-space multiway graph—which can in effect be set up to “compute” the output that would be obtained with any given description language. That this can work ultimately depends on the fact that sequences of our rules can support universal computation (which the Principle of Computational Equivalence implies they ubiquitously will)—which is in effect why it only takes “choosing a different reference frame in rule space” to “run a different program” and get a different description of the observed behavior of the universe.

It’s a strange but rather appealing picture. The universe is effectively using all possible rules. But as entities embedded in the universe, we’re picking a particular foliation (or sequence of reference frames) to make sense of what’s happening. And that choice of foliation corresponds to a description language which gives us our particular way of describing the universe.

But what is there to say definitely about the universe—independent of the foliation? There’s one immediate thing: that the universe, whatever foliation one uses to describe it, is just a universal computer, and nothing more. And that hypercomputation is never possible in the universe.

But given the structure of our models, there’s more. Just like there’s a maximum speed in physical space (the speed of light, c), and a maximum speed in branchial space (the maximum entanglement speed ζ), so also there must be a maximum speed in rulial space, which we can call ρ—that’s effectively another fundamental constant of nature. (The constancy of ρ is in effect a reflection of the Principle of Computational Equivalence.)

But what does moving in rulial space correspond to? Basically it’s a change of rule. And to say that this can only happen at a finite speed is to say that there’s computational irreducibility: that one rule cannot emulate another infinitely fast. And given this finite “speed of emulation” there are “emulation cones” that are the analog of light cones, and that define how far one can get in rulial space in a certain amount of time.

What are the units of ρ? Essentially they are program length divided by time. But whereas in the theory of computation one typically imagines that program length can be scaled almost arbitrarily by different models of computation, here this is a measure of program length that’s somehow fundamentally anchored to the structure of the rule-space multiway system, and of physics. (By the way, there’ll be an analog of curvature and Einstein’s equations in rulial space too—and it probably corresponds to a geometrization of computational complexity theory and questions like whether P = NP.)

There’s more to say about the structure of rulial space. For example, let’s imagine we try to make a foliation in which we freeze time somewhere in rulial space. That’ll correspond to trying to describe the universe using some computationally reducible model—and over time it’ll get more and more difficult to maintain this as emulation cones effectively deliver more and more computational irreducibility.

So what does all this mean for our original goal—of finding a rule to describe our universe? Basically it’s saying that any (computation-universal) rule will do—if we’re prepared to craft the appropriate description language. But the point is that we’ve basically already defined at least some elements of our description language: they are the kinds of things our senses detect, our measuring devices measure, and our existing physics describes. So now our challenge is to find a rule that successfully describes our universe within this framework.

For me this is a very satisfactory solution to the mystery of why some particular rule would be picked for our universe. The answer is that there isn’t ultimately ever a particular rule; basically any rule capable of universal computation will do. It’s just that—with some particular mode of description that we choose to use—there will be some definite rule that describes our universe. And in a sense whatever specialness there is to this rule is just a reflection of the specialness of our mode of description. In effect, the only thing special about the universe to us is us ourselves.

And this suggests a definite answer to another longstanding question: could there be other universes? The answer in our setup is basically no. We can’t just “pick another rule and get another universe”. Because in a sense our universe already contains all possible rules, so there can only be one of it. (There could still be other universes that do various levels of hypercomputation.)

But there is something perhaps more bizarre that is possible. While we view our universe—and reality—through our particular type of description language, there are endless other possible description languages which can lead to descriptions of reality that will seem coherent (and even in some appropriate definition “meaningful”) within themselves, but which will seem to us to correspond to utterly incoherent and meaningless aspects of our universe.

I’ve always assumed that any entity that exists in our universe must at least “experience the same physics as us”. But now I realize that this isn’t true. There’s actually an almost infinite diversity of different ways to describe and experience our universe, or in effect an almost infinite diversity of different “planes of existence” for entities in the universe—corresponding to different possible reference frames in rulial space, all ultimately connected by universal computation and rule-space relativity.

The Challenge of Language Design for the Universe

What does it mean to make a model for the universe? If we just want to know what the universe does, well, then we have the universe, and we can just watch what it does. But when we talk about making a model, what we really mean is that we want to have a representation of the universe that somehow connects it to what we humans can understand. Given computational irreducibility, it’s not that we expect a model that will in any fundamental sense “predict in advance” the precise behavior of the universe down to every detail (like that I am writing this sentence now). But we do want to be able to point to the model—whose structure we understand—and then be able to say that this model corresponds to our universe.

In the previous section we said that we wanted to find a rule that we could in a sense connect with the description language that we use for the universe. But what should the description language for the rule itself be? Inevitably there is a great computational distance between the underlying rule and features of the universe that we’re used to describing. So—as I’ve said several times here in different ways—we can’t expect to use the ordinary concepts with which we describe the world (or physics) directly in the construction of the rule.

I’ve spent the better part of my life as a language designer, primarily building what’s now the full-scale computational language that is the Wolfram Language. And I now view the effort to find a fundamental theory of physics as in many ways just another challenge in language design—perhaps even the ultimate such challenge.

In designing a computational language what one is really trying to do is to create a bridge between two domains: the abstract world of what is possible to do computationally, and the “mental” world of what people understand and are interested in doing. There are all sorts of computational processes that one can invent (say running randomly picked cellular automaton rules), but the challenge in language design is to figure out which ones people care about at this point in human history, and then to give people a way to describe these.

Usually in computational language design one is leveraging human natural language—or the more formal languages that have been developed in mathematics and science—to find words or their analogs to refer to particular “lumps of computation”. But at least in the way I have done it, the essence of language design is to try to find the purest primitives that can be expressed this way.

OK, so let’s talk about setting up a model for the universe. Perhaps the single most important idea in my effort to find a fundamental theory of physics is that the theory should be based on the general computational paradigm (and not, for example, specifically on mathematics). So when we talk about having a language in which to describe our model of the universe we can see that it has to bridge three different domains. It has to be a language that humans can understand. It has to be a language that can express computational ideas. And it has to be a language that can actually represent the underlying structure of physics.

So what should this language be like? What kinds of primitives should it contain? The history that has led me to what I describe here is in many ways the history of my attempts to formulate an appropriate language. Is it trivalent graphs? Is it ordered graphs? Is it rules applied to abstract relations?

In many ways, we are inevitably skating at the edge of what humans can understand. Maybe one day we will have built up familiar ways of talking about the concepts that are involved. But for now, we don’t have these. And in a sense what has made this project feasible now is that we’ve come so far in developing ways to express computational ideas—and that through the Wolfram Language in particular those forms of expression have become familiar, at the very least to me.

And it’s certainly satisfying to see that the basic structure of the models we’re using can be expressed very cleanly and succinctly in the Wolfram Language. In fact, in what perhaps can be viewed as some sort of endorsement of the structure of the Wolfram Language, the models are in a sense just a quintessential example of transformation rules for symbolic expressions, which is exactly what the Wolfram Language is based on. But even though the structure is well represented in the Wolfram Language, the “use case” of “running the universe” is different from what the Wolfram Language is normally set up to do.

In the effort to serve what people normally want, the Wolfram Language is primarily about taking input, evaluating it by doing computation, and then generating output. But that’s not what the universe does. The universe in a sense had input at the very beginning, but now it’s just running an evaluation—and with all our different ideas of foliations and so on, we are sampling certain aspects of that ongoing evaluation.

It’s computation, but it’s computation sampled in a different way than we’ve been used to doing it. To a language designer like me, this is something interesting in its own right, with its own scientific and technological spinoffs. And perhaps it will take more ideas before we can finish the job of finding a way to represent a rule for fundamental physics.

But I’m optimistic that we actually already have pretty much all the ideas we need. And we also have a crucial piece of methodology that helps us: our ability to do explorations through computer experiments. If we based everything on the traditional methodology of mathematics, we would in effect only be able to explore what we somehow already understood. But in running computer experiments we are in effect sampling the raw computational universe of possibilities, without being limited by our existing understanding.

Of course, as with physical experiments, it matters how we define and think about our experiments, and in effect what description language we use. But what certainly helps me, at least, is that I’ve now been doing computer experiments for more than forty years, and over that time I’ve been able to slowly refine the art and science of how best to do them.

In a way it’s very much like how we learn from our experience in the physical world. From seeing the results of many experiments, we gradually build up intuition, which in turn lets us start creating a conceptual framework, which then informs the design of our language for describing things. One always has to keep doing experiments, though. In a sense computational irreducibility implies that there will always be surprises, and that’s certainly what I constantly find in practice, not least in this project.

Will we be able to bring together physics, computation and human understanding to deliver what we can reasonably consider to be a final, fundamental theory of physics? It is difficult to know how hard this will be. But I am extremely optimistic that we are finally on the right track, and may even have effectively already solved the fascinating problem of language design that this entails.

Let’s Go Find the Fundamental Theory!

OK, so given all this, what’s it going to take to find the fundamental theory of physics? The most important thing—about which I’m extremely excited—is that I think we’re finally on the right track. Of course, perhaps not surprisingly, it’s still technically difficult. Part of that difficulty comes directly from computational irreducibility and from the difficulty of working out the consequences of underlying rules. But part of the difficulty also comes from the very success and sophistication of existing physics.

In the end our goal must be to build a bridge that connects our models to existing knowledge about physics. And there is difficult work to do on both sides: trying to frame the consequences of our models in terms that align with existing physics, and trying to frame the (usually mathematical) structures of existing physics in terms that align with our models.

For me, one of the most satisfying aspects of our discoveries over the past couple of months has been the extent to which they end up resonating with a huge range of existing—sometimes so far seemingly “just mathematical”—directions that have been taken in physics in recent years. It almost seems like everyone has been right all along, and it just takes adding a new substrate to see how it all fits together. There are hints of string theory, holographic principles, causal set theory, loop quantum gravity, twistor theory, and much more. And not only that, there are also modern mathematical ideas—geometric group theory, higher-order category theory, non-commutative geometry, geometric complexity theory, etc.—that seem so well aligned that one might almost think they must have been built to inform the analysis of our models.

I have to say I didn’t expect this. The ideas and methods on which our models are based are very different from what’s ever been seriously pursued in physics, or really even in mathematics. But somehow—and I think it’s a good sign all around—what’s emerged is something that aligns wonderfully with lots of recent work in physics and mathematics. The foundations and motivating ideas are different, but the methods (and sometimes even the results) often look to be quite immediately applicable.

There’s something else I didn’t expect, but that’s very important. In studying things (like cellular automata) out in the computational universe of simple programs, I have normally found that computational irreducibility—and phenomena like undecidability—are everywhere. Try using sophisticated methods from mathematics; they will almost always fail. It is as if one hits the wall of irreducibility almost immediately, so there is almost nothing for our sophisticated methods, which ultimately rely on reducibility, to do.

But perhaps because they are so minimal and so structureless our models for fundamental physics don’t seem to work this way. Yes, there is computational irreducibility, and it’s surely important, both in principle and in practice. But the surprising thing is that there’s a remarkable depth of richness before one hits irreducibility. And indeed that’s where many of our recent discoveries come from. And it’s also where existing methods from physics and mathematics have the potential to make great contributions. But what’s important is that it’s realistic that they can; there’s a lot one can understand before one hits computational irreducibility. (Which is, by the way, presumably why we are fundamentally able to form a coherent view of physical reality at all.)

So how is the effort to try to find a fundamental theory of physics going to work in practice? We plan to have a centralized effort that will push forward with the project using essentially the same R&D methods that we’ve developed at Wolfram Research over the past three decades, and that have successfully brought us so much technology—not to mention what exists of this project so far. But we plan to do everything in a completely open way. We’ve already posted the full suite of software tools that we’ve developed, along with nearly a thousand archived working notebooks going back to the 1990s, and soon more than 400 hours of videos of recent working sessions.

We want to make it as easy for people to get involved as possible, whether directly in our centralized effort, or in separate efforts of their own. We’ll be livestreaming what we do, and soliciting as much interaction as possible. We’ll be running a variety of educational programs. And we also plan to have (livestreamed) working sessions with other individuals and groups, as well as providing channels for the computational publishing of results and intermediate findings.

I have to say that for me, working on this project both now and in past years has been tremendously exciting, satisfying, and really just fun. And I’m hoping many other people will be able to share in this as the project goes forward. I think we’ve finally got a path to finding the fundamental theory of physics. Now let’s go follow that path. Let’s have a blast. And let’s try to make this the time in human history when we finally figure out how this universe of ours works!

The Wolfram Physics Project: The First Two Weeks


First, Thank You!

We launched the Wolfram Physics Project two weeks ago, on April 14. And, in a word, wow! People might think that interest in fundamental science has waned. But the thousands of messages we’ve received tell a very different story. People really care! They’re excited. They’re enjoying understanding what we’ve figured out. They’re appreciating the elegance of it. They want to support the project. They want to get involved.

It’s tremendously encouraging—and motivating. I wanted this project to be something for the world—and something lots of people could participate in. And it’s working. Our livestreams—even very technical ones—have been exceptionally popular. We’ve had lots of physicists, mathematicians, computer scientists and others asking questions, making suggestions and offering help. We’ve had lots of students and others who tell us how eager they are to get into doing research on the project. And we’ve had lots of people who just want to tell us they appreciate what we’re doing. So, thank you!

The Wolfram Physics Project: The First Two Weeks

Real-Time Science

Science is usually done behind closed doors. But not this project. This project is an open project where we’re sharing—in as real time as we can—what we’re doing and the tools we’re using. In the last two weeks, we’ve done more than 25 hours of livestreams about the project. We’ve given introductions to the project—both lecture style and Q&A. We’ve done detailed technical sessions. And we’ve started livestreaming our actual working research sessions. And in a couple of those sessions we’ve made the beginnings of some real discoveries—live and in public.

Wolfram Physics Livestream Archives

It’s pretty cool to see thousands of people joining us to experience real-time science. (Our peak so far was nearly 8000 simultaneous viewers, and a fairly technical 2-hour session ended up being watched for a total of more than three-quarters of a million minutes.) And we’re starting to see serious “public collaboration” happening, in real time. People are making technical suggestions, sending us links to relevant papers, even sending us pieces of Wolfram Language code to run—all in real time.

One of the great—and unexpected—things about the project is how well what we’ve discovered seems to dovetail with existing initiatives (like string theory, holographic principles, spin networks, higher categories, twistor theory, etc.). We’re keen to understand more about this, so one of the things we’ll be doing is having livestreamed discussions with experts in these various areas.

The Summer School Approaches

It’s only been two weeks since our project was launched—and there’ve already been some interesting things written about it that have helped sharpen my philosophical understanding. There hasn’t yet been time for serious scientific work to have been completed around the project… but we know people are on this path.

We also know that there are lots of people who want to get to the point where they can make serious contributions to the project. And to help with that, we’ve got an educational program coming up: we’ve added a Fundamental Physics track to our annual Wolfram Summer School.

Wolfram Summer School

Our Summer School—which has been running since 2003—is a 3-week program, focused on every participant doing a unique, original project. For the Fundamental Physics track, we’re going to have a “week 0” (June 22–27) that will be lectures and workshops about the Physics Project, followed by a 3-week project-based program (June 28–July 17).

This year’s Summer School will (for the first time) be online (though synchronous), so it’s going to be easier for students from around the world to attend. Many of the students for the Fundamental Physics track will be graduate students or postdocs, but we also expect to have students who are more junior, as well as professors and professionals. Since announcing the program last week, we’ve already received many good applications… but we’re going to try to expand the program to accommodate everyone who makes sense. (So if you’re thinking of applying, please just apply… though do it as soon as you can!)

I’m very excited about what’s going to be achieved at the Summer School. I never expected our whole project to develop as well—or as quickly—as it has. But at this point I think we’ve developed an approach and a methodology that are going to make possible rapid progress in many directions. And I’m fully expecting that there’ll be projects at the Summer School that lead, for example, to academic papers that rapidly become classics.

This is one of those rare times when there’s a lot of exceptionally juicy low-hanging fruit—and I’m looking forward to helping outstanding students find and pick that scientific fruit at our Summer School.

New Science in the First Two Weeks

It’s not too surprising that much of our time in the first two weeks after launching the project has been spent on “interfacing with the world”—explaining what we’re doing, trying to respond to thousands of messages, and setting up internal and external systems that can make future interactions easier.

But we’ve been very keen to go on working on the science, and some of that has been happening too. We’ve so far done five livestreamed working sessions, three on spin and charge, one on the interplay with distributed computing, and one on combinators and physics. Of course, this is just what we’re directly working on ourselves. We’ve also already helped several people get started on projects that use their expertise—in physics, mathematics or computer science—and it’s wonderful to see the beginning of this kind of “scaling up”.

But let me talk a bit about things I think I’ve learned in the past two weeks. Some of this comes from the working sessions we’ve had; some is in response to questions at our Q&As; and some is just the result of my slowly growing understanding—particularly helped by my efforts in explaining the project to people.

What Is Angular Momentum?

OK, so here’s something concrete that came out of our working session last Thursday: I think we understand what angular momentum is. Here’s part of where we figured that out:

Physics working session

We already figured out a few months ago what linear momentum is. If you want to know the amount of linear momentum in a particular direction at a particular place in the hypergraph, you just have to see how much “activity” at that place in the hypergraph is being transferred in that “direction”.

Directions are defined by geodesics that give the shortest path between one point and another. Momentum in a particular direction then corresponds to the extent to which an update at one point leads to updates at nearby points along that direction. (More formally, the statement is that momentum is given by the flux of causal edges through timelike hypersurfaces.)
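
For concreteness, here is the kind of object whose causal edges we are counting: a spacetime causal graph for a small Wolfram model (this particular rule and step count are chosen purely for illustration):

ResourceFunction["WolframModel"][
  {{x, y, y}, {y, z, u}} -> {{u, z, z}, {u, x, v}, {y, u, v}},
  {{0, 0, 0}, {0, 0, 0}}, 20, "CausalGraph"]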

OK, so how about angular momentum? Well, it took us a total of nearly 6 hours, over three sessions, but here’s what we figured out. (And kudos to Jonathan Gorard for having had a crucial idea.)

So, first, what’s the usual concept of angular momentum in physics? It’s all about turning. It’s all about momentum that doesn’t add up to go in any particular direction but just circulates around. Here’s the picture we used on the livestream:

VectorPlot[{y, -x}, {x, -3, 3}, {y, -3, 3}]

Imagine this is a fluid, like water. The fluid isn’t flowing in a particular direction. Instead, it’s just circulating around, creating a vortex. And this vortex has angular momentum.

But what might the analog of this be in a hypergraph? To figure this out, we have to understand what rotation really is. It took us a little while to untangle this, but in the end it’s very simple. In any number of dimensions, a rotation is something that takes two vectors rooted at a particular point, and transforms one into the other. On the livestream, we used the simple example:

Graphics3D[{Thick, InfinitePlane[{{0, 0, 0}, {1, 0, 0}, {0, 1, 2}}],
  Arrow[{{0, 0, 0}, {1, 0, 0}}], Arrow[{{0, 0, 0}, {0, 1, 2}}]}]

And in the act of transforming one of these vectors into the other we’re essentially sweeping out a plane. We imagined filling in the plane by making something like a string figure that joins points on the two vectors:

Graphics3D[Table[Line[{{i, 0, 0}, {0, j, 0}}], {i, 10}, {j, 10}]]

But now there’s an easy generalization to the hypergraph. A single geodesic defines a direction. Two geodesics—and the geodesics “strung” between them—define a plane. Here’s what we created to give an illustration of this:

Generalization to the hypergraph

So now we are beginning to have a picture of angular momentum: it is “activity” that “circulates around” in this little “patch of plane” defined by two geodesics from a particular point. We can get considerably more formal than this, talking about flux of causal edges in slices of tubes defined by pairs of geodesics. On the livestream, we started relating this to the tensor J^μν that defines relativistic angular momentum (the two indices of J^μν basically correspond to our two geodesics).

There are details to clean up, and further to go. (Rotating frames in general relativity? Rotating black holes? Black-hole “no hair” theorems? Etc.) But this was our first real “aha” moment in a public working session. And of course there’s an open archive both of the livestream itself, and the notebook created in it.

What about Quantum Angular Momentum and Spin?

One of the reasons I wanted to think about angular momentum was because of quantum mechanics. Unlike ordinary momentum, angular momentum is quantized, even in traditional physics. And, more than that, even supposedly point particles—like electrons—have nonzero quantized spin angular momentum.

We don’t yet know how this works in our models. (Stay tuned for future livestreamed working sessions!) But one point is clear: it has to involve not just the spatial hypergraph and the spacetime causal graph (as in our discussion of angular momentum above), but also the multiway causal graph.

And that means we’re dealing not just with a single rotation, but a whole collection of interwoven ones. I have a suspicion that the quantization is going to come from something essentially topological. If you’re looking at, say, fluid flow near a vortex, then when you go around a small circle adding up the flow at every point, you’ll get zero if the circle doesn’t include the center of the vortex, and some quantized value if it does (the value will be directly proportional to the number of times you wind around the vortex).
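
Here is a quick numerical check of that circulation picture for an idealized point vortex v = {y, -x}/(x^2 + y^2); this is just standard fluid-flow arithmetic, not anything specific to our models. The line integral around a loop comes out as 2 Pi times the winding number about the vortex center (up to sign), and zero for loops that miss it:

(* circulation of the point-vortex field around a circle of radius r centered at a given point *)
v[{x_, y_}] := {y, -x}/(x^2 + y^2);
circulation[center_, r_] := NIntegrate[
   v[center + r {Cos[t], Sin[t]}] . (r {-Sin[t], Cos[t]}), {t, 0, 2 Pi}];
{circulation[{0, 0}, 1], circulation[{3, 0}, 1]}  (* roughly {-2 Pi, 0} *)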

Assuming we’ve got a causal-invariant system, one feature of the multiway causal graph is that it must consist of many copies of the same spacetime causal graph—in a sense laid out (albeit with complicated interweaving) in branchial space. And it’s also possible (as Jonathan suggested on the livestream) that somehow when one measures an angular momentum—or a spin—one is effectively picking up just a certain discrete number of “histories”, or a discrete number of identical copies of the spacetime causal graph.

But we’ll see. I won’t be surprised if both ideas somehow dovetail together. But maybe we’ll need some completely different idea. Either way, I suspect there’s going to be somewhat sophisticated math involved. We have a guess that the continuum limit of the multiway causal graph is something like a twistor space. So then we might be dealing with homotopy in twistor space—or, more likely, some generalization of that.

On the livestream, various people asked about spinors. We ordinarily think of a rotation through 360° as bringing everything back to where it started from. But in quantum mechanics that’s not how things work. Instead, for something like an electron, it takes a rotation through 720°. And mathematically, that means we’re dealing with so-called spinors, rather than vectors. We don’t yet know how this could come out in our models (though we have some possible ideas)—but this is something we’re planning to explore soon. (It’s again mathematically complicated, because we’re not intrinsically dealing with integer-dimensional space, so we’ve got to generalize the notion of rotation, rotation groups, etc.)

And as I write this, I have a new idea—of trying to see how relativistic wave equations (like the Klein–Gordon equation for spin-0 particles or the Dirac equation for spin-1/2 particles) might arise from thinking about bundles of geodesics in the multiway causal graph. The suspicion is that there would be a subtle relationship between effective spacetime dimension and symmetries associated with the bundle of geodesics, mirroring the way that in traditional relativistic quantum mechanics one can identify different spins with objects transforming according to different irreducible representations of the symmetry group of spacetime.

CPT Invariance?

Related to the whole story about spinors, there’s a fundamental result in quantum field theory called the spin-statistics theorem that says that particles with half-integer spins (like electrons) are fermions (and so obey the exclusion principle), while particles with integer spins (like photons) are bosons (and so can form condensates). And this in turn is related to what’s called CPT invariance.

And one of the things that came out of a livestream last week is that there’s potentially a very beautiful interpretation of CPT invariance in our models.

What is CPT invariance? C, P and T correspond to three potential transformations applied to physical systems. T is time reversal, i.e. having time run in reverse. P is parity, or space inversion: reversing the sign of all spatial coordinates. And C is charge conjugation: turning particles (like electrons) into antiparticles (like positrons). One might think that the laws of physics would be invariant under any of these transformations. But in fact, each of C, P and T invariance is violated somewhere in particle physics (and this fact was a favorite of mine back when I did particle physics for a living). However, the standard formalism of quantum field theory implies that there is still invariance under the combined CPT transformation—and, so far as one can tell, this is experimentally correct.

OK, so what do C, P and T correspond to in our models? Consider the multiway causal graph. Here’s a toy version of it, that we discussed in a livestream last week:

Graph3D[GridGraph[{6, 6, 6}]]

Edges in one direction (say, down) correspond to time. Edges in another direction correspond to space. And edges in the third direction correspond to branchial space (i.e. the space of quantum states).

T and P then have simple interpretations: they correspond to reversing time edges and space edges, respectively. C is a little less clear, but we suspect that it just corresponds to reversing branchial edges (and this very correspondence probably tells us something about the nature of antiparticles).

So then CPT is like a wholesale inversion of the multiway causal graph. But what can we say about this? Well, we’ve argued that (with certain assumptions) spacetime slices of the multiway causal graph must obey the Einstein equations. Similarly, we’ve argued that branchtime slices follow the Feynman path integral. But now there’s a generalization of both these things: in effect, a generalization of the Einstein equations that applies to the whole multiway causal graph. It’s mathematically complicated—because it must describe the combined geometry of physical and branchial space. But it looks as if CPT invariance must just correspond to a symmetry of this generalized equation. And to me this is something very beautiful—that I can hardly wait to investigate more.

What’s Going On in Quantum Computing?

One feature of our models is that they potentially make it a lot more concrete what’s going on in quantum computing. And over the past couple of weeks we’ve started to think about what this really means.

There are two basic points. First, the multiway graph provides a very explicit representation of “quantum indeterminacy”. And, second, thinking about branchial space (and quantum observation frames) gives more concreteness to the notion of quantum measurement.

A classical computer like an ordinary Turing machine is effectively just following one deterministic path of evolution. But the qualitative picture of a quantum computer is that instead it’s simultaneously following many paths of evolution, so that in effect it can do many Turing-machine-like computations in parallel.

But at the end, there’s always the issue of finding which path or paths have the answer you want: and in effect you have to arrange your measurement to just pick out these paths.

In ordinary Turing machines, there are problems (like multiplying numbers) that are in the class P, meaning that they can be done in a number of steps polynomial in the size of the problem (say, the number of digits in the numbers). There are also problems (like factoring numbers) that are in the class NP, meaning that if you were to “non-deterministically” guess the answer, you could check it in polynomial time.

A core question in theoretical computer science (which I have views on, but won’t discuss here) is whether P=NP, that is, whether all NP problems can actually be done in polynomial time.

One way to imagine doing an NP problem in polynomial time is not to use an ordinary Turing machine, but instead to use a “non-deterministic Turing machine” in which there is a tree of possible paths where one can pick any path to follow. Well, our multiway system representing quantum mechanics essentially gives that whole tree (though causal invariance implies that ultimately the branches always merge).
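
Here is a tiny sketch of such a tree of possible paths, as a multiway system over string substitutions (using the "MultiwaySystem" resource function mentioned below); notice that branches both split and merge:

ResourceFunction["MultiwaySystem"][
  {"A" -> "AB", "B" -> "A"}, {"A"}, 6, "StatesGraph"]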

For the last several years, we’ve been developing a framework for quantum computing in the Wolfram Language (which we’re hoping to release soon). And in this framework we’re essentially describing two things: how quantum information is propagated with time through some series of quantum operations, and how the results of quantum processes are measured. More formally, we have time evolution operators, and we have measurement operators.

Well, here’s the first neat thing we’ve realized: we can immediately reformulate our quantum computing framework directly in terms of multiway systems. The quantum computing framework can in effect just be viewed as an application of our MultiwaySystem function that we put in the Wolfram Function Repository for the Physics Project.

But now that we’re thinking in terms of multiway systems—or the multiway causal graph—we realize that standard quantum operations are effectively associated with timelike causal edges, while measurement operations are associated with branchlike causal edges. And the extent to which one can get answers before decoherence takes over has to do with a competition between these kinds of edges.

This is all very much in progress right now, but in the next few weeks we’re expecting to be able to look at well-known quantum algorithms in this context, and see whether we can analyze them in a way that treats time evolution and measurement on a common footing. (I will say that ever since the work I did with Richard Feynman on quantum computing back in the early 1980s, I have always wanted to really understand the “cost of measurement”, and I’m hoping that we’ll finally be able to do that now.)

Numerical Relativity; Numerical Quantum Field Theory

Although the traditional view in physics is that space and time are continuous, when it comes to doing actual computer simulations they usually in the end have to be discretized. And in general relativity (say for simulating a black hole merger) it’s usually a very subtle business, in which the details of the discretization are hard to keep track of, and hard to keep consistent.

In our models, of course, discretization is not something “imposed after the fact”, but rather something completely intrinsic to the model. So we started wondering whether somehow this could be used in practice to set up simulations.

It’s actually a very analogous idea to something I did rather successfully in the mid-1980s for fluid flow. In fluids, as in general relativity, there’s a traditional continuum description, and the most obvious way of doing simulations is by discretizing this. But what I did instead was to start with an idealized model of discrete molecules—and then to simulate lots of these molecules. My interest was to understand the fundamental origins of things like randomness in fluid turbulence, but variants of the method I invented have now become a standard approach to fluid simulation.

So can one do something similar with general relativity? The actual “hypergraph of the universe” would be on much too tiny a scale for it to be directly useful for simulations. But the point is that even on a much larger scale our models can still approximate general relativity—but unlike “imposed after the fact” discretization, they are guaranteed to have a certain internal consistency.

In the usual approaches to “numerical relativity”, one of the most difficult things is dealing with progressive “time evolution”, not least because of the arbitrariness in what coordinates one should use for “space” and “time”. But in our models there’s a way of avoiding this and directly getting a discrete structure that can be used for simulation: just look at the spacetime causal graph.

There are lots of details, but—just like in the fluid flow case—I expect many of them won’t matter. For example, just like lots of rules for discrete molecules yield the same limiting thermodynamic behavior, I expect lots of rules for the updating events that give the causal graph will yield the same limiting spacetime structure. (Like in standard numerical analysis, though, different rules may have different efficiency and show different pathologies.)

It so happens that Jonathan Gorard’s “day job” has centered around numerical relativity, so he was particularly keen to give this a try. But even though we thought we had just started talking about the idea in the last couple of weeks, Jonathan noticed that actually it was already there on page 1053 of A New Kind of Science—and had been languishing for nearly 20 years!

Still, we immediately started thinking about going further. Beyond general relativity, what about quantum field theory? Things like lattice gauge theory typically involve replacing path integrals by “thermal averages”—or effectively operating in Euclidean rather than Minkowski spacetime. But in our models, we potentially get the actual path integral as a limit of the behavior of geodesics in a multiway graph. Usually it’s been difficult to get a consistent “after the fact” discretization of the path integral; but now it’s something that emerges from our models.

We haven’t tried it yet (and someone should!). But independent of nailing down precisely what’s ultimately underneath quantum field theory, it seems like the very structure of our models has a good chance of being helpful in dealing in practice with quantum field theory as we already know it.

Surprise: It’s Not Just about Physics

One of the big surprises of the past two weeks has been our increasing realization that the formalism and framework we’re developing really aren’t just relevant to physics; they’re potentially very important elsewhere too.

In a sense this shouldn’t be too surprising. After all, our models were constructed to be as minimal and structureless as possible. They don’t have anything intrinsically about physics in them. So there’s no reason they can’t apply to other things too.

But there’s a critical point here: if a model is simple enough, one can expect that it could somehow be a foundation for many different kinds of things. Long ago I found that with the 1D cellular automata I studied. The 256 “elementary” cellular automata are in a sense the very simplest models of a completely discrete system with a definite arrangement of neighbors. And over the years essentially all of these 256 cellular automata found uses as models for bizarrely different things (pigmentation, catalysis, traffic, vision, etc.).

Well, our models now are in a sense the most minimal that describe systems with rules based on arbitrary relationships (as represented by collections of relations).

And the first big place where it seems the models can be applied is in distributed computing. What is distributed computing? Essentially it’s about having a whole collection of computing elements that are communicating with others to collectively perform a computation.

In the simplest setup, one just assumes that all the computing elements are operating in lockstep—like in a cellular automaton. But what if the computing elements are instead operating asynchronously, sending data to each other when it happens to be ready?

Well, this setup immediately seems a lot more like the situation we have in our models—or in physics—where different updates can happen in any order, subject only to following the causal relationships defined by the causal graph.

But now there start to be interesting analogies between the distributed computing case and physics. And indeed what’s got me excited is that I think there’s going to be a very fruitful interplay between these areas. Ideas in distributed computing are going to be useful for thinking about physics—and vice versa.

I’m guessing that phenomena and results in distributed computing are going to have direct analogs in general relativity and in quantum mechanics. (“A livelock is like a closed timelike curve”, etc.) And that ideas from physics in the context of our models are going to give one new ways to think about distributed computing. (Imagine “programming in a particular reference frame”, etc.)

In applying our models to physics, a central idea is causal invariance. And this has an immediate analog in distributed computing: it’s the idea of eventual consistency, or in other words that it doesn’t matter what order operations are done in; the final result is always the same.
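
As a sketch of what checking this looks like in practice, one can ask whether all the different orders of updating a simple string system eventually reconverge. (I'm assuming here the "CausalInvariantQ" property of the "MultiwaySystem" resource function; the rule itself is just a toy sorting rule.)

(* "BA" -> "AB" just sorts a string; every update order reaches the same final result *)
ResourceFunction["MultiwaySystem"][
  {"BA" -> "AB"}, {"ABBA"}, 6, "CausalInvariantQ"]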

But here’s something from physics: our universe (fortunately!) doesn’t seem like it’s going to halt with a definite “final result”. Instead, it’s just continually evolving, but with causal invariance implying various kinds of local equivalence and consistency. And indeed many modern distributed computing systems are again “just running” without getting to “final results” (think: the internet, or a blockchain).

Well, in our approach to physics the way we handle this is to think in terms of foliations and reference frames—which provide a way to organize and understand what’s going on. And I think it’s going to be possible to think about distributed computing in the same kind of way. We need some kind of “calculus of reference frames” in terms of which we can define good distributed computing primitives.

In physics, reference frames are most familiar in relativity. The most straightforward are inertial frames. But in general relativity there’s been slow but progressive understanding of other kinds of frames. And in our models we’re also led to think about “quantum observation frames”, which are essentially reference frames in the branchial space of quantum states.

Realistically, at least for me, it’s so far quite difficult to wrap one’s head around these various kinds of reference frames. But I think in many ways this is at its root a language design problem. Because if we had a good way to talk about working with reference frames we’d be able to use them in distributed computing and so we’d get familiar with them. And then we’d be able to import our understanding to physics.

One of the most notable features of our models for physics when it comes to distributed computing is the notion of multiway evolution. Usually in distributed computing one’s interested in looking at a few paths, and making sure that, for example, nothing bad can happen as a result of different orders of execution. But in multiway systems we’re not just looking at a few paths; we’re looking at all paths.

And in our models this isn’t just some kind of theoretical concept; it’s the whole basis for quantum mechanics. And given that we’re looking at all paths, we’re led to invent things like quantum observation frames, and branchial space. We can think of the branching of paths in the multiway system as corresponding to elementary pieces of ambiguity. And in a sense the handling of our model—and the features of physics that emerge—is about having ways to deal with “ambiguity in bulk”.

Is there an analog of the Feynman path integral in distributed computing? I expect so—and I wouldn’t be surprised if it’s very useful in giving us a way to organize our thinking and our programs.

In theoretical analyses of distributed computing, one usually ignores physical space—and the speed of light. But with our models, it’s going to be possible to account for such things, alongside branchial connections, which are more like “instantaneous network connections”. And, for example, there’ll be analogs of time dilation associated with motion in both physical space and branchial space. (Maybe such effects are already known in distributed computing; I’m not sure.)

I think the correspondence between distributed computing and physics in the context of our models is going to be incredibly fertile. We already did one livestreamed working session about it (with Taliesin Beynon as a guest); we’ll be doing more.

Distributed computing

In the working session we had, we started off discussing vector clocks in distributed computing, and realized that they’re the analog of geodesic normal coordinates in physics. Then we went on to discuss more of the translation dictionary between distributed computing and physics. We realized that race conditions correspond to branch pairs. The branchial graph defines sibling tasks. Reads and writes are just incoming and outgoing causal edges. We invented the idea of a “causal exclusion graph”, which is a kind of complement of a causal graph, saying not what events can follow a given event, but rather what events can’t follow a given event.
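
For reference, here is the kind of branchial graph we had on screen when mapping race conditions to branch pairs and sibling tasks; the particular toy system and the "BranchialGraph" property used here are just illustrative:

ResourceFunction["MultiwaySystem"][
  {"A" -> "AB", "B" -> "A"}, {"AAB"}, 4, "BranchialGraph"]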

We started discussing applications—like clustered databases, multiplayer games and trading in markets. We talked about things like Git, where merge conflicts are like violations of causal invariance. We talked a bit about blockchains—but it seemed like there were richer analogs in hashgraphs and things like NKN and IOTA. Consensus somehow seemed to be the analog of “classicality”, but then there’s the question of how much can be achieved in the “quantum regime”.

Although for me the notion of seriously using ideas from physics to think about distributed computing is basically less than two weeks old, I’ve personally been wondering about how to do programming for distributed computing for a very long time. Back in the mid-1980s, for example, when I was helping a company (Thinking Machines Corporation) that was building a 65536-processor computer (the Connection Machine), I thought the most plausible way to do programming on such a system would be through none other than graph rewriting.

But at the time I just couldn’t figure out how to organize such programming so that programmers could understand what was going on. But now—through thinking about physics—I’m pretty sure there’s going to be a way. We’re already used to the idea (at least in the Wolfram Language) that we can write a program functionally, procedurally, declaratively, etc. I think there are going to be ways to write distributed programs “in different reference frames”. It’s probably going to be more structured and more parametrized than these different traditional styles of programming. But basically it’ll be a framework for looking at a given program in different ways, and using different foliations to understand and describe what it’s supposed to do.

I have to mention one more issue that’s been bugging me since 1979. It has to do with recursive evaluation. Imagine we’ve defined a Fibonacci recursion:

f[n_] := f[n - 1] + f[n - 2]
f[1] = f[2] = 1

Now imagine you enter f[10]. How should you evaluate this? At the first step you get f[9]+f[8]. But after that, do you just keep “drilling down” the evaluation of f[9] in a “depth-first way”, until it gets to 1s, or do you for example notice that you get f[8]+f[7]+f[8], and then collect the f[8]s and evaluate them only once?
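
One rough way to see the difference between these two strategies is to count how many times the general rule fires under plain depth-first evaluation versus under memoization, where repeated values like f[8] are collected and evaluated only once. (The definitions below are purely illustrative sketches, not part of any framework.)

ClearAll[fDeep, fCollect];
deepCalls = 0;
fDeep[n_] := (deepCalls++; fDeep[n - 1] + fDeep[n - 2]);  (* re-derives shared subexpressions *)
fDeep[1] = fDeep[2] = 1;

collectCalls = 0;
fCollect[n_] := (collectCalls++; fCollect[n] = fCollect[n - 1] + fCollect[n - 2]);  (* caches each value *)
fCollect[1] = fCollect[2] = 1;

{fDeep[20], deepCalls}       (* same value, but thousands of rule firings *)
{fCollect[20], collectCalls} (* same value, each f[k] evaluated just once *)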

In my Mathematica-precursor system SMP, I tried to parametrize this behavior, but realistically nobody understood it. So my question now is: given the idea of reference frames, can we invent some kind of notion of “evaluation fronts” that can be described like foliations, and that define the order of recursive evaluation?

An extreme case of this arises in evaluating S, K combinators. Even though S, K combinators are 100 years old this year, they remain extremely hard to systematically wrap one’s head around. And part of the reason has to do with evaluation orders. It’s fine when one manages to get a combinator expression that can successfully be evaluated (through some path) to a fixed point. But what about one that just keeps “evolving” as you try to evaluate it? There doesn’t seem to be any good formalism for handling that. But I think our physics-based approach may finally deliver this.

So, OK, the models that we invented for physics also seem highly relevant for distributed computing. But what about for other things? Already we’ve thought about two other—completely different—potential applications.

The first, that we actually discussed a bit even the week before the Physics Project was launched, has to do with digital contact tracing in the context of the current pandemic. The basic idea—that we discussed in a livestreamed brainstorming session—is that as people move around with their cellphones, Bluetooth or other transactions can say when two phones are nearby. But the graph of what phones were close to what phones can be thought of as being like a causal graph. And now the question of whether different people might have been close enough in space and time for contagion becomes one of reconstructing spatial graphs by making plausible foliations of the causal graph. There are bigger practical problems to solve in digital contact tracing, but assuming these are solved, the issues that can be informed by our models are likely to become important. (By the way, given a network of contacts, the spreading of a contagious disease on it can be thought of as directly analogous to the growth of a geodesic ball in it.)

One last thing that’s still just a vague idea is to apply our models to develop a more abstract approach to biological evolution and natural selection (both for the overall tree of life, and for microorganisms and tumors). Why might there be a connection? The details aren’t yet clear. Perhaps something like the multiway graph (or rule-space multiway graph) can be used to represent the set of all possible sequences of genetic variations. Maybe there’s some way of thinking about the genotype-phenotype correspondence in terms of the correspondence between multiway graphs and causal graphs. Maybe different sequences of “environments” correspond to different foliations, sampling different parts of the possible sequence of genetic variations. Maybe speciation has some correspondence with event horizons. Most likely there’ll need to be some other layer or variation on the models to make them work. But I have a feeling that something is going to be possible.

It’s been possible for a long time to make “aggregated” models of biological evolution, where one’s looking at total numbers of organisms of some particular type (with essentially the direct analog of differential-equation-based aggregated epidemiological models). But at a more individual-organism level one’s typically been reduced to doing simulations, which tend to have messy issues like just how many “almost fittest” organisms should be kept at every “step” of natural selection. It could be that the whole problem is mired in computational irreducibility. But the robust way in which one seems to be able to reason in terms of natural selection suggests to me that—like in physics—there’s some layer of computational reducibility, and one just has to find the right concepts to be able to develop a more general theory on the basis of it. And maybe the models we’ve invented for physics give us the framework to do this.

Some Coming Attractions

We’re at a very exciting point—where there are an incredible amount of “obvious directions” to go. But here are a few that we’re planning on exploring in the next few days, in our livestreamed working sessions.

The Fine Structure of Black Holes

In traditional continuum general relativity it always seems a bit shocking when there’s some kind of discontinuity in the structure of spacetime. In our fundamentally discrete model it’s a bit less shocking, and in fact things like black holes (and other kinds of spacetime singularities) seem to arise very naturally in our models.

But what exactly are black holes like in our models? Do they have the same kind of “no hair” perfection as in general relativity—where only global properties like mass and angular momentum affect how they ultimately look from outside? And how do our black holes generate things like Hawking radiation?

In a livestream last week, we generated a very toy version of a black hole, with a causal graph of the form:

ResourceFunction["MultiwaySystem"]
&#10005
ResourceFunction["MultiwaySystem"][{"A" -> "AB", "XABABX" -> "XXXX", \
"XXXX" -> "XXXXX"}, {"XAAX"}, 8, "CausalGraphStructure"] // \
LayeredGraphPlot

This “black hole” has the feature that causal edges go into it, but none come out. In other words, things can affect the black hole, but the black hole can’t causally affect anything else. It’s the right basic idea, but there’s a lot missing from the toy version, which isn’t surprising, not least because it’s based on a simple string substitution system, and not even a hypergraph.

What we now need to do is to find more realistic examples. Then what we’re expecting is that it’ll actually be fairly obvious that the black hole only has certain properties. The mass will presumably relate to the number of causal edges that go into the black hole. And now that we have an idea what angular momentum is, we should be able to identify how much of that is going in as well. And maybe we’ll be able to see that there’s a limit on the amount of angular momentum a black hole of a given mass can have (as there seems to be in general relativity).

Some features of black holes we should be able to see by looking at ordinary spacetime causal graphs. But to understand Hawking radiation we’re undoubtedly also going to have to look at multiway causal graphs. And we’re hoping that we’ll actually be able to explicitly see the presence of both the causal event horizon and the entanglement event horizon—so that we’ll be able to trace the fate of quantum information in the “life cycle” of the black hole.

All the Spookiness of Quantum Mechanics

Quantum mechanics is notorious for yielding strange phenomena that can be computed within its formalism, but which seem essentially impossible to account for in any other way. Our models, however, finally provide a definite suggestion for what is “underneath” quantum mechanics—and from our models we’ve already been able to derive many of the most prominent phenomena in quantum mechanics.

But there are plenty more phenomena to consider, and we’re planning to look at this in working sessions starting later this week. One notable phenomenon that we’ll be looking at is the violation of Bell’s inequality—which is often said to “prove” that no “deterministic” theory can reproduce the predictions of quantum mechanics. Of course, our theory isn’t “deterministic” in the usual sense. Yes, the whole multiway graph is entirely determined by the underlying rule. But what we observe depends on measurements that sample collections of branches determined by the quantum observation frames we choose.

But we’d still like to see explicitly how Bell’s inequality is violated—and in fact we suspect that in our multiway graph formalism it’ll be much more straightforward to see how this and its various generalizations work. But we’ll see.

In Q&A sessions that we’ve done, and messages that we’ve received, there’ve been many requests to reproduce a classic quantum result: interference in the double-slit experiment. A few months ago, I would have been very pessimistic about being able to do this. I would have thought that first we’d have to understand exactly what particles are, and then we’d only slowly be able to build up something we could consider a realistic “double slit”.

But one of the many surprises has been that quantum phenomena seem much more robust than I expected—and it seems possible to reproduce their essential features without putting all the details in. So maybe we’ll be able, for example, just to look at a multiway system generated by a string substitution system, and already be able to see something like interference fringes in an idealized double-slit experiment. We’ll see.
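
For reference, here's the kind of idealized object being described: a multiway system generated by a simple string substitution system. The particular rule used here is arbitrary; it just shows the type of structure in which one might go looking for interference-like effects.

ResourceFunction["MultiwaySystem"][{"A" -> "AB", "B" -> "A"}, {"AA"}, 5, "StatesGraph"]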

When we’re talking about quantum mechanics, many important practical phenomena arise from looking at bound states where for example some particle is restricted to a limited region (like an electron in a hydrogen atom), and we’re interested in various time-repeating eigenstates. My first instinct—as in the case of the double-slit experiment—was to think that studying bound states in our models would be very complicated. After all, at some level, bound states are a limiting idealization, and even in quantum field theory (or with quantum mechanics formulated in terms of path integrals) they’re already a complicated concept.

But actually, it seems as if it may be possible to capture the essence of what’s going on in bound states with even very simple toy examples in our models—in which for instance there are just cycles in the multiway graph. But we need to see just how this works, and how far we can get, say in reproducing the features of the harmonic oscillator in quantum mechanics.
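
As a toy illustration of what "cycles in the multiway graph" means, here's an arbitrarily chosen string substitution system whose states graph contains cycles—the kind of structure suggested above as an idealization of a bound state:

ResourceFunction["MultiwaySystem"][{"AB" -> "BA", "BA" -> "AB"}, {"AAB"}, 4, "StatesGraph"]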

In traditional treatments of quantum mechanics, the harmonic oscillator is the kind of thing one starts with. But in our models its properties have to be emergent, and it’ll be interesting to see just how “close to the foundations”, and how generic, the derivation of those properties turns out to be.

People Do Care about Thermodynamics

Understanding the Second Law of thermodynamics was one of the things that first got me interested in fundamental physics, nearly 50 years ago. And I was very pleased that by the 1990s I thought I finally understood how the Second Law works: basically it’s a consequence of computational irreducibility, and the fact that even if the underlying rules for a system are reversible, they can still so “encrypt” information about the initial conditions that no computationally limited observer can expect to recover it.

This phenomenon is ultimately crucial to the derivation of continuum behavior in our models—both for spacetime and for quantum mechanics. (It’s also critical to my old derivation of fluid behavior from idealized discrete underlying molecules.)

The Second Law was big news at the end of the 1800s and into the early 1900s. But I have to say that I thought people had (unfortunately) by now rather lost interest in it, and it had just become one of those things that everyone implicitly assumes is true, even though if pressed they’re not quite sure why. So in the last couple of weeks I’ve been surprised to see so many people asking us whether we’ve managed to understand the Second Law.

Well, the answer is “Yes!”. And in a sense the understanding is at an even more fundamental level than our models: it’s generic to the whole idea of computational models that follow the Principle of Computational Equivalence and exhibit computational irreducibility. Or, put another way, once everything is considered to be computational, including both systems and observers, the Second Law is basically inevitable.

But just where are its limits, and what are the precise mathematical conditions for its validity? And how, for example, does it relate in detail to gravity? (Presumably the reference frames that can be set up are limited by the computational capabilities of observers, which must be compared to the computations being done in the actual evolution of spacetime.) These are things I’ve long wanted to clarify, and I’m hoping we’ll look at these things soon.

What about Peer Review and All That?

There’s a lot in our Physics Project. New ideas. New methods. New conclusions. And it’s not easy to deliver such a thing to the world. We’ve worked hard the last few months to write the best expositions we can, and to make software tools that let anyone reproduce—and extend—everything we’ve done. But the fact remains that to seriously absorb what we just put into the world is going to take significant effort.

It isn’t the way science usually works. Most of the time, progress is slow, with new results trickling out, and consensus about them gradually forming. And in fact—until a few months ago—that’s exactly how I expected things would go with our Physics Project. But—as I explained in my announcement—that’s not how it worked out. Because, to my great surprise, once we started seriously working on the ideas I originally hatched 30 years ago we suddenly discovered that we could make dramatic progress.

And even though we were keen to open the project up, the things we discovered—together with their background ideas and methods—are a lot to explain, and, written down, fill well over 800 pages.

But how does that fit into the normal, academic way of doing science? It’s not a great fit. When we launched the project two weeks ago, I sent mail to a number of people. A historian of science I’ve known for a long time responded:

Please remember as you go forward that, many protestations to the contrary, most scientists hate originality, which feels strange, uncomfortable, and baffling. They like novelty well within the boundaries of what they’re doing and the approach that they’re taking, but originality is harder for them to grasp. Therefore expect opposition based on incomprehension rather than reasoned disagreement. Hold fast.

My knowledge of history, and my own past experiences, tell me that there’s a lot of truth to this. Although I’m happy to say that in the case of our project it seems like there are actually a very good number of scientists who are enthusiastically making the effort to understand what we’ve done.

Of course, there are people who think “This isn’t the way science usually works; something must be wrong”. And the biggest focus seems to be around “What about peer review?”. Well, that’s an interesting question.

What’s ultimately the point of peer review? Basically it’s that people want external certification that something is correct—before they go to the effort of understanding it themselves, or start building on it. And that’s a reasonable thing to want. But how should it actually work?

When I used to publish academic papers in the 1970s and early 1980s I quickly discovered something disappointing about actual peer review—that closely mirrors what my historian-of-science friend said. If a paper of mine was novel though not particularly original, it sailed right through peer review. But if it was actually original (and those are the papers that have had the most impact in the end) it essentially always ran into trouble with peer review.

I think there’s also always been skullduggery with anonymous peer review—often beyond my simplified “natural selection” model: “If paper cites reviewer, accept; otherwise reject”. But particularly for people outside of science, it’s convenient to at least imagine that there’s some perfect standard of academic validity out there.

I haven’t published an ordinary academic paper since 1986, but I was rather excited two weeks ago to upload my first-ever paper to arXiv. I was surprised it took about a week to get posted, and I was thinking it might have run into some filter that blocks any paper about a fundamental theory of physics—on the “Bayesian” grounds that there’s never been a meaningful paper with such a claim during the time arXiv has been operating. But my friend Paul Ginsparg (founder of arXiv) tells me there’s nothing like that in place; it’s just a question of deciding on categories and handling hundreds of megabytes of data.

OK, but is there a good way to achieve the objectives of peer review for our project? I was hoping I could submit my paper to some academic journal and then leave it to the journal to just “run the peer-review process”. But on its own, that doesn’t seem like it could work. And in particular, it’s hard to imagine that, in the normal course of things, serious traditional peer review of a 450-page document like this could get done in less than several years.

So over the past week we’ve been thinking about additional, faster things we can do (and, yes, we’ve also been talking to people to get “peer reviews” of possible peer-review processes, and even going to another meta level). Here’s what we’ve come up with. It’s based on the increasingly popular concept of “post-publication peer review”. The idea is to have an open process, where people comment on our papers, and all relevant comments and comments-on-comments, etc. are openly available on the web. We’re trying—albeit imperfectly—to get the best aspects of peer review, and to do it as quickly as possible.

Among other things, what we’re hoping is that people will say what they can “certify” and what they cannot: “I understand this, but don’t have anything to say about that”. We’re fully expecting people will sometimes say “I don’t understand this” or “I don’t think that is correct”. Then it’s up to us to answer, and hopefully before long consensus will be reached. No doubt people will point out errors and limitations (including “you should also refer to so-and-so”)—and we look forward to using this input to make everything as good as possible. (Thanks, by the way, to those who’ve already pointed out typos and other mistakes; much appreciated, and hopefully now all fixed.)

One challenge about open post-publication peer review is who will review the reviewers. Here’s what we’ve set up. First, every reviewer gives information about themselves, and we validate that the person posting is who they say they are. Then we ask the reviewer to fill out certain computable facts about themselves. (Academic affiliation? PhD in physics? Something else? Professor? Published on arXiv? ISI highly cited author? Etc.) Then when people look at the reviews, they can filter by these computable facts, essentially deciding for themselves how they want to “review the reviewers”.
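
To give a sense of what "filtering by computable facts" could look like, here's a hypothetical sketch; the field names and sample entries are invented purely for illustration, and this is not the actual system:

reviews = {
   <|"Reviewer" -> "A. Example", "PhDInPhysics" -> True, "PublishedOnArXiv" -> True, "Comment" -> "..."|>,
   <|"Reviewer" -> "B. Example", "PhDInPhysics" -> False, "PublishedOnArXiv" -> True, "Comment" -> "..."|>};
(* readers pick their own criteria for "reviewing the reviewers" *)
Dataset[reviews][Select[#PhDInPhysics && #PublishedOnArXiv &]]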

I’m optimistic that this will work well, and will perhaps provide a model for review processes for other things. And as I write this, I can’t help noticing that it’s rather closely related to work we’ve done on validating facts for computable contracts, as well as to the ideas that came up in my testimony last summer for the US Senate about “ranking providers” for automated content selection on the internet.

Submit a Peer Review

Other Things

The Project’s Twitter

There’s a lot going on with the Wolfram Physics Project, and we’re expecting much more, particularly as an increasing number of other people get involved. I’m hoping I’ll be able to write “progress reports” like this one from time to time, but we’re planning on consistently using the new Twitter feed for the project to give specific, timely updates:

@wolframphysics

Please follow us! And send us updates about what you’re doing in connection with the Wolfram Physics Project, so we can post about it.

Wolfram Physics on Twitter

Can We Explain the Project to Kids?

The way I know I really understand something is when I can explain it absolutely from the ground up. So one of the things I was pleased to do a week or so ago was to try to explain our fundamental theory of physics on a livestream aimed at kids, assuming essentially no prior knowledge.

How does one explain discrete space? I decided to start by talking about pixels on a screen. How about networks? Who’s friends with whom. Dimension? Look at 2×2×2… grid graphs. Etc. I thought I managed to get decently far, talking about general relativity, and even quantum mechanics, all, I hope, without relying on more than extremely everyday knowledge.
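
For reference, here's the kind of picture being described: grid graphs of increasing size, which give an elementary way to talk about three-dimensional space.

Table[GridGraph[{n, n, n}], {n, 2, 4}]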

And particularly since my livestream seemed to get good reviews from both kids and others, I’m planning in the next week or two to put together a written version of this as a kind of “very elementary” introduction to our project.

Length scales

Project Q&A

Thousands of people have been asking us questions about our project. But fortunately, many of the questions have been the same. And over the last couple of weeks we’ve been progressively expanding the Q&A section of the project website to try to address the most common of the questions:

Wolfram Physics Q&A

Visual Gallery

In addition to being (we hope) very interesting from a scientific point of view, our models also produce interesting visual forms. And we’ve started to assemble a “Visual Gallery” of these forms.

They can be screen backgrounds, or Zoom backgrounds. Or they can be turned into stickers or T-shirt designs (or put on mouse pads, if people other than me still use those).

Wolfram Physics Visual Gallery

We’ll be adding lots more items to the Visual Gallery. But it won’t just be pictures. We’ll also be adding 3D geometry for rendering of graphs and hypergraphs.

In principle, this 3D geometry should let one immediately 3D print “universes”. But so far we’ve had difficulty doing this. It seems as if unless we thicken up the connections to the point where they merge into each other, it’s not possible to get enough structural integrity to successfully make a 3D printout with existing technologies. But there’s undoubtedly a solution to this, and we’re hoping someone will figure it out, say using our Wolfram Language computational geometry capabilities.

VR

It’s pretty difficult (at least for me) to “understand” the structure of the graphs and hypergraphs we’re generating. And ever since I started thinking about network models for physics in the 1990s, I’ve wanted to try to use VR to do this. Well, we’re just starting to have a system that lets one interactively manipulate graphs in 3D in VR. We’ll be posting the code soon, and we hope other people will help add features. But it’s getting closer…

Wolfram Physics in VR

It’s an Exciting Time…

This piece is already quite long, but there’s much more I could say. It’s very exciting to be seeing all this activity around our Physics Project, after only two weeks.

There’s a lot to do in the project, and with the project. This is a time of great opportunity, where all sorts of discoveries are ripe to be made. And I’m certainly enjoying trying to figure out more with our models—and trying to understand all sorts of things I’ve wondered about for nearly half a century. But for me it’s been particularly wonderful to see so many other people engaging with the project. I personally think physics is great. And I really love the elegance of what’s emerging from our models. But right now what’s most important to me is what a tremendous pleasure it is to share all this with such a broad spectrum of people.

I’m looking forward to seeing what the next few weeks bring. We’re off to a really great start…

Event Horizons, Singularities and Other Exotic Spacetime Phenomena


Wolfram Physics Bulletin

Informal updates and commentary on progress in the Wolfram Physics Project

The Structure and Pathologies of Spacetime

In our models, space emerges as the large-scale limit of our spatial hypergraph, while spacetime effectively emerges as the large-scale limit of the causal graph that represents causal relationships between updating events in the spatial hypergraph. An important result is that (subject to various assumptions) there is a continuum limit in which the emergent spacetime follows Einstein’s equations from general relativity.

And given this, it is natural to ask what happens in our models with some of the notable phenomena from general relativity, such as black holes, event horizons and spacetime singularities. I already discussed this to some extent in my technical introduction to our models. My purpose here is to go further, both in more completely understanding the correspondence with general relativity, and in seeing what additional or different phenomena arise in our models.

It should be said at the outset that even after more than 100 years the whole issue of what weird features of spacetime Einstein’s equations can imply remains a rather confusing subject, one that is still very far from being completely understood. And in some ways I think our models may help clarify what’s going on. Yes, there’s complexity in taking large-scale limits in our models. But unlike the Einstein equations, which effectively just tell one to “find a spacetime that satisfies certain constraints defined by the equations”, our models give a direct computational way to determine the structure that is supposed to limit to spacetime.

In what follows, we’ll see that our models can reproduce known implications of Einstein’s equations such as black holes, event horizons and spacetime singularities. But they also suggest the possibility of other, in some ways yet more exotic, spacetime phenomena. Some of these may in fact be things that could be found in the Einstein equations; others require descriptions of spacetime more general than the mathematical structure of general relativity can provide (notably in connection with dimension and topology change).

There’s long been a view that at small enough length scales the Einstein equations will have to be supplemented by some lower-level description of spacetime, presumably connected to quantum mechanics. Our models immediately provide a lower-level representation for spacetime in terms of the discrete structure of the spatial hypergraph and the causal graph—with familiar, continuum spacetime emerging as a large-scale limit, much like continuum fluid behavior emerges as a large-scale limit of molecular dynamics.

And there is already a lot that can be said about the structure and behavior of spacetime in our models just on the basis of the spatial hypergraph and the causal graph. But the models also have the feature that they inexorably involve quantum mechanics as a result of the multiway systems obtained by following different possible sequences of updating events. And this means that we can expect to see how various extreme features of spacetime play out at a quantum level, in particular through the multiway causal graph, which includes not only connections associated with ordinary spacetime, but also with branchtime and the branchial space of quantum entanglements.

I won’t go far in this particular bulletin into the exploration of things like the quantum mechanics of black holes, but we’ll get at least an idea of how these kinds of things can be investigated in our models. In addition to seeing the correspondence with what are now standard questions in mathematical physics, we’ll begin to see some indications of more exotic ideas, like the possibility that in our models particles like electrons may correspond to a kind of generalization of black holes.

What General Relativity Says

Perhaps the most famous prediction of general relativity is the existence of black holes. At a mathematical level, a black hole is characterized by the presence of an event horizon that prevents events occurring inside it from affecting ones outside (though events occurring outside can affect what’s inside).

So how does this actually occur in general relativity? Let’s talk about the simplest nontrivial case: the Schwarzschild solution to Einstein’s equations. Physically, the Schwarzschild solution gives the gravitational field outside a spherically symmetric mass. But there’s a wild thing that happens if the mass is high enough and localized enough: one gets a black hole—with an event horizon. The physical story one can tell is that the escape velocity is larger than the speed of light, so nothing can escape. The mathematical story is more subtle and complicated, and took many years to untangle. But in the end there’s a clear description of the event horizon in terms of the causal structure of spacetime: of what events can causally affect what other ones.

But what about the actual Schwarzschild solution? An important simplifying feature is that it’s a solution to Einstein’s equations in the vacuum, not in any place where there’s actually mass present. Yes, there’s mass that produces the gravitational field. But the Schwarzschild solution just describes the form of the gravitational field outside of the mass. What about in the black hole case?

In a physical black hole created by a collapsing star, there are presumably remnants of the star inside the event horizon. But that’s not what the Schwarzschild solution describes: it just says there’s vacuum everywhere, with one exception—that at the very center of the black hole there’s some kind of singularity. What can one say about this singularity? The equations imply that right at the singularity the curvature of spacetime is infinite. But there’s also something else. Once any geodesic (corresponding, for example, to the world line of a photon) goes inside the event horizon, it’ll only continue for a limited time, always in effect ending up “at the singularity”. There’s no necessary connection between this kind of geodesic incompleteness and infinite curvature; but in a Schwarzschild black hole both occur.

What is really “at” the center of the Schwarzschild black hole? Is it a place where Einstein’s equations don’t apply? Is it even part of the manifold that corresponds to spacetime? The mathematical structure of general relativity says one starts by setting up a manifold, then one puts a metric on it which satisfies Einstein’s equations. There’s no way the dynamics of the system can change the fundamental structure of the manifold. Yes, its curvature can change, but its dimension cannot, and nor can its topology. So if one wants to say that the point at the center of a Schwarzschild black hole “isn’t part of the manifold”, that’s something one has to put in from the beginning, from outside the Einstein equations.

There’s a certain amount that has been done to classify the kinds of singularities that can occur in general relativity. The singularity in a Schwarzschild black hole is a so-called spacelike one—that essentially “cuts off time” for any geodesic, anywhere in space inside the event horizon.

Discovered nearly 50 years after the Schwarzschild solution, the Kerr solution represents a rotating black hole. Its structure is much wilder than that of the Schwarzschild solution. A notable feature is the presence of a timelike singularity—in which time can progress, but space is effectively cut off. (Later, we’ll see that spacelike and timelike singularities have quite simple interpretations in our models.) If the angular momentum is below a critical value, there’s an event horizon (or, actually, two) in a Kerr black hole. But above the critical value, there’s no longer an event horizon, and the singularity is “naked”, and potentially able to affect the rest of the universe.

Like the Schwarzschild solution, the Kerr solution is a mathematical construct that defines a static configuration that a gravitational field can have according to Einstein’s equations when no matter is present (and when the underlying manifold has holes cut out for singularities). There has been a long-running debate about what features of such solutions would survive in more realistic situations, such as matter collapsing to form a black hole.

An early conjecture was the weak cosmic censorship hypothesis, which stated that in any realistic situation any singularity must be “hidden” behind an event horizon. This meant, for example, that Kerr-like black holes formed in practice must have angular momenta below the critical value. Numerical simulations did seem to support the idea that supercritical Kerr-like black holes couldn’t form, but a class of constructions was nevertheless found that could, in a certain limit, generate at least infinitesimal naked singularities.

Ordinary general relativity is firmly rooted in (3+1)-dimensional spacetime. But it can be generalized to higher dimensions. And in this case one finds new and more complex black-hole-like phenomena—and it seems to be easier, at least theoretically, for naked singularities to arise.

There are lots of other results from general relativity. One example is Penrose’s singularity theorem, which (with various assumptions) implies that (so long as there is nothing with negative mass AKA gravitational repulsion) as soon as geodesics are trapped within a region (e.g. by an event horizon) they must eventually all converge, and thus form a singularity. (The Hawking singularity theorem is basically a time-reversed version, which implies that there must be a singularity at the beginning of the universe.)

Another idea from general relativity is the “no-hair theorem” (or conjecture), which states that, at least from the outside, a limited number of parameters (such as mass and angular momentum) completely characterize any black hole—at least if it’s static. In other words, black holes are in a sense “perfectly smooth” objects; there aren’t gravitational effects outside the event horizon that reveal “inner complexity”. (Note that this is a classical conjecture; it’s quite likely not to be true when quantum effects are included.)

The Einstein equations are nonlinear partial differential equations, and it’s interesting to compare them with other such equations, such as the Navier–Stokes equations for fluid flow. One feature of the Navier–Stokes equations is that for sufficiently high fluid velocities (larger than the speed of sound) shocks develop that involve discontinuities that cannot be directly described by the equations. And particularly when one gets to hypersonic flow what physically happens is that the approximation that the fluid is continuous—and can be modeled by a partial differential equation—breaks down, and the specific dynamics of the molecules in the fluid start to matter.

Does something similar happen in the Einstein equations? It’s not known whether there can be shocks per se. But it seems likely that to understand things like what appear to be singularities one will have to go “underneath” the Einstein equations—as our models do.

What else can one learn from the fluid dynamics analogy? A notable feature of fluid flow is the phenomenon of fluid turbulence, in which random patterns of flow are ubiquitous. It’s still not clear to what extent turbulence is a true feature of the Navier–Stokes equations, and to what extent it’s associated with sensitive dependence on molecular-scale details. I strongly suspect, though, that like in rule 30 (or in the digits of π) the primary effect is an intrinsic generation of randomness, associated with the phenomenon of computational irreducibility. I’d be amazed if something similar doesn’t happen in Einstein’s equations, but it’ll no doubt be mathematically difficult to establish. Plenty of “randomness” is seen in numerical simulations, but absent a precise underlying computational model it’s very difficult to know if the “randomness” is a true feature of the equations that one is trying to approximate, or is just a feature of the approximation scheme used.
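
For reference, here's the classic example of intrinsic randomness generation mentioned above—the rule 30 cellular automaton—which produces apparently random behavior from a completely deterministic rule and a trivial initial condition:

ArrayPlot[CellularAutomaton[30, {{1}, 0}, 200]]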

In our models, computational irreducibility and intrinsic randomness generation are ubiquitous. But exactly how such “microscopic” randomness will scale up to “gravitational turbulence” isn’t clear.

One further point, applicable to both fluid flow and gravitation, has to do with computational capability. The Principle of Computational Equivalence strongly suggests that in both cases, sophisticated computation will be ubiquitous. One consequence of this will be computation universality. And in fact it seems likely that this will already be manifest even in such simple cases as the interactions of a few fluid vortices, or just three point masses. The result is that highly complex behavior will be possible.

But that doesn’t mean that—for example in the spacetime case—one can’t summarize certain features of the behavior in simple terms, say by describing overall causal structure, or identifying the presence of an event horizon. However, the presence of underlying computational sophistication does mean that there may be computational limitations in coming up with such summaries. For example, to determine whether a certain system has a certain global causal structure valid for all time may in effect require determining the infinite-time behavior of an irreducible computation, which cannot in general be done by a finite computation, and so must be considered formally undecidable.

Identifying Causal Structure

An event horizon is a feature of the global causal structure of a spacetime—a boundary where events “inside” it can’t affect events (or “observers”) “outside” it. In traditional general relativity, it can be mathematically complicated to establish where there’s an event horizon; it effectively requires proving a theorem about how all possible geodesics will be trapped. (In numerical relativity, one can try to just identify “apparent horizons” where geodesics at least seem to be going in directions where they’ll be trapped, even if one can’t be sure what will happen later.)

But in our models, everything is considerably more explicit. We’re dealing with discrete updating events, occurring at discrete points. And at least in principle we can construct causal graphs that fully describe the causal relationships between all possible events that can occur in spacetime.

Here’s an example of such a causal graph:

ResourceFunction[
  "WolframModel"][{{1, 2}, {2, 3}} -> {{2, 3}, {3, 1}, {4, 
    1}}, Automatic, 11, "LayeredCausalGraph"]

Remember that the only thing that’s ultimately defined by this causal graph is the causal dependency of events—or in effect the partial ordering of events. Time emerges as an overall inexorable progression of events. Space is associated with the relationship of events in slices defined by foliations of the causal graph.

Any given event can affect any event which it can reach by following directed edges in the causal graph, or, in the language of physics, any event which is in its future light cone. But what happens in the infinite limit of the future light cone of a given event? It could be that that infinite limit effectively covers the whole (“spatial”) universe. Or it could be that it only reaches part of the universe. And in that latter case we can expect that there’ll be an event horizon: a boundary to where the effects of our event can reach.

But how can we map which events have how large an effect—or in a sense what the causal structure of spacetime is? Let’s define what we can call a “causal connection graph” that shows the relationships between the future light cones of events. Given two events, it could be that the limits of their future light cones are exactly the same (both covering the whole universe, or both covering the same part of it). In that case, let’s consider these events “causally equivalent”.

In the causal connection graph, the nodes are collections of causally equivalent events. The edges in the graph are determined by the relationships between the future light cones of these collections of events. Consider two collections A and B. If the limit of the future light cone of A contains the limit of the future light cone of B, then we draw a directed edge in the graph from A to B. And if the limit of the future light cone of A intersects the limit of the future light cone of B (but doesn’t contain it), we draw an undirected edge.

There’ll be a causal connection graph defined for events on any spacelike hypersurface that can be specified as a slice through the causal graph. To find the ultimate causal connection graph, we’d then have to look at the infinite limits of future light cones from events in this spacelike hypersurface. But in practice, we can at least try to approximate this by looking not at the infinite limit of the future light cones, but instead at their limit on some spacelike hypersurface “far enough” in the future. (By the way, this is similar to issues that arise in thinking about “absolute” vs. “apparent” horizons in general relativity.)
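
Here's a minimal sketch of this construction (it is not the implementation of the ResourceFunction used below): given a causal graph, it approximates future light cones by truncating them at a finite depth, groups events with identical approximate light cones, and then adds directed edges for containment and undirected edges for mere overlap. The function name approximateCausalConnectionGraph, and the truncation depth, are just choices made for illustration.

approximateCausalConnectionGraph[g_Graph, events_List, depth_Integer] :=
 Module[{cones, reps, edges},
  (* approximate the future light cone of each event by a depth-limited out-component *)
  cones = AssociationMap[Sort[VertexOutComponent[g, #, depth]] &, events];
  (* keep one representative per class of "causally equivalent" events *)
  reps = First /@ GatherBy[events, cones];
  (* containment of light cones gives a directed edge; mere overlap an undirected one *)
  edges = Catenate[Function[{a, b}, Which[
        SubsetQ[cones[a], cones[b]], {DirectedEdge[a, b]},
        SubsetQ[cones[b], cones[a]], {DirectedEdge[b, a]},
        IntersectingQ[cones[a], cones[b]], {UndirectedEdge[a, b]},
        True, {}]] @@@ Subsets[reps, {2}]];
  Graph[reps, edges, VertexLabels -> Automatic]]

Because the light cones are truncated at finite depth, a sketch like this can only ever identify something like the "apparent" horizons mentioned above; only the infinite limit would give the true causal connection graph.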

OK, so let’s look at the causal graph we had above, and consider four events starting at the top:

HighlightGraph[
 ResourceFunction[
   "WolframModel"][{{1, 2}, {2, 3}} -> {{2, 3}, {3, 1}, {4, 1}}, 
  Automatic, 9, "LayeredCausalGraph"], Range[4], 
 VertexLabels -> Automatic]

The future light cones for events 1 and 2 contain the whole causal graph. But the future light cones for events 3 and 4 are respectively:

With[{gg = 
     Graph[ResourceFunction[
        "WolframModel"][{{1, 2}, {2, 3}} -> {{2, 3}, {3, 1}, {4, 1}}, 
       Automatic, 9, "LayeredCausalGraph"]]}, 
   HighlightGraph[
    gg, {Style[Subgraph[gg, VertexOutComponent[gg, #, 10]], Red, 
      Thick]}, ImageSize -> 300]] & /@ {3, 4}

These future light cones are distinct. So if we construct the causal connection graph for the spacelike hypersurface containing events 3 and 4, it’ll just consist of two separated nodes. In effect the universe here has broken into two non-communicating subuniverses. In the language of general relativity, we can say that there’s a cosmological event horizon that separates these two subuniverses, and no information can go in either direction between them.

The signature of this in the causal connection graph is quite straightforward. Looking at the first few levels in the causal graph, one has:

Graph[ResourceFunction[
   "WolframModel"][{{1, 2}, {2, 3}} -> {{2, 3}, {3, 1}, {4, 1}}, 
  Automatic, 6, "LayeredCausalGraph"], VertexLabels -> Automatic]

Computing the causal connection graph for successive levels, one gets:

Table[Framed[
  Graph[ResourceFunction["CausalConnectionGraph"][
    ResourceFunction[
       "WolframModel"][{{1, 2}, {2, 3}} -> {{2, 3}, {3, 1}, {4, 1}}, 
      Automatic, #, "LayeredCausalGraph"] &, t0, 12], 
   VertexSize -> .4, VertexLabels -> Automatic, 
   ImageSize -> {150, 30}], FrameStyle -> LightGray], {t0, 1, 5}]

At the first and second levels, no “event horizon” has yet formed, and there’s just a single set of causally equivalent events. But at level 3, there start to be two separate sets of causally equivalent events; an “event horizon” has formed.

So what does the spatial hypergraph “underneath” these look like? Here’s how it evolves (using our standard updating order) for the first few steps:

ResourceFunction["WolframModelPlot"][#, ImageSize -> Tiny] & /@ 
 ResourceFunction[
   "WolframModel"][{{1, 2}, {2, 3}} -> {{2, 3}, {3, 1}, {4, 1}}, 
  Automatic, 10, "StatesList"]

It’s very simple compared to the spatial hypergraph for any “real universe”. But it’s still useful as an example for understanding ideas about causal structure. And what we see is that in this tiny “toy universe”, two “lobes” develop, which evolve like two separate subuniverses, with no causal interdependence.

But while there’s no causal interdependence between the “lobes”, the spatial hypergraph is still connected. In our models, however, there’s nothing to say that the spatial hypergraph can’t actually become disconnected—and here’s an example where it does:

ResourceFunction[
  "WolframModel"][{{1, 2}, {2, 3}} -> {{4, 1}, {4, 2}, {1, 
    2}}, Automatic, 12, "LayeredCausalGraph"]
Framed[ResourceFunction["WolframModelPlot"][#, ImageSize -> Tiny], 
   FrameStyle -> LightGray] & /@ 
 ResourceFunction[
   "WolframModel"][{{1, 2}, {2, 3}} -> {{4, 1}, {4, 2}, {1, 2}}, 
  Automatic, 10, "StatesList"]

What’s the significance of this? If the spatial hypergraph stays connected, then even if two regions are at some point causally disconnected, it still remains possible for them to reconnect (in effect revealing that one did not have a true event horizon). But if the hypergraph is actually disconnected, no such reconnection is ever possible (unless one has a fundamentally nonlocal rule whose left-hand side is itself a disconnected hypergraph, so that it can effectively pick up elements “anywhere in any universe”, regardless of whether there are any existing relations between the elements).
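
As a quick way to check for actual disconnection, here's a small sketch that counts the connected components of the spatial hypergraph at each step, treating each hyperedge as a clique of ordinary edges (the helper name is just for illustration):

hypergraphComponentCount[hyperedges_List] :=
 Length[ConnectedComponents[
   Graph[Catenate[UndirectedEdge @@@ Subsets[#, {2}] & /@ hyperedges]]]]

hypergraphComponentCount /@
 ResourceFunction["WolframModel"][{{1, 2}, {2, 3}} -> {{4, 1}, {4, 2}, {1, 2}},
  Automatic, 10, "StatesList"]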

In traditional general relativity, it is certainly possible to have multiple universes, each with their own disconnected manifold representing space. But while the Einstein equations support the formation of event horizons and causal disconnection, they do not support “forming a new universe” with its own topologically distinct disconnected manifold. To get that requires something like our model, with its much greater flexibility in the fundamental description of space. (In the usual mathematical analysis of general relativity, one defines up front the manifold on which one’s operating, though it’s perhaps conceivable that at least in some limit a metric could develop that’s compatible only with a different topology.)

Note that at the level of the causal connection graph, one just sees progressive formation of subuniverses, with cosmological event horizons; there’s no direct trace of the disconnection of the spatial hypergraph:

Table[Framed[
  Graph[ResourceFunction["CausalConnectionGraph"][
    ResourceFunction[
       "WolframModel"][{{1, 2}, {2, 3}} -> {{4, 1}, {4, 2}, {1, 2}}, 
      Automatic, #, "LayeredCausalGraph"] &, t0, 12], 
   VertexSize -> .4, ImageSize -> {150, 30}], 
  FrameStyle -> LightGray], {t0, 1, 7}]

Black-Hole-Like Event Horizons

The type of causal disconnection that we’ve discussed so far is bidirectional: information can’t propagate in either direction across the event horizon. But for black holes, the story is different, basically with effects able to flow into the black hole, but not out.

Here’s a very simple example of something like this in our models:

ResourceFunction[
  "WolframModel"][{{1, 2}, {2, 3}} -> {{4, 1}, {4, 3}, {1, 
    2}}, Automatic, 12, "LayeredCausalGraph"]

In the “universe at large”, represented rather trivially here by the outermost vertical edges, information can propagate back and forth. But if it goes into the “black hole” at the center, it never comes out again.

Here’s the rule that’s being used

RulePlot[ResourceFunction[
   "WolframModel"][{{1, 2}, {2, 3}} -> {{4, 1}, {4, 3}, {1, 2}}]]

and here’s what’s going on in the spatial hypergraph:

ResourceFunction["WolframModelPlot"][#, ImageSize -> Tiny] & /@ 
 ResourceFunction[
   "WolframModel"][{{1, 2}, {2, 3}} -> {{4, 1}, {4, 3}, {1, 2}}, 
  Automatic, 10, "StatesList"]

It’s a bit easier to see what’s happening if we look at every single event in the spatial hypergraph:

Show[#, ImageSize -> {60, 30}] & /@ 
 ResourceFunction[
   "WolframModel"][{{1, 2}, {2, 3}} -> {{4, 1}, {4, 3}, {1, 2}}, 
  Automatic, 14, "EventsStatesPlotsList"]

And basically what we see is that events propagate causal effects in both directions on the “ring” that represents the “universe at large”, but any causal effect on the “prong” only goes in one direction, making it correspond to a very simple analog of a black hole.

It is important to note that the black hole “doesn’t form” immediately. The “universe” has to “expand” for a couple of steps before one can distinguish a “black hole” event horizon.

Given the causal graph

ResourceFunction[
   "WolframModel"][{{1, 2}, {2, 3}} -> {{4, 1}, {4, 3}, {1, 2}}, 
  Automatic, 7]["LayeredCausalGraph", 
 VertexLabels -> Placed[Automatic, {After, Above}]]

one can construct a causal connection graph. Looking at this on successive steps one gets:

Table[Framed[Graph[ResourceFunction["CausalConnectionGraph"]
    [ResourceFunction[
       "WolframModel"][{{1, 2}, {2, 3}} -> {{4, 1}, {4, 3}, {1, 2}}, 
      Automatic, #, "LayeredCausalGraph"] &, t0, 12], 
   VertexSize -> .1, VertexLabels -> Automatic, 
   ImageSize -> {150, 30}], FrameStyle -> LightGray], {t0, 1, 7}]

For the first three steps, there’s only one set of causally equivalent events. But by step 4, a second set forms. And there’s now a one-way causal connection between the first set of causally equivalent events and the second: in other words, a “black-hole-like” event horizon has formed.

Let’s look at a slightly more complicated case, though still with a very simple rule:

RulePlot[ResourceFunction[
   "WolframModel"][{{1, 2}, {1, 3}} -> {{1, 2}, {2, 4}, {3, 4}}]]

The causal graph in this case is:

ResourceFunction[
  "WolframModel"][{{1, 2}, {1, 3}} -> {{1, 2}, {2, 4}, {3, 
    4}}, Automatic, 25, "LayeredCausalGraph"]

Looking at the causal connection graph for successive steps, we see that after a little while a “black hole event horizon” forms:

Table[Framed[
  Graph[ResourceFunction["CausalConnectionGraph"][
    ResourceFunction[
       "WolframModel"][{{1, 2}, {1, 3}} -> {{1, 2}, {2, 4}, {3, 4}}, 
      Automatic, #, "LayeredCausalGraph"] &, t0, 20], 
   ImageSize -> {120, 30}, VertexSize -> .2], 
  FrameStyle -> LightGray], {t0, 2, 12}]

If we look at forward light cones of different events, we see that some “fill the whole universe”, but others stay localized to the region on the left. One can think of the first set as being the light cones of events that are in the “universe at large”; the second set are the light cones of events that are “trapped inside a black hole”:

With[{gg = 
     Graph[ResourceFunction[
        "WolframModel"][{{1, 2}, {1, 3}} -> {{1, 2}, {2, 4}, {3, 4}}, 
       Automatic, 24, "LayeredCausalGraph"]]}, 
   HighlightGraph[
    gg, {Style[Subgraph[gg, VertexOutComponent[gg, #, 20]], Red, 
      Thick]}, ImageSize -> 300]] & /@ {12, 13}

Events trapped inside the black hole cannot affect events outside. But events outside can affect events inside, with events on the boundary in effect representing the process of things falling into the black hole. Note that since energy (and mass) are measured by the flux of causal edges through spacelike hypersurfaces, the black hole shown here effectively gains mass through events that deliver things into the black hole. (Our complete universe is also expanding, so there’s no overall energy conservation to be seen here.)
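
Here's a rough sketch of what "counting the flux into the black hole" could look like for this example. It classifies an event as "trapped" if its truncated future light cone stays small, then counts causal edges going from untrapped to trapped events. The truncation depth and the size threshold are arbitrary choices made for illustration, not part of the actual definition.

g = ResourceFunction["WolframModel"][{{1, 2}, {1, 3}} -> {{1, 2}, {2, 4}, {3, 4}},
   Automatic, 24, "LayeredCausalGraph"];
(* call an event "trapped" if its depth-20 future light cone stays small *)
trappedQ = AssociationMap[Length[VertexOutComponent[g, #, 20]] < 50 &, VertexList[g]];
(* causal edges crossing from the "universe at large" into the trapped region *)
Count[EdgeList[g], DirectedEdge[a_, b_] /; ! trappedQ[a] && trappedQ[b]]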

What does the underlying spatial hypergraph look like in this case? Once again, it’s quite simple:

ResourceFunction["WolframModelPlot"][#, ImageSize -> Tiny] & /@ 
 ResourceFunction[
   "WolframModel"][{{1, 2}, {1, 3}} -> {{1, 2}, {2, 4}, {3, 4}}, 
  Automatic, 14, "StatesList"]

So where is the black hole? We can number the events in the causal graph:

Graph[ResourceFunction[
   "WolframModel"][{{1, 2}, {1, 3}} -> {{1, 2}, {2, 4}, {3, 4}}, 
  Automatic, 13, "LayeredCausalGraph"], VertexLabels -> Automatic]

Then we can match these up with events in the spatial hypergraph:

Row[Flatten[
    Riffle[#, {Spacer[10], Text[Style[#, Small]], Spacer[10]} & /@ 
      Range[Length[#] - 1]]] &[
  Show[#, ImageSize -> {60, 30}] & /@ 
   ResourceFunction[
     "WolframModel"][{{1, 2}, {1, 3}} -> {{1, 2}, {2, 4}, {3, 4}}, 
    Automatic, 13, "EventsStatesPlotsList"]]]

The basic conclusion is that the “main circle” acts like a black hole, with the smaller “handle” being like the “rest of the universe”.

Here’s another, slightly less trivial example:

RulePlot[ResourceFunction[
   "WolframModel"][{{1, 2, 3}, {4, 5, 3}} -> {{2, 6, 4}, {6, 1, 
     2}, {4, 2, 1}}]]

The causal graph in this case is:

Graph[ResourceFunction[
   "WolframModel"][{{{1, 2, 3}, {4, 5, 3}} -> {{2, 6, 4}, {6, 1, 
      2}, {4, 2, 1}}}, Automatic, 20, "LayeredCausalGraph"], 
 AspectRatio -> 1/2]

Some initial events have light cones which cover the whole space; others always avoid the “spine” on the left, which behaves like a black hole:

With[{gg = 
     Graph[ResourceFunction[
        "WolframModel"][{{{1, 2, 3}, {4, 5, 3}} -> {{2, 6, 4}, {6, 1, 
           2}, {4, 2, 1}}}, Automatic, 18, "LayeredCausalGraph"]]}, 
   HighlightGraph[
    gg, {Style[Subgraph[gg, VertexOutComponent[gg, #, 10]], Red, 
      Thick]}, ImageSize -> 300, AspectRatio -> 1/2]] & /@ {8, 10}

Here’s the underlying spatial hypergraph:

ResourceFunction["WolframModelPlot"][#, ImageSize -> Tiny] & /@ 
 ResourceFunction[
   "WolframModel"][{{{1, 2, 3}, {4, 5, 3}} -> {{2, 6, 4}, {6, 1, 
      2}, {4, 2, 1}}}, Automatic, 12, "StatesList"]

Comparing the list of events

Row[Flatten[
    Riffle[#, {Spacer[10], Text[Style[#, Small]], Spacer[10]} & /@ 
      Range[Length[#] - 1]]] &[
  Show[#, ImageSize -> {60, 30}] & /@ 
   ResourceFunction[
     "WolframModel"][{{{1, 2, 3}, {4, 5, 3}} -> {{2, 6, 4}, {6, 1, 
        2}, {4, 2, 1}}}, Automatic, 8, "EventsStatesPlotsList"]]]

with the annotated causal graph

Graph[ResourceFunction[
   "WolframModel"][{{{1, 2, 3}, {4, 5, 3}} -> {{2, 6, 4}, {6, 1, 
      2}, {4, 2, 1}}}, Automatic, 9, "LayeredCausalGraph"], 
 AspectRatio -> 1/2, VertexLabels -> Automatic]

we see that again the “black hole” is associated with a definite part of the spatial hypergraph.

Spacelike and Timelike Singularities

Disconnection is one kind of “pathology” that can occur in our models. Another is termination: a configuration can be reached in which the rules no longer apply, and “time stops”. (In the language of term rewriting systems, one can say in such a case that a “normal form” has been reached.)

Here is an example of the spatial hypergraph for a rule in which this happens:

ResourceFunction["WolframModelPlot"][#, "MaxImageSize" -> 80] & /@ 
 ResourceFunction[
   "WolframModel"][{{x, x}, {y, x}} -> {{y, y}, {y, z}, {y, z}, {z, 
     x}, {w, z}}, Automatic, 20, "StatesList"]

The corresponding causal graph is:

ResourceFunction[
  "WolframModel"][{{x, x}, {y, x}} -> {{y, y}, {y, z}, {y, z}, {z, 
    x}, {w, z}}, Automatic, 15, "LayeredCausalGraph"]

And given this causal graph, we can see that the future light cone of any event winds up “compressing down” to just the single final event—after which “time stops”.
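
As a quick check on this in the finite causal graph we've generated, here's a small sketch: find the events with no successors, and see whether every event's future light cone reaches one of them. (In a truncated causal graph, events near the truncation boundary can also appear successor-free purely as an artifact, so this is only suggestive.)

g = ResourceFunction["WolframModel"][{{x, x}, {y, x}} -> {{y, y}, {y, z}, {y, z}, {z, x}, {w, z}},
   Automatic, 15, "LayeredCausalGraph"];
terminal = Pick[VertexList[g], VertexOutDegree[g], 0];
(* how many terminal events are there, and does every light cone reach one? *)
{Length[terminal], AllTrue[VertexList[g], IntersectingQ[VertexOutComponent[g, #], terminal] &]}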

Is there an analog of this in ordinary general relativity? Yes, it’s just a spacelike singularity—like what occurs in the Schwarzschild solution. In the Schwarzschild solution the spacelike singularity is in the future of events inside the event horizon; here it’s in the future of the “whole universe”. But it’s exactly the same idea. The future of anything that happens—or any geodesic—inevitably winds up in the “dead end” of the causal graph, at which “time stops”.

Here’s a slightly more complicated example that again involves termination, but now with two distinct “final events”:

ResourceFunction[
  "WolframModel"][{{x, y, z}, {x, u, v}} -> {{y, x, w}, {w, u, s}, {v,
     z, u}}, Automatic, 25, "LayeredCausalGraph"]

Like in a cosmological event horizon, different collections of events in effect lead to different branches of future light cones—though in this case whichever branch is followed, “time eventually stops”.

With[{gg = 
     Graph[ResourceFunction[
        "WolframModel"][{{x, y, z}, {x, u, v}} -> {{y, x, w}, {w, u, 
          s}, {v, z, u}}, Automatic, 20, "LayeredCausalGraph"]]}, 
   HighlightGraph[
    gg, {Style[Subgraph[gg, VertexOutComponent[gg, #, 10]], Red, 
      Thick]}, ImageSize -> 120]] & /@ {13, 16}

The causal connection graphs in this case basically just indicate the formation of a cosmological event horizon:

Table[Framed[
  Graph[ResourceFunction["CausalConnectionGraph"][
    ResourceFunction[
       "WolframModel"][{{x, y, z}, {x, u, v}} -> {{y, x, w}, {w, u, 
         s}, {v, z, u}}, Automatic, #, "LayeredCausalGraph"] &, t0, 
    6], ImageSize -> {120, 30}, VertexSize -> .2], 
  FrameStyle -> LightGray], {t0, 2, 6}]

But now each branch does not lead to a “full subuniverse”; instead they lead to two subuniverses that each end up in a spacelike singularity at which time stops.

By the way, consider a “typical” causal graph like:

ResourceFunction[
  "WolframModel"][{{1, 2}, {2, 3}} -> {{2, 1}, {2, 3}, {4, 
    1}}, Automatic, 40, "LayeredCausalGraph"]

Starting from any event in this causal graph, its future light cone will presumably eventually “cover the whole universe”. But what about its past light cone? Wherever we start, the past light cone will eventually “compress down” to the single initialization event at the top of our causal graph. In other words—just like in standard general relativity—the beginning of the universe also behaves like a spacelike singularity.

What about timelike singularities? What is their analog in our models? Remember that a spacelike singularity effectively corresponds to “time stopping”. Well, a timelike singularity presumably corresponds to “space stopping”, but time continuing. Here’s an example:

ResourceFunction[
  "WolframModel"][{{1, 2}, {2, 3}} -> {{2, 3}, {3, 4}, {5, 
    1}}, Automatic, 12, "LayeredCausalGraph"]

In this example, the future light cones of all events concentrate down onto two separated “tracks”, in each of which there is an inexorable progression from one event to the next, with “no room for any space”.
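
One way to see the "tracks" quantitatively is to look at how the truncated future light cone of an event grows with depth: on a track it can only grow by roughly one event per step, rather than ballooning. Here's a small sketch along those lines (the choice of events and depths is arbitrary):

g = ResourceFunction["WolframModel"][{{1, 2}, {2, 3}} -> {{2, 3}, {3, 4}, {5, 1}},
   Automatic, 12, "LayeredCausalGraph"];
(* size of the depth-d future light cone, for each of a few events *)
Table[Length[VertexOutComponent[g, v, d]], {v, Take[VertexList[g], 4]}, {d, 0, 8}]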

Here’s a minimal version of the same kind of behavior, for the rule:

{{x, x}} -> {{x, x}, {x, y}}

Show[ResourceFunction[
   "WolframModel"][{{x, x}} -> {{x, x}, {x, y}}, {{1, 1}}, 4, 
  "LayeredCausalGraph"], ImageSize -> {Automatic, 300}]

Unsurprisingly, its spatial hypergraph does not succeed in “growing space”:

ResourceFunction["WolframModelPlot"][#, ImageSize -> 60] & /@ 
 ResourceFunction[
   "WolframModel"][{{x, x}} -> {{x, x}, {x, y}}, {{1, 1}}, 8, 
  "StatesList"]

If one starts looking at other rules, one quickly sees all sorts of strange behavior. The almost trivial unary rule

{{x}} -> {{x}, {x}}

effectively gives a whole tree of timelike singularities:

ResourceFunction[
  "WolframModel"][{{x}} -> {{x}, {x}}, {{1}}, 6, "LayeredCausalGraph"]

Here are a few other ways that collections of timelike singularities can be produced:

ResourceFunction[
  "WolframModel"][{{1, 2}, {1, 3}} -> {{2, 2}, {2, 4}, {3, 
    1}}, Automatic, 20, "LayeredCausalGraph"]
ResourceFunction[
  "WolframModel"][{{1, 2}, {2, 3}} -> {{4, 4}, {3, 4}, {2, 
    1}}, Automatic, 20, "LayeredCausalGraph"]
ResourceFunction[
  "WolframModel"][{{1, 2}, {1, 3}} -> {{2, 2}, {3, 2}, {1, 
    4}}, Automatic, 30, "LayeredCausalGraph"]

Notice that, for example, in the last case only some events avoid leading to timelike singularities. A bit like in a Kerr black hole, it’s possible both to avoid the singularity and to be trapped by it.

But let’s go back to spacelike singularities. We’ve seen examples where “time stops” for the whole universe. But what about for just some events? Here’s a simple example:

FindDeadEnds[fun_, t_Integer] := 
  Intersection[Flatten[Position[VertexOutDegree[fun[t + 1]], 0]], 
   Flatten[Position[VertexOutDegree[fun[t]], 0]]];
Function[ru, 
  TimeConstrained[
   HighlightGraph[
    ResourceFunction["WolframModel"][ru, Automatic, 15, 
     "LayeredCausalGraph"], 
    Style[#, Red] & /@ 
     FindDeadEnds[
      ResourceFunction["WolframModel"][ru, Automatic, #, 
        "LayeredCausalGraph"] &, 25], VertexSize -> .3, 
    ImageSize -> {Automatic, 300}], 
   5]][{{1, 2}, {2, 3}} -> {{2, 2}, {1, 4}, {3, 4}}]

For most events, the future light cone continues forever. But that’s not true for the highlighted events. Each of these events is a “dead end”, for which “time stops”.

In this case, the dead ends are associated with spatial disconnections, but this does not need to be true:

Framed[ResourceFunction["WolframModelPlot"][#, "MaxImageSize" -> 60], 
   FrameStyle -> LightGray] & /@ 
 ResourceFunction[
   "WolframModel"][{{1, 2}, {2, 3}} -> {{2, 2}, {1, 4}, {3, 4}}, 
  Automatic, 10, "StatesList"]

Here is a slightly more complicated case:

Function[ru, 
  TimeConstrained[
   HighlightGraph[
    ResourceFunction["WolframModel"][ru, Automatic, 17, 
     "LayeredCausalGraph"], 
    Style[#, Red] & /@ 
     FindDeadEnds[
      ResourceFunction["WolframModel"][ru, Automatic, #, 
        "LayeredCausalGraph"] &, 25], VertexSize -> .4, 
    ImageSize -> {Automatic, 300}], 
   5]][{{1, 2}, {3, 2}} -> {{1, 1}, {1, 3}, {4, 2}}]

Again, though, these are “single-event” dead ends, in the sense that after those particular events, time stops. But predecessors of these events still “have a choice”: their future light cones include both the spacelike singularity, and other parts of the causal graph.
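
One can make this “choice” explicit by taking a predecessor of one of the dead-end events and checking that its future light cone contains both the dead end itself and events that are not dead ends. Here is a minimal sketch, reusing FindDeadEnds from above; it assumes (as the highlighting code above effectively does) that event numbers can be used as causal-graph vertices, and that the first dead end found has at least one predecessor event:

ru = {{1, 2}, {3, 2}} -> {{1, 1}, {1, 3}, {4, 2}};
g = ResourceFunction["WolframModel"][ru, Automatic, 25, 
   "LayeredCausalGraph"];
dead = FindDeadEnds[
   ResourceFunction["WolframModel"][ru, Automatic, #, 
     "LayeredCausalGraph"] &, 25];
pred = First[DeleteCases[VertexInComponent[g, First[dead], 1], 
    First[dead]]];
(* the predecessor reaches the dead end, but also reaches events that are not dead ends *)
{MemberQ[VertexOutComponent[g, pred], First[dead]], 
 Complement[VertexOutComponent[g, pred], dead] =!= {}}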

It is perfectly possible to have a combination of spacelike and timelike singularities. On the left here is a series of spacelike singularities, while on the right are timelike singularities, effectively all separated by cosmological event horizons:

Function[ru, 
  TimeConstrained[
   HighlightGraph[
    ResourceFunction["WolframModel"][ru, Automatic, 18, 
     "LayeredCausalGraph"], 
    Style[#, Red] & /@ 
     FindDeadEnds[
      ResourceFunction["WolframModel"][ru, Automatic, #, 
        "LayeredCausalGraph"] &, 25], VertexSize -> .5, 
    ImageSize -> {Automatic, 300}], 
   5]][{{1, 2}, {3, 2}} -> {{1, 3}, {3, 4}, {4, 1}}]

Recognizing when there’s actually a genuine spacelike singularity can be difficult. Consider the case:

Function[ru, 
  TimeConstrained[
   HighlightGraph[
    ResourceFunction["WolframModel"][ru, Automatic, 18, 
     "LayeredCausalGraph"], 
    Style[#, Red] & /@ 
     FindDeadEnds[
      ResourceFunction["WolframModel"][ru, Automatic, #, 
        "LayeredCausalGraph"] &, 25], VertexSize -> .6, 
    ImageSize -> {Automatic, 300}], 
   5]][{{1, 2}, {1, 3}} -> {{2, 4}, {4, 3}, {3, 1}}]

At first it seems as if some of the events in the last few steps we have generated have no successors. But if we continue for a few more steps, we find that they do, though now there are new events that seem not to:

Function[ru, 
  TimeConstrained[
   HighlightGraph[
    ResourceFunction["WolframModel"][ru, Automatic, 25, 
     "LayeredCausalGraph"], 
    Style[#, Red] & /@ 
     FindDeadEnds[
      ResourceFunction["WolframModel"][ru, Automatic, #, 
        "LayeredCausalGraph"] &, 30], VertexSize -> 1, 
    ImageSize -> {Automatic, 300}], 
   5]][{{1, 2}, {1, 3}} -> {{2, 4}, {4, 3}, {3, 1}}]

Plotting the event numbers for “potential spacelike singularities” against the number of steps of evolution generated, we can see that most events only remain “candidates” for a few steps:

ListPlot[Catenate[
  Table[{t, #} & /@ 
    FindDeadEnds[
     ResourceFunction[
        "WolframModel"][{{1, 2}, {1, 3}} -> {{2, 4}, {4, 3}, {3, 1}}, 
       Automatic, #, "LayeredCausalGraph"] &, t], {t, 20}]], 
 Frame -> True]

In the limit of arbitrarily many steps, will any spacelike-singularity events survive? It doesn’t look likely, but it is hard to tell for sure. And this is exactly one of those cases where the general problem is likely to be undecidable: there is no finite computation that is guaranteed to give the result.

One feature of all these examples is that they involve spacelike singularities that affect just single events. At the beginning we also saw examples where “all the events in the universe” eventually wind up at a spacelike singularity. But what about a case that’s more analogous to a Schwarzschild black hole, where a collection of events inside an event horizon winds up at a spacelike singularity, but ones “outside” do not?

I haven’t explicitly found an example where this happens, but I’m confident that one exists, perhaps with slightly more complicated initial conditions than I’ve been using here.

In ordinary general relativity, Penrose’s singularity theorem implies that as soon as there is a trapped surface from which geodesics cannot escape, the geodesics will inevitably reach a singularity. The theorem is known to apply in any number of dimensions (with codimension-2 trapped hypersurfaces). This might make one think that in our models, as soon as there is a black-hole-like event horizon, there must be a singularity of the kind we have discussed.

But what about a case like the following, which we discussed above?

With[{gg = 
     Graph[ResourceFunction[
        "WolframModel"][{{1, 2}, {1, 3}} -> {{1, 2}, {2, 4}, {3, 4}}, 
       Automatic, 24, "LayeredCausalGraph"]]}, 
   HighlightGraph[
    gg, {Style[Subgraph[gg, VertexOutComponent[gg, #, 20]], Red, 
      Thick]}, ImageSize -> 300]] &@13

There is a “trapped region” in the causal graph, but no sign of a singularity. There seem to be a couple of (not-mutually-exclusive) possibilities for what is going on here. The first is that there is in a sense so much “overall expansion in the universe” and lack of energy conservation (reflected in an increasing flux of causal edges) that the energy conditions in the theorem effectively do not apply. A second possibility is that what we’re seeing is “another universe” forming inside the event horizon. In the Kerr solution, the (probably unrealistic and fragile) mathematics implies that it is in principle possible to “go through” the singularity and emerge into “another universe” that is just as infinite as ours. Perhaps what we are seeing here is a similar phenomenon.
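
The “increasing flux of causal edges” can be seen in a rough way just by counting the total number of causal edges generated after successively more steps of evolution. This is only a proxy (it counts all causal edges so far, rather than the flux through a particular spacelike hypersurface), but it gives a sense of the growth:

(* total number of causal edges after t steps of evolution *)
Table[{t, 
  EdgeCount[
   ResourceFunction[
     "WolframModel"][{{1, 2}, {1, 3}} -> {{1, 2}, {2, 4}, {3, 4}}, 
    Automatic, t, "LayeredCausalGraph"]]}, {t, 4, 20, 4}]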

The Distribution of Causal Structures

What kinds of causal structures are possible in our systems? To start answering this, we can explicitly look at all 4702 possible 2₂ → 3₂ rules. We have to pick how many steps after the “beginning of the universe” we want to consider; we’ll start with 5. We also have to pick how many steps we want to use to check causal connectivity; we’ll start with 10. Then here are the possible causal connection graphs that occur, together with how many times they occur:
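
In outline, the procedure is: run each rule for the chosen number of steps, construct the graph describing its causal connections, and then tally the results up to graph isomorphism. Here is a simplified sketch of the tallying step; it groups plain layered causal graphs by isomorphism (rather than the causal connection graphs described here), and it just takes the first few rules from the allrules32 list defined below:

(* evolve each rule, take its causal graph, and gather the results into
   isomorphism classes, counting how many rules produce each class *)
causalGraphs = 
  Graph[ResourceFunction["WolframModel"][#, Automatic, 5, 
      "LayeredCausalGraph"]] & /@ Take[allrules32, 20];
{Length[#], First[#]} & /@ 
 ReverseSortBy[Gather[causalGraphs, IsomorphicGraphQ], Length]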

allrules32 = {{{1, 1}, {1, 1}} -> {{1, 1}, {1, 1}, {1, 1}}, {{1, 1}, {
    1, 1}} -> {{1, 1}, {1, 1}, {1, 2}}, {{1, 1}, {1, 1}} -> {{1, 1}, {
    1, 1}, {2, 1}}, {{1, 1}, {1, 1}} -> {{1, 1}, {1, 2}, {1, 2}}, {{1,
     1}, {1, 1}} -> {{1, 1}, {1, 2}, {2, 1}}, {{1, 1}, {1, 1}} -> {{1,
     1}, {1, 2}, {2, 2}}, {{1, 1}, {1, 1}} -> {{1, 1}, {1, 2}, {1, 
    3}}, {{1, 1}, {1, 1}} -> {{1, 1}, {1, 2}, {2, 3}}, {{1, 1}, {1, 
    1}} -> {{1, 1}, {1, 2}, {3, 1}}, {{1, 1}, {1, 1}} -> {{1, 1}, {1, 
    2}, {3, 2}}, {{1, 1}, {1, 1}} -> {{1, 1}, {2, 1}, {2, 1}}, {{1, 
    1}, {1, 1}} -> {{1, 1}, {2, 1}, {2, 3}}, {{1, 1}, {1, 1}} -> {{1, 
    1}, {2, 1}, {3, 1}}, {{1, 1}, {1, 1}} -> {{1, 1}, {2, 1}, {3, 
    2}}, {{1, 1}, {1, 1}} -> {{1, 2}, {1, 2}, {1, 2}}, {{1, 1}, {1, 
    1}} -> {{1, 2}, {1, 2}, {2, 1}}, {{1, 1}, {1, 1}} -> {{1, 2}, {1, 
    2}, {1, 3}}, {{1, 1}, {1, 1}} -> {{1, 2}, {1, 2}, {2, 3}}, {{1, 
    1}, {1, 1}} -> {{1, 2}, {2, 1}, {1, 3}}, {{1, 1}, {1, 1}} -> {{1, 
    2}, {1, 2}, {3, 1}}, {{1, 1}, {1, 1}} -> {{1, 2}, {1, 2}, {3, 
    2}}, {{1, 1}, {1, 1}} -> {{1, 2}, {2, 1}, {3, 1}}, {{1, 1}, {1, 
    1}} -> {{1, 2}, {1, 3}, {2, 3}}, {{1, 1}, {1, 1}} -> {{1, 2}, {2, 
    3}, {3, 1}}, {{1, 1}, {1, 1}} -> {{1, 2}, {1, 3}, {1, 4}}, {{1, 
    1}, {1, 1}} -> {{1, 2}, {1, 3}, {2, 4}}, {{1, 1}, {1, 1}} -> {{1, 
    2}, {2, 3}, {3, 4}}, {{1, 1}, {1, 1}} -> {{1, 2}, {1, 3}, {4, 
    1}}, {{1, 1}, {1, 1}} -> {{1, 2}, {1, 3}, {4, 2}}, {{1, 1}, {1, 
    1}} -> {{1, 2}, {2, 3}, {4, 2}}, {{1, 1}, {1, 1}} -> {{1, 2}, {2, 
    3}, {4, 3}}, {{1, 1}, {1, 1}} -> {{1, 2}, {3, 2}, {4, 2}}, {{1, 
    1}, {1, 1}} -> {{2, 1}, {2, 1}, {1, 2}}, {{1, 1}, {1, 1}} -> {{2, 
    1}, {2, 1}, {2, 1}}, {{1, 1}, {1, 1}} -> {{2, 2}, {1, 2}, {1, 
    2}}, {{1, 1}, {1, 1}} -> {{2, 2}, {2, 1}, {1, 1}}, {{1, 1}, {1, 
    1}} -> {{2, 2}, {2, 1}, {1, 2}}, {{1, 1}, {1, 1}} -> {{2, 2}, {2, 
    1}, {2, 1}}, {{1, 1}, {1, 1}} -> {{2, 2}, {2, 2}, {1, 2}}, {{1, 
    1}, {1, 1}} -> {{2, 2}, {2, 2}, {2, 1}}, {{1, 1}, {1, 1}} -> {{2, 
    1}, {1, 2}, {2, 3}}, {{1, 1}, {1, 1}} -> {{2, 1}, {2, 1}, {1, 
    3}}, {{1, 1}, {1, 1}} -> {{2, 1}, {2, 1}, {2, 3}}, {{1, 1}, {1, 
    1}} -> {{2, 2}, {1, 2}, {1, 3}}, {{1, 1}, {1, 1}} -> {{2, 2}, {2, 
    1}, {1, 3}}, {{1, 1}, {1, 1}} -> {{2, 2}, {2, 1}, {2, 3}}, {{1, 
    1}, {1, 1}} -> {{2, 1}, {1, 2}, {3, 2}}, {{1, 1}, {1, 1}} -> {{2, 
    1}, {2, 1}, {3, 1}}, {{1, 1}, {1, 1}} -> {{2, 1}, {2, 1}, {3, 
    2}}, {{1, 1}, {1, 1}} -> {{2, 2}, {1, 2}, {3, 1}}, {{1, 1}, {1, 
    1}} -> {{2, 2}, {1, 2}, {3, 2}}, {{1, 1}, {1, 1}} -> {{2, 2}, {2, 
    1}, {3, 1}}, {{1, 1}, {1, 1}} -> {{2, 2}, {2, 1}, {3, 2}}, {{1, 
    1}, {1, 1}} -> {{2, 1}, {2, 3}, {1, 3}}, {{1, 1}, {1, 1}} -> {{2, 
    2}, {2, 3}, {1, 2}}, {{1, 1}, {1, 1}} -> {{2, 2}, {2, 3}, {1, 
    3}}, {{1, 1}, {1, 1}} -> {{2, 2}, {2, 3}, {3, 1}}, {{1, 1}, {1, 
    1}} -> {{2, 1}, {1, 3}, {3, 4}}, {{1, 1}, {1, 1}} -> {{2, 1}, {2, 
    3}, {1, 4}}, {{1, 1}, {1, 1}} -> {{2, 1}, {2, 3}, {2, 4}}, {{1, 
    1}, {1, 1}} -> {{2, 1}, {1, 3}, {4, 1}}, {{1, 1}, {1, 1}} -> {{2, 
    1}, {1, 3}, {4, 3}}, {{1, 1}, {1, 1}} -> {{2, 1}, {2, 3}, {4, 
    1}}, {{1, 1}, {1, 1}} -> {{2, 1}, {2, 3}, {4, 2}}, {{1, 1}, {1, 
    1}} -> {{2, 2}, {3, 2}, {1, 3}}, {{1, 1}, {1, 1}} -> {{2, 2}, {3, 
    2}, {3, 1}}, {{1, 1}, {1, 1}} -> {{2, 1}, {3, 1}, {4, 1}}, {{1, 
    1}, {1, 1}} -> {{2, 3}, {2, 1}, {3, 1}}, {{1, 1}, {1, 1}} -> {{2, 
    3}, {2, 3}, {1, 2}}, {{1, 1}, {1, 1}} -> {{2, 3}, {2, 3}, {1, 
    3}}, {{1, 1}, {1, 1}} -> {{2, 3}, {2, 3}, {2, 1}}, {{1, 1}, {1, 
    1}} -> {{2, 3}, {2, 3}, {3, 1}}, {{1, 1}, {1, 1}} -> {{2, 3}, {3, 
    2}, {1, 2}}, {{1, 1}, {1, 1}} -> {{2, 3}, {3, 2}, {2, 1}}, {{1, 
    1}, {1, 1}} -> {{2, 3}, {2, 1}, {3, 4}}, {{1, 1}, {1, 1}} -> {{2, 
    3}, {3, 1}, {1, 4}}, {{1, 1}, {1, 1}} -> {{2, 3}, {2, 1}, {4, 
    3}}, {{1, 1}, {1, 1}} -> {{2, 3}, {3, 1}, {4, 1}}, {{1, 1}, {1, 
    1}} -> {{2, 3}, {3, 1}, {4, 3}}, {{1, 1}, {1, 1}} -> {{2, 3}, {2, 
    4}, {1, 2}}, {{1, 1}, {1, 1}} -> {{2, 3}, {2, 4}, {1, 3}}, {{1, 
    1}, {1, 1}} -> {{2, 3}, {2, 4}, {3, 1}}, {{1, 1}, {1, 1}} -> {{2, 
    3}, {3, 4}, {1, 4}}, {{1, 1}, {1, 1}} -> {{2, 3}, {3, 4}, {4, 
    1}}, {{1, 1}, {1, 2}} -> {{1, 1}, {1, 1}, {1, 1}}, {{1, 1}, {1, 
    2}} -> {{1, 1}, {1, 1}, {1, 2}}, {{1, 1}, {1, 2}} -> {{1, 1}, {1, 
    1}, {2, 1}}, {{1, 1}, {1, 2}} -> {{1, 1}, {1, 1}, {2, 2}}, {{1, 
    1}, {1, 2}} -> {{1, 1}, {1, 2}, {1, 2}}, {{1, 1}, {1, 2}} -> {{1, 
    1}, {1, 2}, {2, 1}}, {{1, 1}, {1, 2}} -> {{1, 1}, {1, 2}, {2, 
    2}}, {{1, 1}, {1, 2}} -> {{1, 1}, {2, 1}, {2, 1}}, {{1, 1}, {1, 
    2}} -> {{1, 2}, {1, 2}, {1, 2}}, {{1, 1}, {1, 2}} -> {{1, 2}, {1, 
    2}, {2, 1}}, {{1, 1}, {1, 2}} -> {{2, 1}, {2, 1}, {1, 2}}, {{1, 
    1}, {1, 2}} -> {{2, 1}, {2, 1}, {2, 1}}, {{1, 1}, {1, 2}} -> {{2, 
    2}, {1, 2}, {1, 2}}, {{1, 1}, {1, 2}} -> {{2, 2}, {2, 1}, {1, 
    1}}, {{1, 1}, {1, 2}} -> {{2, 2}, {2, 1}, {1, 2}}, {{1, 1}, {1, 
    2}} -> {{2, 2}, {2, 1}, {2, 1}}, {{1, 1}, {1, 2}} -> {{2, 2}, {2, 
    2}, {1, 1}}, {{1, 1}, {1, 2}} -> {{2, 2}, {2, 2}, {1, 2}}, {{1, 
    1}, {1, 2}} -> {{2, 2}, {2, 2}, {2, 1}}, {{1, 1}, {1, 2}} -> {{2, 
    2}, {2, 2}, {2, 2}}, {{1, 1}, {1, 2}} -> {{1, 1}, {1, 1}, {1, 
    3}}, {{1, 1}, {1, 2}} -> {{1, 1}, {1, 1}, {2, 3}}, {{1, 1}, {1, 
    2}} -> {{1, 1}, {1, 2}, {1, 3}}, {{1, 1}, {1, 2}} -> {{1, 1}, {1, 
    2}, {2, 3}}, {{1, 1}, {1, 2}} -> {{1, 1}, {2, 1}, {2, 3}}, {{1, 
    1}, {1, 2}} -> {{1, 2}, {1, 2}, {1, 3}}, {{1, 1}, {1, 2}} -> {{1, 
    2}, {1, 2}, {2, 3}}, {{1, 1}, {1, 2}} -> {{1, 2}, {2, 1}, {1, 
    3}}, {{1, 1}, {1, 2}} -> {{2, 1}, {1, 2}, {2, 3}}, {{1, 1}, {1, 
    2}} -> {{2, 1}, {2, 1}, {1, 3}}, {{1, 1}, {1, 2}} -> {{2, 1}, {2, 
    1}, {2, 3}}, {{1, 1}, {1, 2}} -> {{2, 2}, {1, 2}, {1, 3}}, {{1, 
    1}, {1, 2}} -> {{2, 2}, {2, 1}, {1, 3}}, {{1, 1}, {1, 2}} -> {{2, 
    2}, {2, 1}, {2, 3}}, {{1, 1}, {1, 2}} -> {{2, 2}, {2, 2}, {1, 
    3}}, {{1, 1}, {1, 2}} -> {{2, 2}, {2, 2}, {2, 3}}, {{1, 1}, {1, 
    2}} -> {{1, 1}, {1, 1}, {3, 1}}, {{1, 1}, {1, 2}} -> {{1, 1}, {1, 
    1}, {3, 2}}, {{1, 1}, {1, 2}} -> {{1, 1}, {1, 2}, {3, 1}}, {{1, 
    1}, {1, 2}} -> {{1, 1}, {1, 2}, {3, 2}}, {{1, 1}, {1, 2}} -> {{1, 
    1}, {2, 1}, {3, 1}}, {{1, 1}, {1, 2}} -> {{1, 1}, {2, 1}, {3, 
    2}}, {{1, 1}, {1, 2}} -> {{1, 2}, {1, 2}, {3, 1}}, {{1, 1}, {1, 
    2}} -> {{1, 2}, {1, 2}, {3, 2}}, {{1, 1}, {1, 2}} -> {{1, 2}, {2, 
    1}, {3, 1}}, {{1, 1}, {1, 2}} -> {{2, 1}, {1, 2}, {3, 2}}, {{1, 
    1}, {1, 2}} -> {{2, 1}, {2, 1}, {3, 1}}, {{1, 1}, {1, 2}} -> {{2, 
    1}, {2, 1}, {3, 2}}, {{1, 1}, {1, 2}} -> {{2, 2}, {1, 2}, {3, 
    1}}, {{1, 1}, {1, 2}} -> {{2, 2}, {1, 2}, {3, 2}}, {{1, 1}, {1, 
    2}} -> {{2, 2}, {2, 1}, {3, 1}}, {{1, 1}, {1, 2}} -> {{2, 2}, {2, 
    1}, {3, 2}}, {{1, 1}, {1, 2}} -> {{2, 2}, {2, 2}, {3, 1}}, {{1, 
    1}, {1, 2}} -> {{2, 2}, {2, 2}, {3, 2}}, {{1, 1}, {1, 2}} -> {{1, 
    1}, {1, 3}, {1, 3}}, {{1, 1}, {1, 2}} -> {{1, 1}, {1, 3}, {2, 
    1}}, {{1, 1}, {1, 2}} -> {{1, 1}, {1, 3}, {2, 2}}, {{1, 1}, {1, 
    2}} -> {{1, 1}, {1, 3}, {2, 3}}, {{1, 1}, {1, 2}} -> {{1, 1}, {1, 
    3}, {3, 1}}, {{1, 1}, {1, 2}} -> {{1, 1}, {1, 3}, {3, 2}}, {{1, 
    1}, {1, 2}} -> {{1, 1}, {1, 3}, {3, 3}}, {{1, 1}, {1, 2}} -> {{1, 
    1}, {2, 3}, {2, 3}}, {{1, 1}, {1, 2}} -> {{1, 1}, {2, 3}, {3, 
    2}}, {{1, 1}, {1, 2}} -> {{1, 2}, {1, 3}, {2, 3}}, {{1, 1}, {1, 
    2}} -> {{1, 2}, {2, 3}, {3, 1}}, {{1, 1}, {1, 2}} -> {{2, 1}, {2, 
    3}, {1, 3}}, {{1, 1}, {1, 2}} -> {{2, 2}, {1, 3}, {1, 3}}, {{1, 
    1}, {1, 2}} -> {{2, 2}, {1, 3}, {3, 1}}, {{1, 1}, {1, 2}} -> {{2, 
    2}, {2, 3}, {1, 1}}, {{1, 1}, {1, 2}} -> {{2, 2}, {2, 3}, {1, 
    2}}, {{1, 1}, {1, 2}} -> {{2, 2}, {2, 3}, {1, 3}}, {{1, 1}, {1, 
    2}} -> {{2, 2}, {2, 3}, {2, 3}}, {{1, 1}, {1, 2}} -> {{2, 2}, {2, 
    3}, {3, 1}}, {{1, 1}, {1, 2}} -> {{2, 2}, {2, 3}, {3, 2}}, {{1, 
    1}, {1, 2}} -> {{2, 2}, {2, 3}, {3, 3}}, {{1, 1}, {1, 2}} -> {{1, 
    1}, {1, 3}, {1, 4}}, {{1, 1}, {1, 2}} -> {{1, 1}, {1, 3}, {2, 
    4}}, {{1, 1}, {1, 2}} -> {{1, 1}, {1, 3}, {3, 4}}, {{1, 1}, {1, 
    2}} -> {{1, 1}, {2, 3}, {2, 4}}, {{1, 1}, {1, 2}} -> {{1, 1}, {2, 
    3}, {3, 4}}, {{1, 1}, {1, 2}} -> {{1, 2}, {1, 3}, {1, 4}}, {{1, 
    1}, {1, 2}} -> {{1, 2}, {1, 3}, {2, 4}}, {{1, 1}, {1, 2}} -> {{1, 
    2}, {2, 3}, {3, 4}}, {{1, 1}, {1, 2}} -> {{2, 1}, {1, 3}, {3, 
    4}}, {{1, 1}, {1, 2}} -> {{2, 1}, {2, 3}, {1, 4}}, {{1, 1}, {1, 
    2}} -> {{2, 1}, {2, 3}, {2, 4}}, {{1, 1}, {1, 2}} -> {{2, 2}, {1, 
    3}, {1, 4}}, {{1, 1}, {1, 2}} -> {{2, 2}, {1, 3}, {3, 4}}, {{1, 
    1}, {1, 2}} -> {{2, 2}, {2, 3}, {1, 4}}, {{1, 1}, {1, 2}} -> {{2, 
    2}, {2, 3}, {2, 4}}, {{1, 1}, {1, 2}} -> {{2, 2}, {2, 3}, {3, 
    4}}, {{1, 1}, {1, 2}} -> {{1, 1}, {1, 3}, {4, 1}}, {{1, 1}, {1, 
    2}} -> {{1, 1}, {1, 3}, {4, 2}}, {{1, 1}, {1, 2}} -> {{1, 1}, {1, 
    3}, {4, 3}}, {{1, 1}, {1, 2}} -> {{1, 1}, {2, 3}, {4, 3}}, {{1, 
    1}, {1, 2}} -> {{1, 2}, {1, 3}, {4, 1}}, {{1, 1}, {1, 2}} -> {{1, 
    2}, {1, 3}, {4, 2}}, {{1, 1}, {1, 2}} -> {{1, 2}, {2, 3}, {4, 
    2}}, {{1, 1}, {1, 2}} -> {{1, 2}, {2, 3}, {4, 3}}, {{1, 1}, {1, 
    2}} -> {{2, 1}, {1, 3}, {4, 1}}, {{1, 1}, {1, 2}} -> {{2, 1}, {1, 
    3}, {4, 3}}, {{1, 1}, {1, 2}} -> {{2, 1}, {2, 3}, {4, 1}}, {{1, 
    1}, {1, 2}} -> {{2, 1}, {2, 3}, {4, 2}}, {{1, 1}, {1, 2}} -> {{2, 
    2}, {1, 3}, {4, 3}}, {{1, 1}, {1, 2}} -> {{2, 2}, {2, 3}, {4, 
    1}}, {{1, 1}, {1, 2}} -> {{2, 2}, {2, 3}, {4, 2}}, {{1, 1}, {1, 
    2}} -> {{2, 2}, {2, 3}, {4, 3}}, {{1, 1}, {1, 2}} -> {{1, 1}, {3, 
    1}, {2, 2}}, {{1, 1}, {1, 2}} -> {{1, 1}, {3, 1}, {2, 3}}, {{1, 
    1}, {1, 2}} -> {{1, 1}, {3, 1}, {3, 1}}, {{1, 1}, {1, 2}} -> {{1, 
    1}, {3, 1}, {3, 2}}, {{1, 1}, {1, 2}} -> {{1, 1}, {3, 2}, {3, 
    2}}, {{1, 1}, {1, 2}} -> {{2, 2}, {3, 1}, {3, 1}}, {{1, 1}, {1, 
    2}} -> {{2, 2}, {3, 2}, {1, 1}}, {{1, 1}, {1, 2}} -> {{2, 2}, {3, 
    2}, {1, 3}}, {{1, 1}, {1, 2}} -> {{2, 2}, {3, 2}, {3, 1}}, {{1, 
    1}, {1, 2}} -> {{2, 2}, {3, 2}, {3, 2}}, {{1, 1}, {1, 2}} -> {{1, 
    1}, {3, 1}, {2, 4}}, {{1, 1}, {1, 2}} -> {{1, 1}, {3, 1}, {3, 
    4}}, {{1, 1}, {1, 2}} -> {{1, 1}, {3, 2}, {2, 4}}, {{1, 1}, {1, 
    2}} -> {{1, 1}, {3, 2}, {3, 4}}, {{1, 1}, {1, 2}} -> {{2, 2}, {3, 
    1}, {1, 4}}, {{1, 1}, {1, 2}} -> {{2, 2}, {3, 1}, {3, 4}}, {{1, 
    1}, {1, 2}} -> {{2, 2}, {3, 2}, {1, 4}}, {{1, 1}, {1, 2}} -> {{2, 
    2}, {3, 2}, {3, 4}}, {{1, 1}, {1, 2}} -> {{1, 1}, {3, 1}, {4, 
    1}}, {{1, 1}, {1, 2}} -> {{1, 1}, {3, 1}, {4, 2}}, {{1, 1}, {1, 
    2}} -> {{1, 1}, {3, 1}, {4, 3}}, {{1, 1}, {1, 2}} -> {{1, 1}, {3, 
    2}, {4, 2}}, {{1, 1}, {1, 2}} -> {{1, 2}, {3, 2}, {4, 2}}, {{1, 
    1}, {1, 2}} -> {{2, 1}, {3, 1}, {4, 1}}, {{1, 1}, {1, 2}} -> {{2, 
    2}, {3, 1}, {4, 1}}, {{1, 1}, {1, 2}} -> {{2, 2}, {3, 2}, {4, 
    1}}, {{1, 1}, {1, 2}} -> {{2, 2}, {3, 2}, {4, 2}}, {{1, 1}, {1, 
    2}} -> {{2, 2}, {3, 2}, {4, 3}}, {{1, 1}, {1, 2}} -> {{1, 1}, {3, 
    4}, {4, 2}}, {{1, 1}, {1, 2}} -> {{2, 2}, {3, 4}, {4, 1}}, {{1, 
    1}, {1, 2}} -> {{1, 3}, {1, 2}, {3, 2}}, {{1, 1}, {1, 2}} -> {{1, 
    3}, {1, 3}, {1, 2}}, {{1, 1}, {1, 2}} -> {{1, 3}, {1, 3}, {1, 
    3}}, {{1, 1}, {1, 2}} -> {{1, 3}, {1, 3}, {2, 1}}, {{1, 1}, {1, 
    2}} -> {{1, 3}, {1, 3}, {2, 3}}, {{1, 1}, {1, 2}} -> {{1, 3}, {1, 
    3}, {3, 1}}, {{1, 1}, {1, 2}} -> {{1, 3}, {1, 3}, {3, 2}}, {{1, 
    1}, {1, 2}} -> {{1, 3}, {3, 1}, {1, 2}}, {{1, 1}, {1, 2}} -> {{1, 
    3}, {3, 1}, {2, 1}}, {{1, 1}, {1, 2}} -> {{1, 3}, {3, 2}, {2, 
    1}}, {{1, 1}, {1, 2}} -> {{2, 3}, {2, 1}, {3, 1}}, {{1, 1}, {1, 
    2}} -> {{2, 3}, {2, 3}, {1, 2}}, {{1, 1}, {1, 2}} -> {{2, 3}, {2, 
    3}, {1, 3}}, {{1, 1}, {1, 2}} -> {{2, 3}, {2, 3}, {2, 1}}, {{1, 
    1}, {1, 2}} -> {{2, 3}, {2, 3}, {2, 3}}, {{1, 1}, {1, 2}} -> {{2, 
    3}, {2, 3}, {3, 1}}, {{1, 1}, {1, 2}} -> {{2, 3}, {2, 3}, {3, 
    2}}, {{1, 1}, {1, 2}} -> {{2, 3}, {3, 2}, {1, 2}}, {{1, 1}, {1, 
    2}} -> {{2, 3}, {3, 2}, {2, 1}}, {{1, 1}, {1, 2}} -> {{1, 3}, {1, 
    2}, {3, 4}}, {{1, 1}, {1, 2}} -> {{1, 3}, {1, 3}, {1, 4}}, {{1, 
    1}, {1, 2}} -> {{1, 3}, {1, 3}, {2, 4}}, {{1, 1}, {1, 2}} -> {{1, 
    3}, {1, 3}, {3, 4}}, {{1, 1}, {1, 2}} -> {{1, 3}, {3, 1}, {1, 
    4}}, {{1, 1}, {1, 2}} -> {{1, 3}, {3, 1}, {2, 4}}, {{1, 1}, {1, 
    2}} -> {{1, 3}, {3, 2}, {2, 4}}, {{1, 1}, {1, 2}} -> {{2, 3}, {2, 
    1}, {3, 4}}, {{1, 1}, {1, 2}} -> {{2, 3}, {2, 3}, {1, 4}}, {{1, 
    1}, {1, 2}} -> {{2, 3}, {2, 3}, {2, 4}}, {{1, 1}, {1, 2}} -> {{2, 
    3}, {2, 3}, {3, 4}}, {{1, 1}, {1, 2}} -> {{2, 3}, {3, 1}, {1, 
    4}}, {{1, 1}, {1, 2}} -> {{2, 3}, {3, 2}, {1, 4}}, {{1, 1}, {1, 
    2}} -> {{2, 3}, {3, 2}, {2, 4}}, {{1, 1}, {1, 2}} -> {{1, 3}, {1, 
    2}, {4, 3}}, {{1, 1}, {1, 2}} -> {{1, 3}, {1, 3}, {4, 1}}, {{1, 
    1}, {1, 2}} -> {{1, 3}, {1, 3}, {4, 2}}, {{1, 1}, {1, 2}} -> {{1, 
    3}, {1, 3}, {4, 3}}, {{1, 1}, {1, 2}} -> {{1, 3}, {2, 3}, {4, 
    3}}, {{1, 1}, {1, 2}} -> {{1, 3}, {3, 1}, {4, 1}}, {{1, 1}, {1, 
    2}} -> {{1, 3}, {3, 1}, {4, 2}}, {{1, 1}, {1, 2}} -> {{1, 3}, {3, 
    2}, {4, 2}}, {{1, 1}, {1, 2}} -> {{1, 3}, {3, 2}, {4, 3}}, {{1, 
    1}, {1, 2}} -> {{2, 3}, {2, 1}, {4, 3}}, {{1, 1}, {1, 2}} -> {{2, 
    3}, {2, 3}, {4, 1}}, {{1, 1}, {1, 2}} -> {{2, 3}, {2, 3}, {4, 
    2}}, {{1, 1}, {1, 2}} -> {{2, 3}, {2, 3}, {4, 3}}, {{1, 1}, {1, 
    2}} -> {{2, 3}, {3, 1}, {4, 1}}, {{1, 1}, {1, 2}} -> {{2, 3}, {3, 
    1}, {4, 3}}, {{1, 1}, {1, 2}} -> {{2, 3}, {3, 2}, {4, 1}}, {{1, 
    1}, {1, 2}} -> {{2, 3}, {3, 2}, {4, 2}}, {{1, 1}, {1, 2}} -> {{1, 
    3}, {1, 4}, {2, 1}}, {{1, 1}, {1, 2}} -> {{1, 3}, {1, 4}, {2, 
    3}}, {{1, 1}, {1, 2}} -> {{1, 3}, {1, 4}, {3, 2}}, {{1, 1}, {1, 
    2}} -> {{1, 3}, {1, 4}, {3, 4}}, {{1, 1}, {1, 2}} -> {{1, 3}, {3, 
    4}, {2, 3}}, {{1, 1}, {1, 2}} -> {{1, 3}, {3, 4}, {2, 4}}, {{1, 
    1}, {1, 2}} -> {{1, 3}, {3, 4}, {4, 1}}, {{1, 1}, {1, 2}} -> {{1, 
    3}, {3, 4}, {4, 2}}, {{1, 1}, {1, 2}} -> {{2, 3}, {2, 4}, {1, 
    2}}, {{1, 1}, {1, 2}} -> {{2, 3}, {2, 4}, {1, 3}}, {{1, 1}, {1, 
    2}} -> {{2, 3}, {2, 4}, {3, 1}}, {{1, 1}, {1, 2}} -> {{2, 3}, {2, 
    4}, {3, 4}}, {{1, 1}, {1, 2}} -> {{2, 3}, {3, 4}, {1, 4}}, {{1, 
    1}, {1, 2}} -> {{2, 3}, {3, 4}, {4, 1}}, {{1, 1}, {1, 2}} -> {{2, 
    3}, {3, 4}, {4, 2}}, {{1, 1}, {1, 2}} -> {{1, 3}, {1, 4}, {1, 
    5}}, {{1, 1}, {1, 2}} -> {{1, 3}, {1, 4}, {2, 5}}, {{1, 1}, {1, 
    2}} -> {{1, 3}, {1, 4}, {3, 5}}, {{1, 1}, {1, 2}} -> {{1, 3}, {3, 
    4}, {2, 5}}, {{1, 1}, {1, 2}} -> {{1, 3}, {3, 4}, {4, 5}}, {{1, 
    1}, {1, 2}} -> {{2, 3}, {2, 4}, {1, 5}}, {{1, 1}, {1, 2}} -> {{2, 
    3}, {2, 4}, {2, 5}}, {{1, 1}, {1, 2}} -> {{2, 3}, {2, 4}, {3, 
    5}}, {{1, 1}, {1, 2}} -> {{2, 3}, {3, 4}, {1, 5}}, {{1, 1}, {1, 
    2}} -> {{2, 3}, {3, 4}, {4, 5}}, {{1, 1}, {1, 2}} -> {{1, 3}, {1, 
    4}, {5, 1}}, {{1, 1}, {1, 2}} -> {{1, 3}, {1, 4}, {5, 2}}, {{1, 
    1}, {1, 2}} -> {{1, 3}, {1, 4}, {5, 3}}, {{1, 1}, {1, 2}} -> {{1, 
    3}, {3, 4}, {5, 2}}, {{1, 1}, {1, 2}} -> {{1, 3}, {3, 4}, {5, 
    3}}, {{1, 1}, {1, 2}} -> {{1, 3}, {3, 4}, {5, 4}}, {{1, 1}, {1, 
    2}} -> {{2, 3}, {2, 4}, {5, 1}}, {{1, 1}, {1, 2}} -> {{2, 3}, {2, 
    4}, {5, 2}}, {{1, 1}, {1, 2}} -> {{2, 3}, {2, 4}, {5, 3}}, {{1, 
    1}, {1, 2}} -> {{2, 3}, {3, 4}, {5, 1}}, {{1, 1}, {1, 2}} -> {{2, 
    3}, {3, 4}, {5, 3}}, {{1, 1}, {1, 2}} -> {{2, 3}, {3, 4}, {5, 
    4}}, {{1, 1}, {1, 2}} -> {{1, 3}, {4, 3}, {2, 5}}, {{1, 1}, {1, 
    2}} -> {{2, 3}, {4, 3}, {1, 5}}, {{1, 1}, {1, 2}} -> {{1, 3}, {4, 
    3}, {5, 2}}, {{1, 1}, {1, 2}} -> {{1, 3}, {4, 3}, {5, 3}}, {{1, 
    1}, {1, 2}} -> {{2, 3}, {4, 3}, {5, 1}}, {{1, 1}, {1, 2}} -> {{2, 
    3}, {4, 3}, {5, 3}}, {{1, 1}, {1, 2}} -> {{3, 1}, {1, 3}, {2, 
    3}}, {{1, 1}, {1, 2}} -> {{3, 1}, {1, 3}, {3, 2}}, {{1, 1}, {1, 
    2}} -> {{3, 1}, {3, 1}, {1, 2}}, {{1, 1}, {1, 2}} -> {{3, 1}, {3, 
    1}, {1, 3}}, {{1, 1}, {1, 2}} -> {{3, 1}, {3, 1}, {2, 1}}, {{1, 
    1}, {1, 2}} -> {{3, 1}, {3, 1}, {2, 3}}, {{1, 1}, {1, 2}} -> {{3, 
    1}, {3, 1}, {3, 1}}, {{1, 1}, {1, 2}} -> {{3, 1}, {3, 1}, {3, 
    2}}, {{1, 1}, {1, 2}} -> {{3, 1}, {3, 2}, {1, 2}}, {{1, 1}, {1, 
    2}} -> {{3, 2}, {2, 3}, {1, 3}}, {{1, 1}, {1, 2}} -> {{3, 2}, {2, 
    3}, {3, 1}}, {{1, 1}, {1, 2}} -> {{3, 2}, {3, 1}, {2, 1}}, {{1, 
    1}, {1, 2}} -> {{3, 2}, {3, 2}, {1, 2}}, {{1, 1}, {1, 2}} -> {{3, 
    2}, {3, 2}, {1, 3}}, {{1, 1}, {1, 2}} -> {{3, 2}, {3, 2}, {2, 
    1}}, {{1, 1}, {1, 2}} -> {{3, 2}, {3, 2}, {2, 3}}, {{1, 1}, {1, 
    2}} -> {{3, 2}, {3, 2}, {3, 1}}, {{1, 1}, {1, 2}} -> {{3, 2}, {3, 
    2}, {3, 2}}, {{1, 1}, {1, 2}} -> {{3, 3}, {1, 3}, {1, 2}}, {{1, 
    1}, {1, 2}} -> {{3, 3}, {1, 3}, {1, 3}}, {{1, 1}, {1, 2}} -> {{3, 
    3}, {1, 3}, {2, 1}}, {{1, 1}, {1, 2}} -> {{3, 3}, {1, 3}, {2, 
    2}}, {{1, 1}, {1, 2}} -> {{3, 3}, {1, 3}, {2, 3}}, {{1, 1}, {1, 
    2}} -> {{3, 3}, {2, 3}, {1, 1}}, {{1, 1}, {1, 2}} -> {{3, 3}, {2, 
    3}, {1, 2}}, {{1, 1}, {1, 2}} -> {{3, 3}, {2, 3}, {2, 1}}, {{1, 
    1}, {1, 2}} -> {{3, 3}, {2, 3}, {2, 3}}, {{1, 1}, {1, 2}} -> {{3, 
    3}, {3, 1}, {1, 1}}, {{1, 1}, {1, 2}} -> {{3, 3}, {3, 1}, {1, 
    2}}, {{1, 1}, {1, 2}} -> {{3, 3}, {3, 1}, {1, 3}}, {{1, 1}, {1, 
    2}} -> {{3, 3}, {3, 1}, {2, 1}}, {{1, 1}, {1, 2}} -> {{3, 3}, {3, 
    1}, {2, 2}}, {{1, 1}, {1, 2}} -> {{3, 3}, {3, 1}, {2, 3}}, {{1, 
    1}, {1, 2}} -> {{3, 3}, {3, 1}, {3, 1}}, {{1, 1}, {1, 2}} -> {{3, 
    3}, {3, 1}, {3, 2}}, {{1, 1}, {1, 2}} -> {{3, 3}, {3, 2}, {1, 
    1}}, {{1, 1}, {1, 2}} -> {{3, 3}, {3, 2}, {1, 2}}, {{1, 1}, {1, 
    2}} -> {{3, 3}, {3, 2}, {1, 3}}, {{1, 1}, {1, 2}} -> {{3, 3}, {3, 
    2}, {2, 1}}, {{1, 1}, {1, 2}} -> {{3, 3}, {3, 2}, {2, 2}}, {{1, 
    1}, {1, 2}} -> {{3, 3}, {3, 2}, {2, 3}}, {{1, 1}, {1, 2}} -> {{3, 
    3}, {3, 2}, {3, 2}}, {{1, 1}, {1, 2}} -> {{3, 3}, {3, 3}, {1, 
    3}}, {{1, 1}, {1, 2}} -> {{3, 3}, {3, 3}, {2, 3}}, {{1, 1}, {1, 
    2}} -> {{3, 3}, {3, 3}, {3, 1}}, {{1, 1}, {1, 2}} -> {{3, 3}, {3, 
    3}, {3, 2}}, {{1, 1}, {1, 2}} -> {{3, 1}, {1, 2}, {2, 4}}, {{1, 
    1}, {1, 2}} -> {{3, 1}, {1, 3}, {3, 4}}, {{1, 1}, {1, 2}} -> {{3, 
    1}, {3, 1}, {1, 4}}, {{1, 1}, {1, 2}} -> {{3, 1}, {3, 1}, {2, 
    4}}, {{1, 1}, {1, 2}} -> {{3, 1}, {3, 1}, {3, 4}}, {{1, 1}, {1, 
    2}} -> {{3, 1}, {3, 2}, {1, 4}}, {{1, 1}, {1, 2}} -> {{3, 1}, {3, 
    2}, {3, 4}}, {{1, 1}, {1, 2}} -> {{3, 2}, {2, 1}, {1, 4}}, {{1, 
    1}, {1, 2}} -> {{3, 2}, {2, 3}, {3, 4}}, {{1, 1}, {1, 2}} -> {{3, 
    2}, {3, 1}, {2, 4}}, {{1, 1}, {1, 2}} -> {{3, 2}, {3, 2}, {1, 
    4}}, {{1, 1}, {1, 2}} -> {{3, 2}, {3, 2}, {2, 4}}, {{1, 1}, {1, 
    2}} -> {{3, 2}, {3, 2}, {3, 4}}, {{1, 1}, {1, 2}} -> {{3, 3}, {1, 
    3}, {1, 4}}, {{1, 1}, {1, 2}} -> {{3, 3}, {1, 3}, {2, 4}}, {{1, 
    1}, {1, 2}} -> {{3, 3}, {2, 3}, {1, 4}}, {{1, 1}, {1, 2}} -> {{3, 
    3}, {2, 3}, {2, 4}}, {{1, 1}, {1, 2}} -> {{3, 3}, {3, 1}, {1, 
    4}}, {{1, 1}, {1, 2}} -> {{3, 3}, {3, 1}, {2, 4}}, {{1, 1}, {1, 
    2}} -> {{3, 3}, {3, 1}, {3, 4}}, {{1, 1}, {1, 2}} -> {{3, 3}, {3, 
    2}, {1, 4}}, {{1, 1}, {1, 2}} -> {{3, 3}, {3, 2}, {2, 4}}, {{1, 
    1}, {1, 2}} -> {{3, 3}, {3, 2}, {3, 4}}, {{1, 1}, {1, 2}} -> {{3, 
    1}, {1, 2}, {4, 1}}, {{1, 1}, {1, 2}} -> {{3, 1}, {1, 2}, {4, 
    2}}, {{1, 1}, {1, 2}} -> {{3, 1}, {1, 3}, {4, 3}}, {{1, 1}, {1, 
    2}} -> {{3, 1}, {3, 1}, {4, 1}}, {{1, 1}, {1, 2}} -> {{3, 1}, {3, 
    1}, {4, 2}}, {{1, 1}, {1, 2}} -> {{3, 1}, {3, 1}, {4, 3}}, {{1, 
    1}, {1, 2}} -> {{3, 1}, {3, 2}, {4, 1}}, {{1, 1}, {1, 2}} -> {{3, 
    1}, {3, 2}, {4, 3}}, {{1, 1}, {1, 2}} -> {{3, 2}, {2, 1}, {4, 
    1}}, {{1, 1}, {1, 2}} -> {{3, 2}, {2, 1}, {4, 2}}, {{1, 1}, {1, 
    2}} -> {{3, 2}, {2, 3}, {4, 3}}, {{1, 1}, {1, 2}} -> {{3, 2}, {3, 
    1}, {4, 2}}, {{1, 1}, {1, 2}} -> {{3, 2}, {3, 2}, {4, 1}}, {{1, 
    1}, {1, 2}} -> {{3, 2}, {3, 2}, {4, 2}}, {{1, 1}, {1, 2}} -> {{3, 
    2}, {3, 2}, {4, 3}}, {{1, 1}, {1, 2}} -> {{3, 3}, {1, 3}, {4, 
    1}}, {{1, 1}, {1, 2}} -> {{3, 3}, {1, 3}, {4, 2}}, {{1, 1}, {1, 
    2}} -> {{3, 3}, {1, 3}, {4, 3}}, {{1, 1}, {1, 2}} -> {{3, 3}, {2, 
    3}, {4, 1}}, {{1, 1}, {1, 2}} -> {{3, 3}, {2, 3}, {4, 2}}, {{1, 
    1}, {1, 2}} -> {{3, 3}, {2, 3}, {4, 3}}, {{1, 1}, {1, 2}} -> {{3, 
    3}, {3, 1}, {4, 1}}, {{1, 1}, {1, 2}} -> {{3, 3}, {3, 1}, {4, 
    2}}, {{1, 1}, {1, 2}} -> {{3, 3}, {3, 1}, {4, 3}}, {{1, 1}, {1, 
    2}} -> {{3, 3}, {3, 2}, {4, 1}}, {{1, 1}, {1, 2}} -> {{3, 3}, {3, 
    2}, {4, 2}}, {{1, 1}, {1, 2}} -> {{3, 3}, {3, 2}, {4, 3}}, {{1, 
    1}, {1, 2}} -> {{3, 1}, {1, 4}, {2, 4}}, {{1, 1}, {1, 2}} -> {{3, 
    1}, {1, 4}, {4, 2}}, {{1, 1}, {1, 2}} -> {{3, 1}, {3, 4}, {1, 
    2}}, {{1, 1}, {1, 2}} -> {{3, 1}, {3, 4}, {1, 4}}, {{1, 1}, {1, 
    2}} -> {{3, 1}, {3, 4}, {2, 1}}, {{1, 1}, {1, 2}} -> {{3, 1}, {3, 
    4}, {2, 3}}, {{1, 1}, {1, 2}} -> {{3, 2}, {2, 4}, {1, 4}}, {{1, 
    1}, {1, 2}} -> {{3, 2}, {2, 4}, {4, 1}}, {{1, 1}, {1, 2}} -> {{3, 
    2}, {3, 4}, {1, 2}}, {{1, 1}, {1, 2}} -> {{3, 2}, {3, 4}, {1, 
    3}}, {{1, 1}, {1, 2}} -> {{3, 2}, {3, 4}, {2, 1}}, {{1, 1}, {1, 
    2}} -> {{3, 2}, {3, 4}, {2, 4}}, {{1, 1}, {1, 2}} -> {{3, 3}, {3, 
    4}, {1, 3}}, {{1, 1}, {1, 2}} -> {{3, 3}, {3, 4}, {1, 4}}, {{1, 
    1}, {1, 2}} -> {{3, 3}, {3, 4}, {2, 3}}, {{1, 1}, {1, 2}} -> {{3, 
    3}, {3, 4}, {2, 4}}, {{1, 1}, {1, 2}} -> {{3, 3}, {3, 4}, {4, 
    1}}, {{1, 1}, {1, 2}} -> {{3, 3}, {3, 4}, {4, 2}}, {{1, 1}, {1, 
    2}} -> {{3, 1}, {1, 4}, {2, 5}}, {{1, 1}, {1, 2}} -> {{3, 1}, {1, 
    4}, {4, 5}}, {{1, 1}, {1, 2}} -> {{3, 1}, {3, 4}, {1, 5}}, {{1, 
    1}, {1, 2}} -> {{3, 1}, {3, 4}, {2, 5}}, {{1, 1}, {1, 2}} -> {{3, 
    1}, {3, 4}, {3, 5}}, {{1, 1}, {1, 2}} -> {{3, 2}, {2, 4}, {1, 
    5}}, {{1, 1}, {1, 2}} -> {{3, 2}, {2, 4}, {4, 5}}, {{1, 1}, {1, 
    2}} -> {{3, 2}, {3, 4}, {1, 5}}, {{1, 1}, {1, 2}} -> {{3, 2}, {3, 
    4}, {2, 5}}, {{1, 1}, {1, 2}} -> {{3, 2}, {3, 4}, {3, 5}}, {{1, 
    1}, {1, 2}} -> {{3, 1}, {1, 4}, {5, 1}}, {{1, 1}, {1, 2}} -> {{3, 
    1}, {1, 4}, {5, 2}}, {{1, 1}, {1, 2}} -> {{3, 1}, {1, 4}, {5, 
    4}}, {{1, 1}, {1, 2}} -> {{3, 1}, {3, 4}, {5, 1}}, {{1, 1}, {1, 
    2}} -> {{3, 1}, {3, 4}, {5, 2}}, {{1, 1}, {1, 2}} -> {{3, 1}, {3, 
    4}, {5, 3}}, {{1, 1}, {1, 2}} -> {{3, 2}, {2, 4}, {5, 1}}, {{1, 
    1}, {1, 2}} -> {{3, 2}, {2, 4}, {5, 2}}, {{1, 1}, {1, 2}} -> {{3, 
    2}, {2, 4}, {5, 4}}, {{1, 1}, {1, 2}} -> {{3, 2}, {3, 4}, {5, 
    1}}, {{1, 1}, {1, 2}} -> {{3, 2}, {3, 4}, {5, 2}}, {{1, 1}, {1, 
    2}} -> {{3, 2}, {3, 4}, {5, 3}}, {{1, 1}, {1, 2}} -> {{3, 3}, {4, 
    3}, {1, 4}}, {{1, 1}, {1, 2}} -> {{3, 3}, {4, 3}, {2, 4}}, {{1, 
    1}, {1, 2}} -> {{3, 3}, {4, 3}, {4, 1}}, {{1, 1}, {1, 2}} -> {{3, 
    3}, {4, 3}, {4, 2}}, {{1, 1}, {1, 2}} -> {{3, 1}, {4, 1}, {2, 
    5}}, {{1, 1}, {1, 2}} -> {{3, 2}, {4, 2}, {1, 5}}, {{1, 1}, {1, 
    2}} -> {{3, 1}, {4, 1}, {5, 1}}, {{1, 1}, {1, 2}} -> {{3, 1}, {4, 
    1}, {5, 2}}, {{1, 1}, {1, 2}} -> {{3, 2}, {4, 2}, {5, 1}}, {{1, 
    1}, {1, 2}} -> {{3, 2}, {4, 2}, {5, 2}}, {{1, 1}, {1, 2}} -> {{3, 
    4}, {3, 1}, {2, 4}}, {{1, 1}, {1, 2}} -> {{3, 4}, {3, 1}, {4, 
    1}}, {{1, 1}, {1, 2}} -> {{3, 4}, {3, 1}, {4, 2}}, {{1, 1}, {1, 
    2}} -> {{3, 4}, {3, 2}, {1, 4}}, {{1, 1}, {1, 2}} -> {{3, 4}, {3, 
    2}, {4, 1}}, {{1, 1}, {1, 2}} -> {{3, 4}, {3, 2}, {4, 2}}, {{1, 
    1}, {1, 2}} -> {{3, 4}, {3, 4}, {1, 3}}, {{1, 1}, {1, 2}} -> {{3, 
    4}, {3, 4}, {1, 4}}, {{1, 1}, {1, 2}} -> {{3, 4}, {3, 4}, {2, 
    3}}, {{1, 1}, {1, 2}} -> {{3, 4}, {3, 4}, {2, 4}}, {{1, 1}, {1, 
    2}} -> {{3, 4}, {3, 4}, {3, 1}}, {{1, 1}, {1, 2}} -> {{3, 4}, {3, 
    4}, {3, 2}}, {{1, 1}, {1, 2}} -> {{3, 4}, {3, 4}, {4, 1}}, {{1, 
    1}, {1, 2}} -> {{3, 4}, {3, 4}, {4, 2}}, {{1, 1}, {1, 2}} -> {{3, 
    4}, {4, 1}, {1, 2}}, {{1, 1}, {1, 2}} -> {{3, 4}, {4, 1}, {2, 
    1}}, {{1, 1}, {1, 2}} -> {{3, 4}, {4, 2}, {1, 2}}, {{1, 1}, {1, 
    2}} -> {{3, 4}, {4, 2}, {2, 1}}, {{1, 1}, {1, 2}} -> {{3, 4}, {4, 
    3}, {1, 3}}, {{1, 1}, {1, 2}} -> {{3, 4}, {4, 3}, {2, 3}}, {{1, 
    1}, {1, 2}} -> {{3, 4}, {4, 3}, {3, 1}}, {{1, 1}, {1, 2}} -> {{3, 
    4}, {4, 3}, {3, 2}}, {{1, 1}, {1, 2}} -> {{3, 4}, {3, 1}, {4, 
    5}}, {{1, 1}, {1, 2}} -> {{3, 4}, {3, 2}, {4, 5}}, {{1, 1}, {1, 
    2}} -> {{3, 4}, {4, 1}, {1, 5}}, {{1, 1}, {1, 2}} -> {{3, 4}, {4, 
    1}, {2, 5}}, {{1, 1}, {1, 2}} -> {{3, 4}, {4, 2}, {1, 5}}, {{1, 
    1}, {1, 2}} -> {{3, 4}, {4, 2}, {2, 5}}, {{1, 1}, {1, 2}} -> {{3, 
    4}, {3, 1}, {5, 4}}, {{1, 1}, {1, 2}} -> {{3, 4}, {3, 2}, {5, 
    4}}, {{1, 1}, {1, 2}} -> {{3, 4}, {4, 1}, {5, 1}}, {{1, 1}, {1, 
    2}} -> {{3, 4}, {4, 1}, {5, 2}}, {{1, 1}, {1, 2}} -> {{3, 4}, {4, 
    1}, {5, 4}}, {{1, 1}, {1, 2}} -> {{3, 4}, {4, 2}, {5, 1}}, {{1, 
    1}, {1, 2}} -> {{3, 4}, {4, 2}, {5, 2}}, {{1, 1}, {1, 2}} -> {{3, 
    4}, {4, 2}, {5, 4}}, {{1, 1}, {1, 2}} -> {{3, 4}, {3, 5}, {1, 
    3}}, {{1, 1}, {1, 2}} -> {{3, 4}, {3, 5}, {1, 4}}, {{1, 1}, {1, 
    2}} -> {{3, 4}, {3, 5}, {2, 3}}, {{1, 1}, {1, 2}} -> {{3, 4}, {3, 
    5}, {2, 4}}, {{1, 1}, {1, 2}} -> {{3, 4}, {3, 5}, {4, 1}}, {{1, 
    1}, {1, 2}} -> {{3, 4}, {3, 5}, {4, 2}}, {{1, 1}, {1, 2}} -> {{3, 
    4}, {4, 5}, {1, 5}}, {{1, 1}, {1, 2}} -> {{3, 4}, {4, 5}, {2, 
    5}}, {{1, 1}, {1, 2}} -> {{3, 4}, {4, 5}, {5, 1}}, {{1, 1}, {1, 
    2}} -> {{3, 4}, {4, 5}, {5, 2}}, {{1, 1}, {2, 1}} -> {{1, 1}, {1, 
    1}, {1, 1}}, {{1, 1}, {2, 1}} -> {{1, 1}, {1, 1}, {1, 2}}, {{1, 
    1}, {2, 1}} -> {{1, 1}, {1, 1}, {2, 1}}, {{1, 1}, {2, 1}} -> {{1, 
    1}, {1, 1}, {2, 2}}, {{1, 1}, {2, 1}} -> {{1, 1}, {1, 2}, {1, 
    2}}, {{1, 1}, {2, 1}} -> {{1, 1}, {1, 2}, {2, 1}}, {{1, 1}, {2, 
    1}} -> {{1, 1}, {1, 2}, {2, 2}}, {{1, 1}, {2, 1}} -> {{1, 1}, {2, 
    1}, {2, 1}}, {{1, 1}, {2, 1}} -> {{1, 2}, {1, 2}, {1, 2}}, {{1, 
    1}, {2, 1}} -> {{1, 2}, {1, 2}, {2, 1}}, {{1, 1}, {2, 1}} -> {{2, 
    1}, {2, 1}, {1, 2}}, {{1, 1}, {2, 1}} -> {{2, 1}, {2, 1}, {2, 
    1}}, {{1, 1}, {2, 1}} -> {{2, 2}, {1, 2}, {1, 2}}, {{1, 1}, {2, 
    1}} -> {{2, 2}, {2, 1}, {1, 1}}, {{1, 1}, {2, 1}} -> {{2, 2}, {2, 
    1}, {1, 2}}, {{1, 1}, {2, 1}} -> {{2, 2}, {2, 1}, {2, 1}}, {{1, 
    1}, {2, 1}} -> {{2, 2}, {2, 2}, {1, 1}}, {{1, 1}, {2, 1}} -> {{2, 
    2}, {2, 2}, {1, 2}}, {{1, 1}, {2, 1}} -> {{2, 2}, {2, 2}, {2, 
    1}}, {{1, 1}, {2, 1}} -> {{2, 2}, {2, 2}, {2, 2}}, {{1, 1}, {2, 
    1}} -> {{1, 1}, {1, 1}, {1, 3}}, {{1, 1}, {2, 1}} -> {{1, 1}, {1, 
    1}, {2, 3}}, {{1, 1}, {2, 1}} -> {{1, 1}, {1, 2}, {1, 3}}, {{1, 
    1}, {2, 1}} -> {{1, 1}, {1, 2}, {2, 3}}, {{1, 1}, {2, 1}} -> {{1, 
    1}, {2, 1}, {2, 3}}, {{1, 1}, {2, 1}} -> {{1, 2}, {1, 2}, {1, 
    3}}, {{1, 1}, {2, 1}} -> {{1, 2}, {1, 2}, {2, 3}}, {{1, 1}, {2, 
    1}} -> {{1, 2}, {2, 1}, {1, 3}}, {{1, 1}, {2, 1}} -> {{2, 1}, {1, 
    2}, {2, 3}}, {{1, 1}, {2, 1}} -> {{2, 1}, {2, 1}, {1, 3}}, {{1, 
    1}, {2, 1}} -> {{2, 1}, {2, 1}, {2, 3}}, {{1, 1}, {2, 1}} -> {{2, 
    2}, {1, 2}, {1, 3}}, {{1, 1}, {2, 1}} -> {{2, 2}, {2, 1}, {1, 
    3}}, {{1, 1}, {2, 1}} -> {{2, 2}, {2, 1}, {2, 3}}, {{1, 1}, {2, 
    1}} -> {{2, 2}, {2, 2}, {1, 3}}, {{1, 1}, {2, 1}} -> {{2, 2}, {2, 
    2}, {2, 3}}, {{1, 1}, {2, 1}} -> {{1, 1}, {1, 1}, {3, 1}}, {{1, 
    1}, {2, 1}} -> {{1, 1}, {1, 1}, {3, 2}}, {{1, 1}, {2, 1}} -> {{1, 
    1}, {1, 2}, {3, 1}}, {{1, 1}, {2, 1}} -> {{1, 1}, {1, 2}, {3, 
    2}}, {{1, 1}, {2, 1}} -> {{1, 1}, {2, 1}, {3, 1}}, {{1, 1}, {2, 
    1}} -> {{1, 1}, {2, 1}, {3, 2}}, {{1, 1}, {2, 1}} -> {{1, 2}, {1, 
    2}, {3, 1}}, {{1, 1}, {2, 1}} -> {{1, 2}, {1, 2}, {3, 2}}, {{1, 
    1}, {2, 1}} -> {{1, 2}, {2, 1}, {3, 1}}, {{1, 1}, {2, 1}} -> {{2, 
    1}, {1, 2}, {3, 2}}, {{1, 1}, {2, 1}} -> {{2, 1}, {2, 1}, {3, 
    1}}, {{1, 1}, {2, 1}} -> {{2, 1}, {2, 1}, {3, 2}}, {{1, 1}, {2, 
    1}} -> {{2, 2}, {1, 2}, {3, 1}}, {{1, 1}, {2, 1}} -> {{2, 2}, {1, 
    2}, {3, 2}}, {{1, 1}, {2, 1}} -> {{2, 2}, {2, 1}, {3, 1}}, {{1, 
    1}, {2, 1}} -> {{2, 2}, {2, 1}, {3, 2}}, {{1, 1}, {2, 1}} -> {{2, 
    2}, {2, 2}, {3, 1}}, {{1, 1}, {2, 1}} -> {{2, 2}, {2, 2}, {3, 
    2}}, {{1, 1}, {2, 1}} -> {{1, 1}, {1, 3}, {1, 3}}, {{1, 1}, {2, 
    1}} -> {{1, 1}, {1, 3}, {2, 1}}, {{1, 1}, {2, 1}} -> {{1, 1}, {1, 
    3}, {2, 2}}, {{1, 1}, {2, 1}} -> {{1, 1}, {1, 3}, {2, 3}}, {{1, 
    1}, {2, 1}} -> {{1, 1}, {1, 3}, {3, 1}}, {{1, 1}, {2, 1}} -> {{1, 
    1}, {1, 3}, {3, 2}}, {{1, 1}, {2, 1}} -> {{1, 1}, {1, 3}, {3, 
    3}}, {{1, 1}, {2, 1}} -> {{1, 1}, {2, 3}, {2, 3}}, {{1, 1}, {2, 
    1}} -> {{1, 1}, {2, 3}, {3, 2}}, {{1, 1}, {2, 1}} -> {{1, 2}, {1, 
    3}, {2, 3}}, {{1, 1}, {2, 1}} -> {{1, 2}, {2, 3}, {3, 1}}, {{1, 
    1}, {2, 1}} -> {{2, 1}, {2, 3}, {1, 3}}, {{1, 1}, {2, 1}} -> {{2, 
    2}, {1, 3}, {1, 3}}, {{1, 1}, {2, 1}} -> {{2, 2}, {1, 3}, {3, 
    1}}, {{1, 1}, {2, 1}} -> {{2, 2}, {2, 3}, {1, 1}}, {{1, 1}, {2, 
    1}} -> {{2, 2}, {2, 3}, {1, 2}}, {{1, 1}, {2, 1}} -> {{2, 2}, {2, 
    3}, {1, 3}}, {{1, 1}, {2, 1}} -> {{2, 2}, {2, 3}, {2, 3}}, {{1, 
    1}, {2, 1}} -> {{2, 2}, {2, 3}, {3, 1}}, {{1, 1}, {2, 1}} -> {{2, 
    2}, {2, 3}, {3, 2}}, {{1, 1}, {2, 1}} -> {{2, 2}, {2, 3}, {3, 
    3}}, {{1, 1}, {2, 1}} -> {{1, 1}, {1, 3}, {1, 4}}, {{1, 1}, {2, 
    1}} -> {{1, 1}, {1, 3}, {2, 4}}, {{1, 1}, {2, 1}} -> {{1, 1}, {1, 
    3}, {3, 4}}, {{1, 1}, {2, 1}} -> {{1, 1}, {2, 3}, {2, 4}}, {{1, 
    1}, {2, 1}} -> {{1, 1}, {2, 3}, {3, 4}}, {{1, 1}, {2, 1}} -> {{1, 
    2}, {1, 3}, {1, 4}}, {{1, 1}, {2, 1}} -> {{1, 2}, {1, 3}, {2, 
    4}}, {{1, 1}, {2, 1}} -> {{1, 2}, {2, 3}, {3, 4}}, {{1, 1}, {2, 
    1}} -> {{2, 1}, {1, 3}, {3, 4}}, {{1, 1}, {2, 1}} -> {{2, 1}, {2, 
    3}, {1, 4}}, {{1, 1}, {2, 1}} -> {{2, 1}, {2, 3}, {2, 4}}, {{1, 
    1}, {2, 1}} -> {{2, 2}, {1, 3}, {1, 4}}, {{1, 1}, {2, 1}} -> {{2, 
    2}, {1, 3}, {3, 4}}, {{1, 1}, {2, 1}} -> {{2, 2}, {2, 3}, {1, 
    4}}, {{1, 1}, {2, 1}} -> {{2, 2}, {2, 3}, {2, 4}}, {{1, 1}, {2, 
    1}} -> {{2, 2}, {2, 3}, {3, 4}}, {{1, 1}, {2, 1}} -> {{1, 1}, {1, 
    3}, {4, 1}}, {{1, 1}, {2, 1}} -> {{1, 1}, {1, 3}, {4, 2}}, {{1, 
    1}, {2, 1}} -> {{1, 1}, {1, 3}, {4, 3}}, {{1, 1}, {2, 1}} -> {{1, 
    1}, {2, 3}, {4, 3}}, {{1, 1}, {2, 1}} -> {{1, 2}, {1, 3}, {4, 
    1}}, {{1, 1}, {2, 1}} -> {{1, 2}, {1, 3}, {4, 2}}, {{1, 1}, {2, 
    1}} -> {{1, 2}, {2, 3}, {4, 2}}, {{1, 1}, {2, 1}} -> {{1, 2}, {2, 
    3}, {4, 3}}, {{1, 1}, {2, 1}} -> {{2, 1}, {1, 3}, {4, 1}}, {{1, 
    1}, {2, 1}} -> {{2, 1}, {1, 3}, {4, 3}}, {{1, 1}, {2, 1}} -> {{2, 
    1}, {2, 3}, {4, 1}}, {{1, 1}, {2, 1}} -> {{2, 1}, {2, 3}, {4, 
    2}}, {{1, 1}, {2, 1}} -> {{2, 2}, {1, 3}, {4, 3}}, {{1, 1}, {2, 
    1}} -> {{2, 2}, {2, 3}, {4, 1}}, {{1, 1}, {2, 1}} -> {{2, 2}, {2, 
    3}, {4, 2}}, {{1, 1}, {2, 1}} -> {{2, 2}, {2, 3}, {4, 3}}, {{1, 
    1}, {2, 1}} -> {{1, 1}, {3, 1}, {2, 2}}, {{1, 1}, {2, 1}} -> {{1, 
    1}, {3, 1}, {2, 3}}, {{1, 1}, {2, 1}} -> {{1, 1}, {3, 1}, {3, 
    1}}, {{1, 1}, {2, 1}} -> {{1, 1}, {3, 1}, {3, 2}}, {{1, 1}, {2, 
    1}} -> {{1, 1}, {3, 2}, {3, 2}}, {{1, 1}, {2, 1}} -> {{2, 2}, {3, 
    1}, {3, 1}}, {{1, 1}, {2, 1}} -> {{2, 2}, {3, 2}, {1, 1}}, {{1, 
    1}, {2, 1}} -> {{2, 2}, {3, 2}, {1, 3}}, {{1, 1}, {2, 1}} -> {{2, 
    2}, {3, 2}, {3, 1}}, {{1, 1}, {2, 1}} -> {{2, 2}, {3, 2}, {3, 
    2}}, {{1, 1}, {2, 1}} -> {{1, 1}, {3, 1}, {2, 4}}, {{1, 1}, {2, 
    1}} -> {{1, 1}, {3, 1}, {3, 4}}, {{1, 1}, {2, 1}} -> {{1, 1}, {3, 
    2}, {2, 4}}, {{1, 1}, {2, 1}} -> {{1, 1}, {3, 2}, {3, 4}}, {{1, 
    1}, {2, 1}} -> {{2, 2}, {3, 1}, {1, 4}}, {{1, 1}, {2, 1}} -> {{2, 
    2}, {3, 1}, {3, 4}}, {{1, 1}, {2, 1}} -> {{2, 2}, {3, 2}, {1, 
    4}}, {{1, 1}, {2, 1}} -> {{2, 2}, {3, 2}, {3, 4}}, {{1, 1}, {2, 
    1}} -> {{1, 1}, {3, 1}, {4, 1}}, {{1, 1}, {2, 1}} -> {{1, 1}, {3, 
    1}, {4, 2}}, {{1, 1}, {2, 1}} -> {{1, 1}, {3, 1}, {4, 3}}, {{1, 
    1}, {2, 1}} -> {{1, 1}, {3, 2}, {4, 2}}, {{1, 1}, {2, 1}} -> {{1, 
    2}, {3, 2}, {4, 2}}, {{1, 1}, {2, 1}} -> {{2, 1}, {3, 1}, {4, 
    1}}, {{1, 1}, {2, 1}} -> {{2, 2}, {3, 1}, {4, 1}}, {{1, 1}, {2, 
    1}} -> {{2, 2}, {3, 2}, {4, 1}}, {{1, 1}, {2, 1}} -> {{2, 2}, {3, 
    2}, {4, 2}}, {{1, 1}, {2, 1}} -> {{2, 2}, {3, 2}, {4, 3}}, {{1, 
    1}, {2, 1}} -> {{1, 1}, {3, 4}, {4, 2}}, {{1, 1}, {2, 1}} -> {{2, 
    2}, {3, 4}, {4, 1}}, {{1, 1}, {2, 1}} -> {{1, 3}, {1, 2}, {3, 
    2}}, {{1, 1}, {2, 1}} -> {{1, 3}, {1, 3}, {1, 2}}, {{1, 1}, {2, 
    1}} -> {{1, 3}, {1, 3}, {1, 3}}, {{1, 1}, {2, 1}} -> {{1, 3}, {1, 
    3}, {2, 1}}, {{1, 1}, {2, 1}} -> {{1, 3}, {1, 3}, {2, 3}}, {{1, 
    1}, {2, 1}} -> {{1, 3}, {1, 3}, {3, 1}}, {{1, 1}, {2, 1}} -> {{1, 
    3}, {1, 3}, {3, 2}}, {{1, 1}, {2, 1}} -> {{1, 3}, {3, 1}, {1, 
    2}}, {{1, 1}, {2, 1}} -> {{1, 3}, {3, 1}, {2, 1}}, {{1, 1}, {2, 
    1}} -> {{1, 3}, {3, 2}, {2, 1}}, {{1, 1}, {2, 1}} -> {{2, 3}, {2, 
    1}, {3, 1}}, {{1, 1}, {2, 1}} -> {{2, 3}, {2, 3}, {1, 2}}, {{1, 
    1}, {2, 1}} -> {{2, 3}, {2, 3}, {1, 3}}, {{1, 1}, {2, 1}} -> {{2, 
    3}, {2, 3}, {2, 1}}, {{1, 1}, {2, 1}} -> {{2, 3}, {2, 3}, {2, 
    3}}, {{1, 1}, {2, 1}} -> {{2, 3}, {2, 3}, {3, 1}}, {{1, 1}, {2, 
    1}} -> {{2, 3}, {2, 3}, {3, 2}}, {{1, 1}, {2, 1}} -> {{2, 3}, {3, 
    2}, {1, 2}}, {{1, 1}, {2, 1}} -> {{2, 3}, {3, 2}, {2, 1}}, {{1, 
    1}, {2, 1}} -> {{1, 3}, {1, 2}, {3, 4}}, {{1, 1}, {2, 1}} -> {{1, 
    3}, {1, 3}, {1, 4}}, {{1, 1}, {2, 1}} -> {{1, 3}, {1, 3}, {2, 
    4}}, {{1, 1}, {2, 1}} -> {{1, 3}, {1, 3}, {3, 4}}, {{1, 1}, {2, 
    1}} -> {{1, 3}, {3, 1}, {1, 4}}, {{1, 1}, {2, 1}} -> {{1, 3}, {3, 
    1}, {2, 4}}, {{1, 1}, {2, 1}} -> {{1, 3}, {3, 2}, {2, 4}}, {{1, 
    1}, {2, 1}} -> {{2, 3}, {2, 1}, {3, 4}}, {{1, 1}, {2, 1}} -> {{2, 
    3}, {2, 3}, {1, 4}}, {{1, 1}, {2, 1}} -> {{2, 3}, {2, 3}, {2, 
    4}}, {{1, 1}, {2, 1}} -> {{2, 3}, {2, 3}, {3, 4}}, {{1, 1}, {2, 
    1}} -> {{2, 3}, {3, 1}, {1, 4}}, {{1, 1}, {2, 1}} -> {{2, 3}, {3, 
    2}, {1, 4}}, {{1, 1}, {2, 1}} -> {{2, 3}, {3, 2}, {2, 4}}, {{1, 
    1}, {2, 1}} -> {{1, 3}, {1, 2}, {4, 3}}, {{1, 1}, {2, 1}} -> {{1, 
    3}, {1, 3}, {4, 1}}, {{1, 1}, {2, 1}} -> {{1, 3}, {1, 3}, {4, 
    2}}, {{1, 1}, {2, 1}} -> {{1, 3}, {1, 3}, {4, 3}}, {{1, 1}, {2, 
    1}} -> {{1, 3}, {2, 3}, {4, 3}}, {{1, 1}, {2, 1}} -> {{1, 3}, {3, 
    1}, {4, 1}}, {{1, 1}, {2, 1}} -> {{1, 3}, {3, 1}, {4, 2}}, {{1, 
    1}, {2, 1}} -> {{1, 3}, {3, 2}, {4, 2}}, {{1, 1}, {2, 1}} -> {{1, 
    3}, {3, 2}, {4, 3}}, {{1, 1}, {2, 1}} -> {{2, 3}, {2, 1}, {4, 
    3}}, {{1, 1}, {2, 1}} -> {{2, 3}, {2, 3}, {4, 1}}, {{1, 1}, {2, 
    1}} -> {{2, 3}, {2, 3}, {4, 2}}, {{1, 1}, {2, 1}} -> {{2, 3}, {2, 
    3}, {4, 3}}, {{1, 1}, {2, 1}} -> {{2, 3}, {3, 1}, {4, 1}}, {{1, 
    1}, {2, 1}} -> {{2, 3}, {3, 1}, {4, 3}}, {{1, 1}, {2, 1}} -> {{2, 
    3}, {3, 2}, {4, 1}}, {{1, 1}, {2, 1}} -> {{2, 3}, {3, 2}, {4, 
    2}}, {{1, 1}, {2, 1}} -> {{1, 3}, {1, 4}, {2, 1}}, {{1, 1}, {2, 
    1}} -> {{1, 3}, {1, 4}, {2, 3}}, {{1, 1}, {2, 1}} -> {{1, 3}, {1, 
    4}, {3, 2}}, {{1, 1}, {2, 1}} -> {{1, 3}, {1, 4}, {3, 4}}, {{1, 
    1}, {2, 1}} -> {{1, 3}, {3, 4}, {2, 3}}, {{1, 1}, {2, 1}} -> {{1, 
    3}, {3, 4}, {2, 4}}, {{1, 1}, {2, 1}} -> {{1, 3}, {3, 4}, {4, 
    1}}, {{1, 1}, {2, 1}} -> {{1, 3}, {3, 4}, {4, 2}}, {{1, 1}, {2, 
    1}} -> {{2, 3}, {2, 4}, {1, 2}}, {{1, 1}, {2, 1}} -> {{2, 3}, {2, 
    4}, {1, 3}}, {{1, 1}, {2, 1}} -> {{2, 3}, {2, 4}, {3, 1}}, {{1, 
    1}, {2, 1}} -> {{2, 3}, {2, 4}, {3, 4}}, {{1, 1}, {2, 1}} -> {{2, 
    3}, {3, 4}, {1, 4}}, {{1, 1}, {2, 1}} -> {{2, 3}, {3, 4}, {4, 
    1}}, {{1, 1}, {2, 1}} -> {{2, 3}, {3, 4}, {4, 2}}, {{1, 1}, {2, 
    1}} -> {{1, 3}, {1, 4}, {1, 5}}, {{1, 1}, {2, 1}} -> {{1, 3}, {1, 
    4}, {2, 5}}, {{1, 1}, {2, 1}} -> {{1, 3}, {1, 4}, {3, 5}}, {{1, 
    1}, {2, 1}} -> {{1, 3}, {3, 4}, {2, 5}}, {{1, 1}, {2, 1}} -> {{1, 
    3}, {3, 4}, {4, 5}}, {{1, 1}, {2, 1}} -> {{2, 3}, {2, 4}, {1, 
    5}}, {{1, 1}, {2, 1}} -> {{2, 3}, {2, 4}, {2, 5}}, {{1, 1}, {2, 
    1}} -> {{2, 3}, {2, 4}, {3, 5}}, {{1, 1}, {2, 1}} -> {{2, 3}, {3, 
    4}, {1, 5}}, {{1, 1}, {2, 1}} -> {{2, 3}, {3, 4}, {4, 5}}, {{1, 
    1}, {2, 1}} -> {{1, 3}, {1, 4}, {5, 1}}, {{1, 1}, {2, 1}} -> {{1, 
    3}, {1, 4}, {5, 2}}, {{1, 1}, {2, 1}} -> {{1, 3}, {1, 4}, {5, 
    3}}, {{1, 1}, {2, 1}} -> {{1, 3}, {3, 4}, {5, 2}}, {{1, 1}, {2, 
    1}} -> {{1, 3}, {3, 4}, {5, 3}}, {{1, 1}, {2, 1}} -> {{1, 3}, {3, 
    4}, {5, 4}}, {{1, 1}, {2, 1}} -> {{2, 3}, {2, 4}, {5, 1}}, {{1, 
    1}, {2, 1}} -> {{2, 3}, {2, 4}, {5, 2}}, {{1, 1}, {2, 1}} -> {{2, 
    3}, {2, 4}, {5, 3}}, {{1, 1}, {2, 1}} -> {{2, 3}, {3, 4}, {5, 
    1}}, {{1, 1}, {2, 1}} -> {{2, 3}, {3, 4}, {5, 3}}, {{1, 1}, {2, 
    1}} -> {{2, 3}, {3, 4}, {5, 4}}, {{1, 1}, {2, 1}} -> {{1, 3}, {4, 
    3}, {2, 5}}, {{1, 1}, {2, 1}} -> {{2, 3}, {4, 3}, {1, 5}}, {{1, 
    1}, {2, 1}} -> {{1, 3}, {4, 3}, {5, 2}}, {{1, 1}, {2, 1}} -> {{1, 
    3}, {4, 3}, {5, 3}}, {{1, 1}, {2, 1}} -> {{2, 3}, {4, 3}, {5, 
    1}}, {{1, 1}, {2, 1}} -> {{2, 3}, {4, 3}, {5, 3}}, {{1, 1}, {2, 
    1}} -> {{3, 1}, {1, 3}, {2, 3}}, {{1, 1}, {2, 1}} -> {{3, 1}, {1, 
    3}, {3, 2}}, {{1, 1}, {2, 1}} -> {{3, 1}, {3, 1}, {1, 2}}, {{1, 
    1}, {2, 1}} -> {{3, 1}, {3, 1}, {1, 3}}, {{1, 1}, {2, 1}} -> {{3, 
    1}, {3, 1}, {2, 1}}, {{1, 1}, {2, 1}} -> {{3, 1}, {3, 1}, {2, 
    3}}, {{1, 1}, {2, 1}} -> {{3, 1}, {3, 1}, {3, 1}}, {{1, 1}, {2, 
    1}} -> {{3, 1}, {3, 1}, {3, 2}}, {{1, 1}, {2, 1}} -> {{3, 1}, {3, 
    2}, {1, 2}}, {{1, 1}, {2, 1}} -> {{3, 2}, {2, 3}, {1, 3}}, {{1, 
    1}, {2, 1}} -> {{3, 2}, {2, 3}, {3, 1}}, {{1, 1}, {2, 1}} -> {{3, 
    2}, {3, 1}, {2, 1}}, {{1, 1}, {2, 1}} -> {{3, 2}, {3, 2}, {1, 
    2}}, {{1, 1}, {2, 1}} -> {{3, 2}, {3, 2}, {1, 3}}, {{1, 1}, {2, 
    1}} -> {{3, 2}, {3, 2}, {2, 1}}, {{1, 1}, {2, 1}} -> {{3, 2}, {3, 
    2}, {2, 3}}, {{1, 1}, {2, 1}} -> {{3, 2}, {3, 2}, {3, 1}}, {{1, 
    1}, {2, 1}} -> {{3, 2}, {3, 2}, {3, 2}}, {{1, 1}, {2, 1}} -> {{3, 
    3}, {1, 3}, {1, 2}}, {{1, 1}, {2, 1}} -> {{3, 3}, {1, 3}, {1, 
    3}}, {{1, 1}, {2, 1}} -> {{3, 3}, {1, 3}, {2, 1}}, {{1, 1}, {2, 
    1}} -> {{3, 3}, {1, 3}, {2, 2}}, {{1, 1}, {2, 1}} -> {{3, 3}, {1, 
    3}, {2, 3}}, {{1, 1}, {2, 1}} -> {{3, 3}, {2, 3}, {1, 1}}, {{1, 
    1}, {2, 1}} -> {{3, 3}, {2, 3}, {1, 2}}, {{1, 1}, {2, 1}} -> {{3, 
    3}, {2, 3}, {2, 1}}, {{1, 1}, {2, 1}} -> {{3, 3}, {2, 3}, {2, 
    3}}, {{1, 1}, {2, 1}} -> {{3, 3}, {3, 1}, {1, 1}}, {{1, 1}, {2, 
    1}} -> {{3, 3}, {3, 1}, {1, 2}}, {{1, 1}, {2, 1}} -> {{3, 3}, {3, 
    1}, {1, 3}}, {{1, 1}, {2, 1}} -> {{3, 3}, {3, 1}, {2, 1}}, {{1, 
    1}, {2, 1}} -> {{3, 3}, {3, 1}, {2, 2}}, {{1, 1}, {2, 1}} -> {{3, 
    3}, {3, 1}, {2, 3}}, {{1, 1}, {2, 1}} -> {{3, 3}, {3, 1}, {3, 
    1}}, {{1, 1}, {2, 1}} -> {{3, 3}, {3, 1}, {3, 2}}, {{1, 1}, {2, 
    1}} -> {{3, 3}, {3, 2}, {1, 1}}, {{1, 1}, {2, 1}} -> {{3, 3}, {3, 
    2}, {1, 2}}, {{1, 1}, {2, 1}} -> {{3, 3}, {3, 2}, {1, 3}}, {{1, 
    1}, {2, 1}} -> {{3, 3}, {3, 2}, {2, 1}}, {{1, 1}, {2, 1}} -> {{3, 
    3}, {3, 2}, {2, 2}}, {{1, 1}, {2, 1}} -> {{3, 3}, {3, 2}, {2, 
    3}}, {{1, 1}, {2, 1}} -> {{3, 3}, {3, 2}, {3, 2}}, {{1, 1}, {2, 
    1}} -> {{3, 3}, {3, 3}, {1, 3}}, {{1, 1}, {2, 1}} -> {{3, 3}, {3, 
    3}, {2, 3}}, {{1, 1}, {2, 1}} -> {{3, 3}, {3, 3}, {3, 1}}, {{1, 
    1}, {2, 1}} -> {{3, 3}, {3, 3}, {3, 2}}, {{1, 1}, {2, 1}} -> {{3, 
    1}, {1, 2}, {2, 4}}, {{1, 1}, {2, 1}} -> {{3, 1}, {1, 3}, {3, 
    4}}, {{1, 1}, {2, 1}} -> {{3, 1}, {3, 1}, {1, 4}}, {{1, 1}, {2, 
    1}} -> {{3, 1}, {3, 1}, {2, 4}}, {{1, 1}, {2, 1}} -> {{3, 1}, {3, 
    1}, {3, 4}}, {{1, 1}, {2, 1}} -> {{3, 1}, {3, 2}, {1, 4}}, {{1, 
    1}, {2, 1}} -> {{3, 1}, {3, 2}, {3, 4}}, {{1, 1}, {2, 1}} -> {{3, 
    2}, {2, 1}, {1, 4}}, {{1, 1}, {2, 1}} -> {{3, 2}, {2, 3}, {3, 
    4}}, {{1, 1}, {2, 1}} -> {{3, 2}, {3, 1}, {2, 4}}, {{1, 1}, {2, 
    1}} -> {{3, 2}, {3, 2}, {1, 4}}, {{1, 1}, {2, 1}} -> {{3, 2}, {3, 
    2}, {2, 4}}, {{1, 1}, {2, 1}} -> {{3, 2}, {3, 2}, {3, 4}}, {{1, 
    1}, {2, 1}} -> {{3, 3}, {1, 3}, {1, 4}}, {{1, 1}, {2, 1}} -> {{3, 
    3}, {1, 3}, {2, 4}}, {{1, 1}, {2, 1}} -> {{3, 3}, {2, 3}, {1, 
    4}}, {{1, 1}, {2, 1}} -> {{3, 3}, {2, 3}, {2, 4}}, {{1, 1}, {2, 
    1}} -> {{3, 3}, {3, 1}, {1, 4}}, {{1, 1}, {2, 1}} -> {{3, 3}, {3, 
    1}, {2, 4}}, {{1, 1}, {2, 1}} -> {{3, 3}, {3, 1}, {3, 4}}, {{1, 
    1}, {2, 1}} -> {{3, 3}, {3, 2}, {1, 4}}, {{1, 1}, {2, 1}} -> {{3, 
    3}, {3, 2}, {2, 4}}, {{1, 1}, {2, 1}} -> {{3, 3}, {3, 2}, {3, 
    4}}, {{1, 1}, {2, 1}} -> {{3, 1}, {1, 2}, {4, 1}}, {{1, 1}, {2, 
    1}} -> {{3, 1}, {1, 2}, {4, 2}}, {{1, 1}, {2, 1}} -> {{3, 1}, {1, 
    3}, {4, 3}}, {{1, 1}, {2, 1}} -> {{3, 1}, {3, 1}, {4, 1}}, {{1, 
    1}, {2, 1}} -> {{3, 1}, {3, 1}, {4, 2}}, {{1, 1}, {2, 1}} -> {{3, 
    1}, {3, 1}, {4, 3}}, {{1, 1}, {2, 1}} -> {{3, 1}, {3, 2}, {4, 
    1}}, {{1, 1}, {2, 1}} -> {{3, 1}, {3, 2}, {4, 3}}, {{1, 1}, {2, 
    1}} -> {{3, 2}, {2, 1}, {4, 1}}, {{1, 1}, {2, 1}} -> {{3, 2}, {2, 
    1}, {4, 2}}, {{1, 1}, {2, 1}} -> {{3, 2}, {2, 3}, {4, 3}}, {{1, 
    1}, {2, 1}} -> {{3, 2}, {3, 1}, {4, 2}}, {{1, 1}, {2, 1}} -> {{3, 
    2}, {3, 2}, {4, 1}}, {{1, 1}, {2, 1}} -> {{3, 2}, {3, 2}, {4, 
    2}}, {{1, 1}, {2, 1}} -> {{3, 2}, {3, 2}, {4, 3}}, {{1, 1}, {2, 
    1}} -> {{3, 3}, {1, 3}, {4, 1}}, {{1, 1}, {2, 1}} -> {{3, 3}, {1, 
    3}, {4, 2}}, {{1, 1}, {2, 1}} -> {{3, 3}, {1, 3}, {4, 3}}, {{1, 
    1}, {2, 1}} -> {{3, 3}, {2, 3}, {4, 1}}, {{1, 1}, {2, 1}} -> {{3, 
    3}, {2, 3}, {4, 2}}, {{1, 1}, {2, 1}} -> {{3, 3}, {2, 3}, {4, 
    3}}, {{1, 1}, {2, 1}} -> {{3, 3}, {3, 1}, {4, 1}}, {{1, 1}, {2, 
    1}} -> {{3, 3}, {3, 1}, {4, 2}}, {{1, 1}, {2, 1}} -> {{3, 3}, {3, 
    1}, {4, 3}}, {{1, 1}, {2, 1}} -> {{3, 3}, {3, 2}, {4, 1}}, {{1, 
    1}, {2, 1}} -> {{3, 3}, {3, 2}, {4, 2}}, {{1, 1}, {2, 1}} -> {{3, 
    3}, {3, 2}, {4, 3}}, {{1, 1}, {2, 1}} -> {{3, 1}, {1, 4}, {2, 
    4}}, {{1, 1}, {2, 1}} -> {{3, 1}, {1, 4}, {4, 2}}, {{1, 1}, {2, 
    1}} -> {{3, 1}, {3, 4}, {1, 2}}, {{1, 1}, {2, 1}} -> {{3, 1}, {3, 
    4}, {1, 4}}, {{1, 1}, {2, 1}} -> {{3, 1}, {3, 4}, {2, 1}}, {{1, 
    1}, {2, 1}} -> {{3, 1}, {3, 4}, {2, 3}}, {{1, 1}, {2, 1}} -> {{3, 
    2}, {2, 4}, {1, 4}}, {{1, 1}, {2, 1}} -> {{3, 2}, {2, 4}, {4, 
    1}}, {{1, 1}, {2, 1}} -> {{3, 2}, {3, 4}, {1, 2}}, {{1, 1}, {2, 
    1}} -> {{3, 2}, {3, 4}, {1, 3}}, {{1, 1}, {2, 1}} -> {{3, 2}, {3, 
    4}, {2, 1}}, {{1, 1}, {2, 1}} -> {{3, 2}, {3, 4}, {2, 4}}, {{1, 
    1}, {2, 1}} -> {{3, 3}, {3, 4}, {1, 3}}, {{1, 1}, {2, 1}} -> {{3, 
    3}, {3, 4}, {1, 4}}, {{1, 1}, {2, 1}} -> {{3, 3}, {3, 4}, {2, 
    3}}, {{1, 1}, {2, 1}} -> {{3, 3}, {3, 4}, {2, 4}}, {{1, 1}, {2, 
    1}} -> {{3, 3}, {3, 4}, {4, 1}}, {{1, 1}, {2, 1}} -> {{3, 3}, {3, 
    4}, {4, 2}}, {{1, 1}, {2, 1}} -> {{3, 1}, {1, 4}, {2, 5}}, {{1, 
    1}, {2, 1}} -> {{3, 1}, {1, 4}, {4, 5}}, {{1, 1}, {2, 1}} -> {{3, 
    1}, {3, 4}, {1, 5}}, {{1, 1}, {2, 1}} -> {{3, 1}, {3, 4}, {2, 
    5}}, {{1, 1}, {2, 1}} -> {{3, 1}, {3, 4}, {3, 5}}, {{1, 1}, {2, 
    1}} -> {{3, 2}, {2, 4}, {1, 5}}, {{1, 1}, {2, 1}} -> {{3, 2}, {2, 
    4}, {4, 5}}, {{1, 1}, {2, 1}} -> {{3, 2}, {3, 4}, {1, 5}}, {{1, 
    1}, {2, 1}} -> {{3, 2}, {3, 4}, {2, 5}}, {{1, 1}, {2, 1}} -> {{3, 
    2}, {3, 4}, {3, 5}}, {{1, 1}, {2, 1}} -> {{3, 1}, {1, 4}, {5, 
    1}}, {{1, 1}, {2, 1}} -> {{3, 1}, {1, 4}, {5, 2}}, {{1, 1}, {2, 
    1}} -> {{3, 1}, {1, 4}, {5, 4}}, {{1, 1}, {2, 1}} -> {{3, 1}, {3, 
    4}, {5, 1}}, {{1, 1}, {2, 1}} -> {{3, 1}, {3, 4}, {5, 2}}, {{1, 
    1}, {2, 1}} -> {{3, 1}, {3, 4}, {5, 3}}, {{1, 1}, {2, 1}} -> {{3, 
    2}, {2, 4}, {5, 1}}, {{1, 1}, {2, 1}} -> {{3, 2}, {2, 4}, {5, 
    2}}, {{1, 1}, {2, 1}} -> {{3, 2}, {2, 4}, {5, 4}}, {{1, 1}, {2, 
    1}} -> {{3, 2}, {3, 4}, {5, 1}}, {{1, 1}, {2, 1}} -> {{3, 2}, {3, 
    4}, {5, 2}}, {{1, 1}, {2, 1}} -> {{3, 2}, {3, 4}, {5, 3}}, {{1, 
    1}, {2, 1}} -> {{3, 3}, {4, 3}, {1, 4}}, {{1, 1}, {2, 1}} -> {{3, 
    3}, {4, 3}, {2, 4}}, {{1, 1}, {2, 1}} -> {{3, 3}, {4, 3}, {4, 
    1}}, {{1, 1}, {2, 1}} -> {{3, 3}, {4, 3}, {4, 2}}, {{1, 1}, {2, 
    1}} -> {{3, 1}, {4, 1}, {2, 5}}, {{1, 1}, {2, 1}} -> {{3, 2}, {4, 
    2}, {1, 5}}, {{1, 1}, {2, 1}} -> {{3, 1}, {4, 1}, {5, 1}}, {{1, 
    1}, {2, 1}} -> {{3, 1}, {4, 1}, {5, 2}}, {{1, 1}, {2, 1}} -> {{3, 
    2}, {4, 2}, {5, 1}}, {{1, 1}, {2, 1}} -> {{3, 2}, {4, 2}, {5, 
    2}}, {{1, 1}, {2, 1}} -> {{3, 4}, {3, 1}, {2, 4}}, {{1, 1}, {2, 
    1}} -> {{3, 4}, {3, 1}, {4, 1}}, {{1, 1}, {2, 1}} -> {{3, 4}, {3, 
    1}, {4, 2}}, {{1, 1}, {2, 1}} -> {{3, 4}, {3, 2}, {1, 4}}, {{1, 
    1}, {2, 1}} -> {{3, 4}, {3, 2}, {4, 1}}, {{1, 1}, {2, 1}} -> {{3, 
    4}, {3, 2}, {4, 2}}, {{1, 1}, {2, 1}} -> {{3, 4}, {3, 4}, {1, 
    3}}, {{1, 1}, {2, 1}} -> {{3, 4}, {3, 4}, {1, 4}}, {{1, 1}, {2, 
    1}} -> {{3, 4}, {3, 4}, {2, 3}}, {{1, 1}, {2, 1}} -> {{3, 4}, {3, 
    4}, {2, 4}}, {{1, 1}, {2, 1}} -> {{3, 4}, {3, 4}, {3, 1}}, {{1, 
    1}, {2, 1}} -> {{3, 4}, {3, 4}, {3, 2}}, {{1, 1}, {2, 1}} -> {{3, 
    4}, {3, 4}, {4, 1}}, {{1, 1}, {2, 1}} -> {{3, 4}, {3, 4}, {4, 
    2}}, {{1, 1}, {2, 1}} -> {{3, 4}, {4, 1}, {1, 2}}, {{1, 1}, {2, 
    1}} -> {{3, 4}, {4, 1}, {2, 1}}, {{1, 1}, {2, 1}} -> {{3, 4}, {4, 
    2}, {1, 2}}, {{1, 1}, {2, 1}} -> {{3, 4}, {4, 2}, {2, 1}}, {{1, 
    1}, {2, 1}} -> {{3, 4}, {4, 3}, {1, 3}}, {{1, 1}, {2, 1}} -> {{3, 
    4}, {4, 3}, {2, 3}}, {{1, 1}, {2, 1}} -> {{3, 4}, {4, 3}, {3, 
    1}}, {{1, 1}, {2, 1}} -> {{3, 4}, {4, 3}, {3, 2}}, {{1, 1}, {2, 
    1}} -> {{3, 4}, {3, 1}, {4, 5}}, {{1, 1}, {2, 1}} -> {{3, 4}, {3, 
    2}, {4, 5}}, {{1, 1}, {2, 1}} -> {{3, 4}, {4, 1}, {1, 5}}, {{1, 
    1}, {2, 1}} -> {{3, 4}, {4, 1}, {2, 5}}, {{1, 1}, {2, 1}} -> {{3, 
    4}, {4, 2}, {1, 5}}, {{1, 1}, {2, 1}} -> {{3, 4}, {4, 2}, {2, 
    5}}, {{1, 1}, {2, 1}} -> {{3, 4}, {3, 1}, {5, 4}}, {{1, 1}, {2, 
    1}} -> {{3, 4}, {3, 2}, {5, 4}}, {{1, 1}, {2, 1}} -> {{3, 4}, {4, 
    1}, {5, 1}}, {{1, 1}, {2, 1}} -> {{3, 4}, {4, 1}, {5, 2}}, {{1, 
    1}, {2, 1}} -> {{3, 4}, {4, 1}, {5, 4}}, {{1, 1}, {2, 1}} -> {{3, 
    4}, {4, 2}, {5, 1}}, {{1, 1}, {2, 1}} -> {{3, 4}, {4, 2}, {5, 
    2}}, {{1, 1}, {2, 1}} -> {{3, 4}, {4, 2}, {5, 4}}, {{1, 1}, {2, 
    1}} -> {{3, 4}, {3, 5}, {1, 3}}, {{1, 1}, {2, 1}} -> {{3, 4}, {3, 
    5}, {1, 4}}, {{1, 1}, {2, 1}} -> {{3, 4}, {3, 5}, {2, 3}}, {{1, 
    1}, {2, 1}} -> {{3, 4}, {3, 5}, {2, 4}}, {{1, 1}, {2, 1}} -> {{3, 
    4}, {3, 5}, {4, 1}}, {{1, 1}, {2, 1}} -> {{3, 4}, {3, 5}, {4, 
    2}}, {{1, 1}, {2, 1}} -> {{3, 4}, {4, 5}, {1, 5}}, {{1, 1}, {2, 
    1}} -> {{3, 4}, {4, 5}, {2, 5}}, {{1, 1}, {2, 1}} -> {{3, 4}, {4, 
    5}, {5, 1}}, {{1, 1}, {2, 1}} -> {{3, 4}, {4, 5}, {5, 2}}, {{1, 
    2}, {1, 2}} -> {{1, 1}, {1, 1}, {1, 1}}, {{1, 2}, {1, 2}} -> {{1, 
    1}, {1, 1}, {1, 2}}, {{1, 2}, {1, 2}} -> {{1, 1}, {1, 1}, {2, 
    1}}, {{1, 2}, {1, 2}} -> {{1, 1}, {1, 1}, {2, 2}}, {{1, 2}, {1, 
    2}} -> {{1, 1}, {1, 2}, {1, 2}}, {{1, 2}, {1, 2}} -> {{1, 1}, {1, 
    2}, {2, 1}}, {{1, 2}, {1, 2}} -> {{1, 1}, {1, 2}, {2, 2}}, {{1, 
    2}, {1, 2}} -> {{1, 1}, {2, 1}, {2, 1}}, {{1, 2}, {1, 2}} -> {{1, 
    2}, {1, 2}, {1, 2}}, {{1, 2}, {1, 2}} -> {{1, 2}, {1, 2}, {2, 
    1}}, {{1, 2}, {1, 2}} -> {{2, 1}, {2, 1}, {1, 2}}, {{1, 2}, {1, 
    2}} -> {{2, 1}, {2, 1}, {2, 1}}, {{1, 2}, {1, 2}} -> {{2, 2}, {1, 
    2}, {1, 2}}, {{1, 2}, {1, 2}} -> {{2, 2}, {2, 1}, {1, 1}}, {{1, 
    2}, {1, 2}} -> {{2, 2}, {2, 1}, {1, 2}}, {{1, 2}, {1, 2}} -> {{2, 
    2}, {2, 1}, {2, 1}}, {{1, 2}, {1, 2}} -> {{2, 2}, {2, 2}, {1, 
    1}}, {{1, 2}, {1, 2}} -> {{2, 2}, {2, 2}, {1, 2}}, {{1, 2}, {1, 
    2}} -> {{2, 2}, {2, 2}, {2, 1}}, {{1, 2}, {1, 2}} -> {{2, 2}, {2, 
    2}, {2, 2}}, {{1, 2}, {2, 1}} -> {{1, 1}, {1, 1}, {1, 1}}, {{1, 
    2}, {2, 1}} -> {{1, 1}, {1, 1}, {1, 2}}, {{1, 2}, {2, 1}} -> {{1, 
    1}, {1, 1}, {2, 1}}, {{1, 2}, {2, 1}} -> {{1, 1}, {1, 1}, {2, 
    2}}, {{1, 2}, {2, 1}} -> {{1, 1}, {1, 2}, {1, 2}}, {{1, 2}, {2, 
    1}} -> {{1, 1}, {1, 2}, {2, 1}}, {{1, 2}, {2, 1}} -> {{1, 1}, {1, 
    2}, {2, 2}}, {{1, 2}, {2, 1}} -> {{1, 1}, {2, 1}, {2, 1}}, {{1, 
    2}, {2, 1}} -> {{1, 2}, {1, 2}, {1, 2}}, {{1, 2}, {2, 1}} -> {{1, 
    2}, {1, 2}, {2, 1}}, {{1, 2}, {1, 2}} -> {{1, 1}, {1, 1}, {1, 
    3}}, {{1, 2}, {1, 2}} -> {{1, 1}, {1, 1}, {2, 3}}, {{1, 2}, {1, 
    2}} -> {{1, 1}, {1, 2}, {1, 3}}, {{1, 2}, {1, 2}} -> {{1, 1}, {1, 
    2}, {2, 3}}, {{1, 2}, {1, 2}} -> {{1, 1}, {2, 1}, {2, 3}}, {{1, 
    2}, {1, 2}} -> {{1, 2}, {1, 2}, {1, 3}}, {{1, 2}, {1, 2}} -> {{1, 
    2}, {1, 2}, {2, 3}}, {{1, 2}, {1, 2}} -> {{1, 2}, {2, 1}, {1, 
    3}}, {{1, 2}, {1, 2}} -> {{2, 1}, {1, 2}, {2, 3}}, {{1, 2}, {1, 
    2}} -> {{2, 1}, {2, 1}, {1, 3}}, {{1, 2}, {1, 2}} -> {{2, 1}, {2, 
    1}, {2, 3}}, {{1, 2}, {1, 2}} -> {{2, 2}, {1, 2}, {1, 3}}, {{1, 
    2}, {1, 2}} -> {{2, 2}, {2, 1}, {1, 3}}, {{1, 2}, {1, 2}} -> {{2, 
    2}, {2, 1}, {2, 3}}, {{1, 2}, {1, 2}} -> {{2, 2}, {2, 2}, {1, 
    3}}, {{1, 2}, {1, 2}} -> {{2, 2}, {2, 2}, {2, 3}}, {{1, 2}, {2, 
    1}} -> {{1, 1}, {1, 1}, {1, 3}}, {{1, 2}, {2, 1}} -> {{1, 1}, {1, 
    1}, {2, 3}}, {{1, 2}, {2, 1}} -> {{1, 1}, {1, 2}, {1, 3}}, {{1, 
    2}, {2, 1}} -> {{1, 1}, {1, 2}, {2, 3}}, {{1, 2}, {2, 1}} -> {{1, 
    1}, {2, 1}, {2, 3}}, {{1, 2}, {2, 1}} -> {{1, 2}, {1, 2}, {1, 
    3}}, {{1, 2}, {2, 1}} -> {{1, 2}, {1, 2}, {2, 3}}, {{1, 2}, {2, 
    1}} -> {{1, 2}, {2, 1}, {1, 3}}, {{1, 2}, {1, 2}} -> {{1, 1}, {1, 
    1}, {3, 1}}, {{1, 2}, {1, 2}} -> {{1, 1}, {1, 1}, {3, 2}}, {{1, 
    2}, {1, 2}} -> {{1, 1}, {1, 2}, {3, 1}}, {{1, 2}, {1, 2}} -> {{1, 
    1}, {1, 2}, {3, 2}}, {{1, 2}, {1, 2}} -> {{1, 1}, {2, 1}, {3, 
    1}}, {{1, 2}, {1, 2}} -> {{1, 1}, {2, 1}, {3, 2}}, {{1, 2}, {1, 
    2}} -> {{1, 2}, {1, 2}, {3, 1}}, {{1, 2}, {1, 2}} -> {{1, 2}, {1, 
    2}, {3, 2}}, {{1, 2}, {1, 2}} -> {{1, 2}, {2, 1}, {3, 1}}, {{1, 
    2}, {1, 2}} -> {{2, 1}, {1, 2}, {3, 2}}, {{1, 2}, {1, 2}} -> {{2, 
    1}, {2, 1}, {3, 1}}, {{1, 2}, {1, 2}} -> {{2, 1}, {2, 1}, {3, 
    2}}, {{1, 2}, {1, 2}} -> {{2, 2}, {1, 2}, {3, 1}}, {{1, 2}, {1, 
    2}} -> {{2, 2}, {1, 2}, {3, 2}}, {{1, 2}, {1, 2}} -> {{2, 2}, {2, 
    1}, {3, 1}}, {{1, 2}, {1, 2}} -> {{2, 2}, {2, 1}, {3, 2}}, {{1, 
    2}, {1, 2}} -> {{2, 2}, {2, 2}, {3, 1}}, {{1, 2}, {1, 2}} -> {{2, 
    2}, {2, 2}, {3, 2}}, {{1, 2}, {2, 1}} -> {{1, 1}, {1, 1}, {3, 
    1}}, {{1, 2}, {2, 1}} -> {{1, 1}, {1, 1}, {3, 2}}, {{1, 2}, {2, 
    1}} -> {{1, 1}, {1, 2}, {3, 1}}, {{1, 2}, {2, 1}} -> {{1, 1}, {1, 
    2}, {3, 2}}, {{1, 2}, {2, 1}} -> {{1, 1}, {2, 1}, {3, 1}}, {{1, 
    2}, {2, 1}} -> {{1, 1}, {2, 1}, {3, 2}}, {{1, 2}, {2, 1}} -> {{1, 
    2}, {1, 2}, {3, 1}}, {{1, 2}, {2, 1}} -> {{1, 2}, {1, 2}, {3, 
    2}}, {{1, 2}, {2, 1}} -> {{1, 2}, {2, 1}, {3, 1}}, {{1, 2}, {1, 
    2}} -> {{1, 1}, {1, 3}, {1, 3}}, {{1, 2}, {1, 2}} -> {{1, 1}, {1, 
    3}, {2, 1}}, {{1, 2}, {1, 2}} -> {{1, 1}, {1, 3}, {2, 2}}, {{1, 
    2}, {1, 2}} -> {{1, 1}, {1, 3}, {2, 3}}, {{1, 2}, {1, 2}} -> {{1, 
    1}, {1, 3}, {3, 1}}, {{1, 2}, {1, 2}} -> {{1, 1}, {1, 3}, {3, 
    2}}, {{1, 2}, {1, 2}} -> {{1, 1}, {1, 3}, {3, 3}}, {{1, 2}, {1, 
    2}} -> {{1, 1}, {2, 3}, {2, 3}}, {{1, 2}, {1, 2}} -> {{1, 1}, {2, 
    3}, {3, 2}}, {{1, 2}, {1, 2}} -> {{1, 2}, {1, 3}, {2, 3}}, {{1, 
    2}, {1, 2}} -> {{1, 2}, {2, 3}, {3, 1}}, {{1, 2}, {1, 2}} -> {{2, 
    1}, {2, 3}, {1, 3}}, {{1, 2}, {1, 2}} -> {{2, 2}, {1, 3}, {1, 
    3}}, {{1, 2}, {1, 2}} -> {{2, 2}, {1, 3}, {3, 1}}, {{1, 2}, {1, 
    2}} -> {{2, 2}, {2, 3}, {1, 1}}, {{1, 2}, {1, 2}} -> {{2, 2}, {2, 
    3}, {1, 2}}, {{1, 2}, {1, 2}} -> {{2, 2}, {2, 3}, {1, 3}}, {{1, 
    2}, {1, 2}} -> {{2, 2}, {2, 3}, {2, 3}}, {{1, 2}, {1, 2}} -> {{2, 
    2}, {2, 3}, {3, 1}}, {{1, 2}, {1, 2}} -> {{2, 2}, {2, 3}, {3, 
    2}}, {{1, 2}, {1, 2}} -> {{2, 2}, {2, 3}, {3, 3}}, {{1, 2}, {2, 
    1}} -> {{1, 1}, {1, 3}, {1, 3}}, {{1, 2}, {2, 1}} -> {{1, 1}, {1, 
    3}, {2, 1}}, {{1, 2}, {2, 1}} -> {{1, 1}, {1, 3}, {2, 2}}, {{1, 
    2}, {2, 1}} -> {{1, 1}, {1, 3}, {2, 3}}, {{1, 2}, {2, 1}} -> {{1, 
    1}, {1, 3}, {3, 1}}, {{1, 2}, {2, 1}} -> {{1, 1}, {1, 3}, {3, 
    2}}, {{1, 2}, {2, 1}} -> {{1, 1}, {1, 3}, {3, 3}}, {{1, 2}, {2, 
    1}} -> {{1, 1}, {2, 3}, {2, 3}}, {{1, 2}, {2, 1}} -> {{1, 1}, {2, 
    3}, {3, 2}}, {{1, 2}, {2, 1}} -> {{1, 2}, {1, 3}, {2, 3}}, {{1, 
    2}, {2, 1}} -> {{1, 2}, {2, 3}, {3, 1}}, {{1, 2}, {1, 2}} -> {{1, 
    1}, {1, 3}, {1, 4}}, {{1, 2}, {1, 2}} -> {{1, 1}, {1, 3}, {2, 
    4}}, {{1, 2}, {1, 2}} -> {{1, 1}, {1, 3}, {3, 4}}, {{1, 2}, {1, 
    2}} -> {{1, 1}, {2, 3}, {2, 4}}, {{1, 2}, {1, 2}} -> {{1, 1}, {2, 
    3}, {3, 4}}, {{1, 2}, {1, 2}} -> {{1, 2}, {1, 3}, {1, 4}}, {{1, 
    2}, {1, 2}} -> {{1, 2}, {1, 3}, {2, 4}}, {{1, 2}, {1, 2}} -> {{1, 
    2}, {2, 3}, {3, 4}}, {{1, 2}, {1, 2}} -> {{2, 1}, {1, 3}, {3, 
    4}}, {{1, 2}, {1, 2}} -> {{2, 1}, {2, 3}, {1, 4}}, {{1, 2}, {1, 
    2}} -> {{2, 1}, {2, 3}, {2, 4}}, {{1, 2}, {1, 2}} -> {{2, 2}, {1, 
    3}, {1, 4}}, {{1, 2}, {1, 2}} -> {{2, 2}, {1, 3}, {3, 4}}, {{1, 
    2}, {1, 2}} -> {{2, 2}, {2, 3}, {1, 4}}, {{1, 2}, {1, 2}} -> {{2, 
    2}, {2, 3}, {2, 4}}, {{1, 2}, {1, 2}} -> {{2, 2}, {2, 3}, {3, 
    4}}, {{1, 2}, {2, 1}} -> {{1, 1}, {1, 3}, {1, 4}}, {{1, 2}, {2, 
    1}} -> {{1, 1}, {1, 3}, {2, 4}}, {{1, 2}, {2, 1}} -> {{1, 1}, {1, 
    3}, {3, 4}}, {{1, 2}, {2, 1}} -> {{1, 1}, {2, 3}, {2, 4}}, {{1, 
    2}, {2, 1}} -> {{1, 1}, {2, 3}, {3, 4}}, {{1, 2}, {2, 1}} -> {{1, 
    2}, {1, 3}, {1, 4}}, {{1, 2}, {2, 1}} -> {{1, 2}, {1, 3}, {2, 
    4}}, {{1, 2}, {2, 1}} -> {{1, 2}, {2, 3}, {3, 4}}, {{1, 2}, {1, 
    2}} -> {{1, 1}, {1, 3}, {4, 1}}, {{1, 2}, {1, 2}} -> {{1, 1}, {1, 
    3}, {4, 2}}, {{1, 2}, {1, 2}} -> {{1, 1}, {1, 3}, {4, 3}}, {{1, 
    2}, {1, 2}} -> {{1, 1}, {2, 3}, {4, 3}}, {{1, 2}, {1, 2}} -> {{1, 
    2}, {1, 3}, {4, 1}}, {{1, 2}, {1, 2}} -> {{1, 2}, {1, 3}, {4, 
    2}}, {{1, 2}, {1, 2}} -> {{1, 2}, {2, 3}, {4, 2}}, {{1, 2}, {1, 
    2}} -> {{1, 2}, {2, 3}, {4, 3}}, {{1, 2}, {1, 2}} -> {{2, 1}, {1, 
    3}, {4, 1}}, {{1, 2}, {1, 2}} -> {{2, 1}, {1, 3}, {4, 3}}, {{1, 
    2}, {1, 2}} -> {{2, 1}, {2, 3}, {4, 1}}, {{1, 2}, {1, 2}} -> {{2, 
    1}, {2, 3}, {4, 2}}, {{1, 2}, {1, 2}} -> {{2, 2}, {1, 3}, {4, 
    3}}, {{1, 2}, {1, 2}} -> {{2, 2}, {2, 3}, {4, 1}}, {{1, 2}, {1, 
    2}} -> {{2, 2}, {2, 3}, {4, 2}}, {{1, 2}, {1, 2}} -> {{2, 2}, {2, 
    3}, {4, 3}}, {{1, 2}, {2, 1}} -> {{1, 1}, {1, 3}, {4, 1}}, {{1, 
    2}, {2, 1}} -> {{1, 1}, {1, 3}, {4, 2}}, {{1, 2}, {2, 1}} -> {{1, 
    1}, {1, 3}, {4, 3}}, {{1, 2}, {2, 1}} -> {{1, 1}, {2, 3}, {4, 
    3}}, {{1, 2}, {2, 1}} -> {{1, 2}, {1, 3}, {4, 1}}, {{1, 2}, {2, 
    1}} -> {{1, 2}, {1, 3}, {4, 2}}, {{1, 2}, {2, 1}} -> {{1, 2}, {2, 
    3}, {4, 2}}, {{1, 2}, {2, 1}} -> {{1, 2}, {2, 3}, {4, 3}}, {{1, 
    2}, {1, 2}} -> {{1, 1}, {3, 1}, {2, 2}}, {{1, 2}, {1, 2}} -> {{1, 
    1}, {3, 1}, {2, 3}}, {{1, 2}, {1, 2}} -> {{1, 1}, {3, 1}, {3, 
    1}}, {{1, 2}, {1, 2}} -> {{1, 1}, {3, 1}, {3, 2}}, {{1, 2}, {1, 
    2}} -> {{1, 1}, {3, 2}, {3, 2}}, {{1, 2}, {1, 2}} -> {{2, 2}, {3, 
    1}, {3, 1}}, {{1, 2}, {1, 2}} -> {{2, 2}, {3, 2}, {1, 1}}, {{1, 
    2}, {1, 2}} -> {{2, 2}, {3, 2}, {1, 3}}, {{1, 2}, {1, 2}} -> {{2, 
    2}, {3, 2}, {3, 1}}, {{1, 2}, {1, 2}} -> {{2, 2}, {3, 2}, {3, 
    2}}, {{1, 2}, {2, 1}} -> {{1, 1}, {3, 1}, {2, 2}}, {{1, 2}, {2, 
    1}} -> {{1, 1}, {3, 1}, {2, 3}}, {{1, 2}, {2, 1}} -> {{1, 1}, {3, 
    1}, {3, 1}}, {{1, 2}, {2, 1}} -> {{1, 1}, {3, 1}, {3, 2}}, {{1, 
    2}, {2, 1}} -> {{1, 1}, {3, 2}, {3, 2}}, {{1, 2}, {1, 2}} -> {{1, 
    1}, {3, 1}, {2, 4}}, {{1, 2}, {1, 2}} -> {{1, 1}, {3, 1}, {3, 
    4}}, {{1, 2}, {1, 2}} -> {{1, 1}, {3, 2}, {2, 4}}, {{1, 2}, {1, 
    2}} -> {{1, 1}, {3, 2}, {3, 4}}, {{1, 2}, {1, 2}} -> {{2, 2}, {3, 
    1}, {1, 4}}, {{1, 2}, {1, 2}} -> {{2, 2}, {3, 1}, {3, 4}}, {{1, 
    2}, {1, 2}} -> {{2, 2}, {3, 2}, {1, 4}}, {{1, 2}, {1, 2}} -> {{2, 
    2}, {3, 2}, {3, 4}}, {{1, 2}, {2, 1}} -> {{1, 1}, {3, 1}, {2, 
    4}}, {{1, 2}, {2, 1}} -> {{1, 1}, {3, 1}, {3, 4}}, {{1, 2}, {2, 
    1}} -> {{1, 1}, {3, 2}, {2, 4}}, {{1, 2}, {2, 1}} -> {{1, 1}, {3, 
    2}, {3, 4}}, {{1, 2}, {1, 2}} -> {{1, 1}, {3, 1}, {4, 1}}, {{1, 
    2}, {1, 2}} -> {{1, 1}, {3, 1}, {4, 2}}, {{1, 2}, {1, 2}} -> {{1, 
    1}, {3, 1}, {4, 3}}, {{1, 2}, {1, 2}} -> {{1, 1}, {3, 2}, {4, 
    2}}, {{1, 2}, {1, 2}} -> {{1, 2}, {3, 2}, {4, 2}}, {{1, 2}, {1, 
    2}} -> {{2, 1}, {3, 1}, {4, 1}}, {{1, 2}, {1, 2}} -> {{2, 2}, {3, 
    1}, {4, 1}}, {{1, 2}, {1, 2}} -> {{2, 2}, {3, 2}, {4, 1}}, {{1, 
    2}, {1, 2}} -> {{2, 2}, {3, 2}, {4, 2}}, {{1, 2}, {1, 2}} -> {{2, 
    2}, {3, 2}, {4, 3}}, {{1, 2}, {2, 1}} -> {{1, 1}, {3, 1}, {4, 
    1}}, {{1, 2}, {2, 1}} -> {{1, 1}, {3, 1}, {4, 2}}, {{1, 2}, {2, 
    1}} -> {{1, 1}, {3, 1}, {4, 3}}, {{1, 2}, {2, 1}} -> {{1, 1}, {3, 
    2}, {4, 2}}, {{1, 2}, {2, 1}} -> {{1, 2}, {3, 2}, {4, 2}}, {{1, 
    2}, {1, 2}} -> {{1, 1}, {3, 4}, {4, 2}}, {{1, 2}, {1, 2}} -> {{2, 
    2}, {3, 4}, {4, 1}}, {{1, 2}, {2, 1}} -> {{1, 1}, {3, 4}, {4, 
    2}}, {{1, 2}, {1, 2}} -> {{1, 3}, {1, 2}, {3, 2}}, {{1, 2}, {1, 
    2}} -> {{1, 3}, {1, 3}, {1, 2}}, {{1, 2}, {1, 2}} -> {{1, 3}, {1, 
    3}, {1, 3}}, {{1, 2}, {1, 2}} -> {{1, 3}, {1, 3}, {2, 1}}, {{1, 
    2}, {1, 2}} -> {{1, 3}, {1, 3}, {2, 3}}, {{1, 2}, {1, 2}} -> {{1, 
    3}, {1, 3}, {3, 1}}, {{1, 2}, {1, 2}} -> {{1, 3}, {1, 3}, {3, 
    2}}, {{1, 2}, {1, 2}} -> {{1, 3}, {3, 1}, {1, 2}}, {{1, 2}, {1, 
    2}} -> {{1, 3}, {3, 1}, {2, 1}}, {{1, 2}, {1, 2}} -> {{1, 3}, {3, 
    2}, {2, 1}}, {{1, 2}, {1, 2}} -> {{2, 3}, {2, 1}, {3, 1}}, {{1, 
    2}, {1, 2}} -> {{2, 3}, {2, 3}, {1, 2}}, {{1, 2}, {1, 2}} -> {{2, 
    3}, {2, 3}, {1, 3}}, {{1, 2}, {1, 2}} -> {{2, 3}, {2, 3}, {2, 
    1}}, {{1, 2}, {1, 2}} -> {{2, 3}, {2, 3}, {2, 3}}, {{1, 2}, {1, 
    2}} -> {{2, 3}, {2, 3}, {3, 1}}, {{1, 2}, {1, 2}} -> {{2, 3}, {2, 
    3}, {3, 2}}, {{1, 2}, {1, 2}} -> {{2, 3}, {3, 2}, {1, 2}}, {{1, 
    2}, {1, 2}} -> {{2, 3}, {3, 2}, {2, 1}}, {{1, 2}, {2, 1}} -> {{1, 
    3}, {1, 2}, {3, 2}}, {{1, 2}, {2, 1}} -> {{1, 3}, {1, 3}, {1, 
    2}}, {{1, 2}, {2, 1}} -> {{1, 3}, {1, 3}, {1, 3}}, {{1, 2}, {2, 
    1}} -> {{1, 3}, {1, 3}, {2, 1}}, {{1, 2}, {2, 1}} -> {{1, 3}, {1, 
    3}, {2, 3}}, {{1, 2}, {2, 1}} -> {{1, 3}, {1, 3}, {3, 1}}, {{1, 
    2}, {2, 1}} -> {{1, 3}, {1, 3}, {3, 2}}, {{1, 2}, {2, 1}} -> {{1, 
    3}, {3, 1}, {1, 2}}, {{1, 2}, {2, 1}} -> {{1, 3}, {3, 1}, {2, 
    1}}, {{1, 2}, {1, 2}} -> {{1, 3}, {1, 2}, {3, 4}}, {{1, 2}, {1, 
    2}} -> {{1, 3}, {1, 3}, {1, 4}}, {{1, 2}, {1, 2}} -> {{1, 3}, {1, 
    3}, {2, 4}}, {{1, 2}, {1, 2}} -> {{1, 3}, {1, 3}, {3, 4}}, {{1, 
    2}, {1, 2}} -> {{1, 3}, {3, 1}, {1, 4}}, {{1, 2}, {1, 2}} -> {{1, 
    3}, {3, 1}, {2, 4}}, {{1, 2}, {1, 2}} -> {{1, 3}, {3, 2}, {2, 
    4}}, {{1, 2}, {1, 2}} -> {{2, 3}, {2, 1}, {3, 4}}, {{1, 2}, {1, 
    2}} -> {{2, 3}, {2, 3}, {1, 4}}, {{1, 2}, {1, 2}} -> {{2, 3}, {2, 
    3}, {2, 4}}, {{1, 2}, {1, 2}} -> {{2, 3}, {2, 3}, {3, 4}}, {{1, 
    2}, {1, 2}} -> {{2, 3}, {3, 1}, {1, 4}}, {{1, 2}, {1, 2}} -> {{2, 
    3}, {3, 2}, {1, 4}}, {{1, 2}, {1, 2}} -> {{2, 3}, {3, 2}, {2, 
    4}}, {{1, 2}, {2, 1}} -> {{1, 3}, {1, 2}, {3, 4}}, {{1, 2}, {2, 
    1}} -> {{1, 3}, {1, 3}, {1, 4}}, {{1, 2}, {2, 1}} -> {{1, 3}, {1, 
    3}, {2, 4}}, {{1, 2}, {2, 1}} -> {{1, 3}, {1, 3}, {3, 4}}, {{1, 
    2}, {2, 1}} -> {{1, 3}, {3, 1}, {1, 4}}, {{1, 2}, {2, 1}} -> {{1, 
    3}, {3, 1}, {2, 4}}, {{1, 2}, {2, 1}} -> {{1, 3}, {3, 2}, {2, 
    4}}, {{1, 2}, {1, 2}} -> {{1, 3}, {1, 2}, {4, 3}}, {{1, 2}, {1, 
    2}} -> {{1, 3}, {1, 3}, {4, 1}}, {{1, 2}, {1, 2}} -> {{1, 3}, {1, 
    3}, {4, 2}}, {{1, 2}, {1, 2}} -> {{1, 3}, {1, 3}, {4, 3}}, {{1, 
    2}, {1, 2}} -> {{1, 3}, {2, 3}, {4, 3}}, {{1, 2}, {1, 2}} -> {{1, 
    3}, {3, 1}, {4, 1}}, {{1, 2}, {1, 2}} -> {{1, 3}, {3, 1}, {4, 
    2}}, {{1, 2}, {1, 2}} -> {{1, 3}, {3, 2}, {4, 2}}, {{1, 2}, {1, 
    2}} -> {{1, 3}, {3, 2}, {4, 3}}, {{1, 2}, {1, 2}} -> {{2, 3}, {2, 
    1}, {4, 3}}, {{1, 2}, {1, 2}} -> {{2, 3}, {2, 3}, {4, 1}}, {{1, 
    2}, {1, 2}} -> {{2, 3}, {2, 3}, {4, 2}}, {{1, 2}, {1, 2}} -> {{2, 
    3}, {2, 3}, {4, 3}}, {{1, 2}, {1, 2}} -> {{2, 3}, {3, 1}, {4, 
    1}}, {{1, 2}, {1, 2}} -> {{2, 3}, {3, 1}, {4, 3}}, {{1, 2}, {1, 
    2}} -> {{2, 3}, {3, 2}, {4, 1}}, {{1, 2}, {1, 2}} -> {{2, 3}, {3, 
    2}, {4, 2}}, {{1, 2}, {2, 1}} -> {{1, 3}, {1, 2}, {4, 3}}, {{1, 
    2}, {2, 1}} -> {{1, 3}, {1, 3}, {4, 1}}, {{1, 2}, {2, 1}} -> {{1, 
    3}, {1, 3}, {4, 2}}, {{1, 2}, {2, 1}} -> {{1, 3}, {1, 3}, {4, 
    3}}, {{1, 2}, {2, 1}} -> {{1, 3}, {2, 3}, {4, 3}}, {{1, 2}, {2, 
    1}} -> {{1, 3}, {3, 1}, {4, 1}}, {{1, 2}, {2, 1}} -> {{1, 3}, {3, 
    1}, {4, 2}}, {{1, 2}, {2, 1}} -> {{1, 3}, {3, 2}, {4, 2}}, {{1, 
    2}, {2, 1}} -> {{1, 3}, {3, 2}, {4, 3}}, {{1, 2}, {1, 2}} -> {{1, 
    3}, {1, 4}, {2, 1}}, {{1, 2}, {1, 2}} -> {{1, 3}, {1, 4}, {2, 
    3}}, {{1, 2}, {1, 2}} -> {{1, 3}, {1, 4}, {3, 2}}, {{1, 2}, {1, 
    2}} -> {{1, 3}, {1, 4}, {3, 4}}, {{1, 2}, {1, 2}} -> {{1, 3}, {3, 
    4}, {2, 3}}, {{1, 2}, {1, 2}} -> {{1, 3}, {3, 4}, {2, 4}}, {{1, 
    2}, {1, 2}} -> {{1, 3}, {3, 4}, {4, 1}}, {{1, 2}, {1, 2}} -> {{1, 
    3}, {3, 4}, {4, 2}}, {{1, 2}, {1, 2}} -> {{2, 3}, {2, 4}, {1, 
    2}}, {{1, 2}, {1, 2}} -> {{2, 3}, {2, 4}, {1, 3}}, {{1, 2}, {1, 
    2}} -> {{2, 3}, {2, 4}, {3, 1}}, {{1, 2}, {1, 2}} -> {{2, 3}, {2, 
    4}, {3, 4}}, {{1, 2}, {1, 2}} -> {{2, 3}, {3, 4}, {1, 4}}, {{1, 
    2}, {1, 2}} -> {{2, 3}, {3, 4}, {4, 1}}, {{1, 2}, {1, 2}} -> {{2, 
    3}, {3, 4}, {4, 2}}, {{1, 2}, {2, 1}} -> {{1, 3}, {1, 4}, {2, 
    1}}, {{1, 2}, {2, 1}} -> {{1, 3}, {1, 4}, {2, 3}}, {{1, 2}, {2, 
    1}} -> {{1, 3}, {1, 4}, {3, 2}}, {{1, 2}, {2, 1}} -> {{1, 3}, {1, 
    4}, {3, 4}}, {{1, 2}, {2, 1}} -> {{1, 3}, {3, 4}, {2, 3}}, {{1, 
    2}, {2, 1}} -> {{1, 3}, {3, 4}, {2, 4}}, {{1, 2}, {2, 1}} -> {{1, 
    3}, {3, 4}, {4, 1}}, {{1, 2}, {2, 1}} -> {{1, 3}, {3, 4}, {4, 
    2}}, {{1, 2}, {1, 2}} -> {{1, 3}, {1, 4}, {1, 5}}, {{1, 2}, {1, 
    2}} -> {{1, 3}, {1, 4}, {2, 5}}, {{1, 2}, {1, 2}} -> {{1, 3}, {1, 
    4}, {3, 5}}, {{1, 2}, {1, 2}} -> {{1, 3}, {3, 4}, {2, 5}}, {{1, 
    2}, {1, 2}} -> {{1, 3}, {3, 4}, {4, 5}}, {{1, 2}, {1, 2}} -> {{2, 
    3}, {2, 4}, {1, 5}}, {{1, 2}, {1, 2}} -> {{2, 3}, {2, 4}, {2, 
    5}}, {{1, 2}, {1, 2}} -> {{2, 3}, {2, 4}, {3, 5}}, {{1, 2}, {1, 
    2}} -> {{2, 3}, {3, 4}, {1, 5}}, {{1, 2}, {1, 2}} -> {{2, 3}, {3, 
    4}, {4, 5}}, {{1, 2}, {2, 1}} -> {{1, 3}, {1, 4}, {1, 5}}, {{1, 
    2}, {2, 1}} -> {{1, 3}, {1, 4}, {2, 5}}, {{1, 2}, {2, 1}} -> {{1, 
    3}, {1, 4}, {3, 5}}, {{1, 2}, {2, 1}} -> {{1, 3}, {3, 4}, {2, 
    5}}, {{1, 2}, {2, 1}} -> {{1, 3}, {3, 4}, {4, 5}}, {{1, 2}, {1, 
    2}} -> {{1, 3}, {1, 4}, {5, 1}}, {{1, 2}, {1, 2}} -> {{1, 3}, {1, 
    4}, {5, 2}}, {{1, 2}, {1, 2}} -> {{1, 3}, {1, 4}, {5, 3}}, {{1, 
    2}, {1, 2}} -> {{1, 3}, {3, 4}, {5, 2}}, {{1, 2}, {1, 2}} -> {{1, 
    3}, {3, 4}, {5, 3}}, {{1, 2}, {1, 2}} -> {{1, 3}, {3, 4}, {5, 
    4}}, {{1, 2}, {1, 2}} -> {{2, 3}, {2, 4}, {5, 1}}, {{1, 2}, {1, 
    2}} -> {{2, 3}, {2, 4}, {5, 2}}, {{1, 2}, {1, 2}} -> {{2, 3}, {2, 
    4}, {5, 3}}, {{1, 2}, {1, 2}} -> {{2, 3}, {3, 4}, {5, 1}}, {{1, 
    2}, {1, 2}} -> {{2, 3}, {3, 4}, {5, 3}}, {{1, 2}, {1, 2}} -> {{2, 
    3}, {3, 4}, {5, 4}}, {{1, 2}, {2, 1}} -> {{1, 3}, {1, 4}, {5, 
    1}}, {{1, 2}, {2, 1}} -> {{1, 3}, {1, 4}, {5, 2}}, {{1, 2}, {2, 
    1}} -> {{1, 3}, {1, 4}, {5, 3}}, {{1, 2}, {2, 1}} -> {{1, 3}, {3, 
    4}, {5, 2}}, {{1, 2}, {2, 1}} -> {{1, 3}, {3, 4}, {5, 3}}, {{1, 
    2}, {2, 1}} -> {{1, 3}, {3, 4}, {5, 4}}, {{1, 2}, {1, 2}} -> {{1, 
    3}, {4, 3}, {2, 5}}, {{1, 2}, {1, 2}} -> {{2, 3}, {4, 3}, {1, 
    5}}, {{1, 2}, {2, 1}} -> {{1, 3}, {4, 3}, {2, 5}}, {{1, 2}, {1, 
    2}} -> {{1, 3}, {4, 3}, {5, 2}}, {{1, 2}, {1, 2}} -> {{1, 3}, {4, 
    3}, {5, 3}}, {{1, 2}, {1, 2}} -> {{2, 3}, {4, 3}, {5, 1}}, {{1, 
    2}, {1, 2}} -> {{2, 3}, {4, 3}, {5, 3}}, {{1, 2}, {2, 1}} -> {{1, 
    3}, {4, 3}, {5, 2}}, {{1, 2}, {2, 1}} -> {{1, 3}, {4, 3}, {5, 
    3}}, {{1, 2}, {1, 2}} -> {{3, 1}, {1, 3}, {2, 3}}, {{1, 2}, {1, 
    2}} -> {{3, 1}, {1, 3}, {3, 2}}, {{1, 2}, {1, 2}} -> {{3, 1}, {3, 
    1}, {1, 2}}, {{1, 2}, {1, 2}} -> {{3, 1}, {3, 1}, {1, 3}}, {{1, 
    2}, {1, 2}} -> {{3, 1}, {3, 1}, {2, 1}}, {{1, 2}, {1, 2}} -> {{3, 
    1}, {3, 1}, {2, 3}}, {{1, 2}, {1, 2}} -> {{3, 1}, {3, 1}, {3, 
    1}}, {{1, 2}, {1, 2}} -> {{3, 1}, {3, 1}, {3, 2}}, {{1, 2}, {1, 
    2}} -> {{3, 1}, {3, 2}, {1, 2}}, {{1, 2}, {1, 2}} -> {{3, 2}, {2, 
    3}, {1, 3}}, {{1, 2}, {1, 2}} -> {{3, 2}, {2, 3}, {3, 1}}, {{1, 
    2}, {1, 2}} -> {{3, 2}, {3, 1}, {2, 1}}, {{1, 2}, {1, 2}} -> {{3, 
    2}, {3, 2}, {1, 2}}, {{1, 2}, {1, 2}} -> {{3, 2}, {3, 2}, {1, 
    3}}, {{1, 2}, {1, 2}} -> {{3, 2}, {3, 2}, {2, 1}}, {{1, 2}, {1, 
    2}} -> {{3, 2}, {3, 2}, {2, 3}}, {{1, 2}, {1, 2}} -> {{3, 2}, {3, 
    2}, {3, 1}}, {{1, 2}, {1, 2}} -> {{3, 2}, {3, 2}, {3, 2}}, {{1, 
    2}, {1, 2}} -> {{3, 3}, {1, 3}, {1, 2}}, {{1, 2}, {1, 2}} -> {{3, 
    3}, {1, 3}, {1, 3}}, {{1, 2}, {1, 2}} -> {{3, 3}, {1, 3}, {2, 
    1}}, {{1, 2}, {1, 2}} -> {{3, 3}, {1, 3}, {2, 2}}, {{1, 2}, {1, 
    2}} -> {{3, 3}, {1, 3}, {2, 3}}, {{1, 2}, {1, 2}} -> {{3, 3}, {2, 
    3}, {1, 1}}, {{1, 2}, {1, 2}} -> {{3, 3}, {2, 3}, {1, 2}}, {{1, 
    2}, {1, 2}} -> {{3, 3}, {2, 3}, {2, 1}}, {{1, 2}, {1, 2}} -> {{3, 
    3}, {2, 3}, {2, 3}}, {{1, 2}, {1, 2}} -> {{3, 3}, {3, 1}, {1, 
    1}}, {{1, 2}, {1, 2}} -> {{3, 3}, {3, 1}, {1, 2}}, {{1, 2}, {1, 
    2}} -> {{3, 3}, {3, 1}, {1, 3}}, {{1, 2}, {1, 2}} -> {{3, 3}, {3, 
    1}, {2, 1}}, {{1, 2}, {1, 2}} -> {{3, 3}, {3, 1}, {2, 2}}, {{1, 
    2}, {1, 2}} -> {{3, 3}, {3, 1}, {2, 3}}, {{1, 2}, {1, 2}} -> {{3, 
    3}, {3, 1}, {3, 1}}, {{1, 2}, {1, 2}} -> {{3, 3}, {3, 1}, {3, 
    2}}, {{1, 2}, {1, 2}} -> {{3, 3}, {3, 2}, {1, 1}}, {{1, 2}, {1, 
    2}} -> {{3, 3}, {3, 2}, {1, 2}}, {{1, 2}, {1, 2}} -> {{3, 3}, {3, 
    2}, {1, 3}}, {{1, 2}, {1, 2}} -> {{3, 3}, {3, 2}, {2, 1}}, {{1, 
    2}, {1, 2}} -> {{3, 3}, {3, 2}, {2, 2}}, {{1, 2}, {1, 2}} -> {{3, 
    3}, {3, 2}, {2, 3}}, {{1, 2}, {1, 2}} -> {{3, 3}, {3, 2}, {3, 
    2}}, {{1, 2}, {1, 2}} -> {{3, 3}, {3, 3}, {1, 3}}, {{1, 2}, {1, 
    2}} -> {{3, 3}, {3, 3}, {2, 3}}, {{1, 2}, {1, 2}} -> {{3, 3}, {3, 
    3}, {3, 1}}, {{1, 2}, {1, 2}} -> {{3, 3}, {3, 3}, {3, 2}}, {{1, 
    2}, {2, 1}} -> {{3, 1}, {1, 3}, {2, 3}}, {{1, 2}, {2, 1}} -> {{3, 
    1}, {1, 3}, {3, 2}}, {{1, 2}, {2, 1}} -> {{3, 1}, {3, 1}, {1, 
    2}}, {{1, 2}, {2, 1}} -> {{3, 1}, {3, 1}, {1, 3}}, {{1, 2}, {2, 
    1}} -> {{3, 1}, {3, 1}, {2, 1}}, {{1, 2}, {2, 1}} -> {{3, 1}, {3, 
    1}, {2, 3}}, {{1, 2}, {2, 1}} -> {{3, 1}, {3, 1}, {3, 1}}, {{1, 
    2}, {2, 1}} -> {{3, 1}, {3, 1}, {3, 2}}, {{1, 2}, {2, 1}} -> {{3, 
    1}, {3, 2}, {1, 2}}, {{1, 2}, {2, 1}} -> {{3, 3}, {1, 3}, {1, 
    2}}, {{1, 2}, {2, 1}} -> {{3, 3}, {1, 3}, {1, 3}}, {{1, 2}, {2, 
    1}} -> {{3, 3}, {1, 3}, {2, 1}}, {{1, 2}, {2, 1}} -> {{3, 3}, {1, 
    3}, {2, 2}}, {{1, 2}, {2, 1}} -> {{3, 3}, {1, 3}, {2, 3}}, {{1, 
    2}, {2, 1}} -> {{3, 3}, {3, 1}, {1, 1}}, {{1, 2}, {2, 1}} -> {{3, 
    3}, {3, 1}, {1, 2}}, {{1, 2}, {2, 1}} -> {{3, 3}, {3, 1}, {1, 
    3}}, {{1, 2}, {2, 1}} -> {{3, 3}, {3, 1}, {2, 1}}, {{1, 2}, {2, 
    1}} -> {{3, 3}, {3, 1}, {2, 2}}, {{1, 2}, {2, 1}} -> {{3, 3}, {3, 
    1}, {2, 3}}, {{1, 2}, {2, 1}} -> {{3, 3}, {3, 1}, {3, 1}}, {{1, 
    2}, {2, 1}} -> {{3, 3}, {3, 1}, {3, 2}}, {{1, 2}, {2, 1}} -> {{3, 
    3}, {3, 3}, {1, 3}}, {{1, 2}, {2, 1}} -> {{3, 3}, {3, 3}, {3, 
    1}}, {{1, 2}, {1, 2}} -> {{3, 1}, {1, 2}, {2, 4}}, {{1, 2}, {1, 
    2}} -> {{3, 1}, {1, 3}, {3, 4}}, {{1, 2}, {1, 2}} -> {{3, 1}, {3, 
    1}, {1, 4}}, {{1, 2}, {1, 2}} -> {{3, 1}, {3, 1}, {2, 4}}, {{1, 
    2}, {1, 2}} -> {{3, 1}, {3, 1}, {3, 4}}, {{1, 2}, {1, 2}} -> {{3, 
    1}, {3, 2}, {1, 4}}, {{1, 2}, {1, 2}} -> {{3, 1}, {3, 2}, {3, 
    4}}, {{1, 2}, {1, 2}} -> {{3, 2}, {2, 1}, {1, 4}}, {{1, 2}, {1, 
    2}} -> {{3, 2}, {2, 3}, {3, 4}}, {{1, 2}, {1, 2}} -> {{3, 2}, {3, 
    1}, {2, 4}}, {{1, 2}, {1, 2}} -> {{3, 2}, {3, 2}, {1, 4}}, {{1, 
    2}, {1, 2}} -> {{3, 2}, {3, 2}, {2, 4}}, {{1, 2}, {1, 2}} -> {{3, 
    2}, {3, 2}, {3, 4}}, {{1, 2}, {1, 2}} -> {{3, 3}, {1, 3}, {1, 
    4}}, {{1, 2}, {1, 2}} -> {{3, 3}, {1, 3}, {2, 4}}, {{1, 2}, {1, 
    2}} -> {{3, 3}, {2, 3}, {1, 4}}, {{1, 2}, {1, 2}} -> {{3, 3}, {2, 
    3}, {2, 4}}, {{1, 2}, {1, 2}} -> {{3, 3}, {3, 1}, {1, 4}}, {{1, 
    2}, {1, 2}} -> {{3, 3}, {3, 1}, {2, 4}}, {{1, 2}, {1, 2}} -> {{3, 
    3}, {3, 1}, {3, 4}}, {{1, 2}, {1, 2}} -> {{3, 3}, {3, 2}, {1, 
    4}}, {{1, 2}, {1, 2}} -> {{3, 3}, {3, 2}, {2, 4}}, {{1, 2}, {1, 
    2}} -> {{3, 3}, {3, 2}, {3, 4}}, {{1, 2}, {2, 1}} -> {{3, 1}, {1, 
    2}, {2, 4}}, {{1, 2}, {2, 1}} -> {{3, 1}, {1, 3}, {3, 4}}, {{1, 
    2}, {2, 1}} -> {{3, 1}, {3, 1}, {1, 4}}, {{1, 2}, {2, 1}} -> {{3, 
    1}, {3, 1}, {2, 4}}, {{1, 2}, {2, 1}} -> {{3, 1}, {3, 1}, {3, 
    4}}, {{1, 2}, {2, 1}} -> {{3, 1}, {3, 2}, {1, 4}}, {{1, 2}, {2, 
    1}} -> {{3, 1}, {3, 2}, {3, 4}}, {{1, 2}, {2, 1}} -> {{3, 3}, {1, 
    3}, {1, 4}}, {{1, 2}, {2, 1}} -> {{3, 3}, {1, 3}, {2, 4}}, {{1, 
    2}, {2, 1}} -> {{3, 3}, {3, 1}, {1, 4}}, {{1, 2}, {2, 1}} -> {{3, 
    3}, {3, 1}, {2, 4}}, {{1, 2}, {2, 1}} -> {{3, 3}, {3, 1}, {3, 
    4}}, {{1, 2}, {1, 2}} -> {{3, 1}, {1, 2}, {4, 1}}, {{1, 2}, {1, 
    2}} -> {{3, 1}, {1, 2}, {4, 2}}, {{1, 2}, {1, 2}} -> {{3, 1}, {1, 
    3}, {4, 3}}, {{1, 2}, {1, 2}} -> {{3, 1}, {3, 1}, {4, 1}}, {{1, 
    2}, {1, 2}} -> {{3, 1}, {3, 1}, {4, 2}}, {{1, 2}, {1, 2}} -> {{3, 
    1}, {3, 1}, {4, 3}}, {{1, 2}, {1, 2}} -> {{3, 1}, {3, 2}, {4, 
    1}}, {{1, 2}, {1, 2}} -> {{3, 1}, {3, 2}, {4, 3}}, {{1, 2}, {1, 
    2}} -> {{3, 2}, {2, 1}, {4, 1}}, {{1, 2}, {1, 2}} -> {{3, 2}, {2, 
    1}, {4, 2}}, {{1, 2}, {1, 2}} -> {{3, 2}, {2, 3}, {4, 3}}, {{1, 
    2}, {1, 2}} -> {{3, 2}, {3, 1}, {4, 2}}, {{1, 2}, {1, 2}} -> {{3, 
    2}, {3, 2}, {4, 1}}, {{1, 2}, {1, 2}} -> {{3, 2}, {3, 2}, {4, 
    2}}, {{1, 2}, {1, 2}} -> {{3, 2}, {3, 2}, {4, 3}}, {{1, 2}, {1, 
    2}} -> {{3, 3}, {1, 3}, {4, 1}}, {{1, 2}, {1, 2}} -> {{3, 3}, {1, 
    3}, {4, 2}}, {{1, 2}, {1, 2}} -> {{3, 3}, {1, 3}, {4, 3}}, {{1, 
    2}, {1, 2}} -> {{3, 3}, {2, 3}, {4, 1}}, {{1, 2}, {1, 2}} -> {{3, 
    3}, {2, 3}, {4, 2}}, {{1, 2}, {1, 2}} -> {{3, 3}, {2, 3}, {4, 
    3}}, {{1, 2}, {1, 2}} -> {{3, 3}, {3, 1}, {4, 1}}, {{1, 2}, {1, 
    2}} -> {{3, 3}, {3, 1}, {4, 2}}, {{1, 2}, {1, 2}} -> {{3, 3}, {3, 
    1}, {4, 3}}, {{1, 2}, {1, 2}} -> {{3, 3}, {3, 2}, {4, 1}}, {{1, 
    2}, {1, 2}} -> {{3, 3}, {3, 2}, {4, 2}}, {{1, 2}, {1, 2}} -> {{3, 
    3}, {3, 2}, {4, 3}}, {{1, 2}, {2, 1}} -> {{3, 1}, {1, 2}, {4, 
    1}}, {{1, 2}, {2, 1}} -> {{3, 1}, {1, 2}, {4, 2}}, {{1, 2}, {2, 
    1}} -> {{3, 1}, {1, 3}, {4, 3}}, {{1, 2}, {2, 1}} -> {{3, 1}, {3, 
    1}, {4, 1}}, {{1, 2}, {2, 1}} -> {{3, 1}, {3, 1}, {4, 2}}, {{1, 
    2}, {2, 1}} -> {{3, 1}, {3, 1}, {4, 3}}, {{1, 2}, {2, 1}} -> {{3, 
    1}, {3, 2}, {4, 1}}, {{1, 2}, {2, 1}} -> {{3, 1}, {3, 2}, {4, 
    3}}, {{1, 2}, {2, 1}} -> {{3, 3}, {1, 3}, {4, 1}}, {{1, 2}, {2, 
    1}} -> {{3, 3}, {1, 3}, {4, 2}}, {{1, 2}, {2, 1}} -> {{3, 3}, {1, 
    3}, {4, 3}}, {{1, 2}, {2, 1}} -> {{3, 3}, {3, 1}, {4, 1}}, {{1, 
    2}, {2, 1}} -> {{3, 3}, {3, 1}, {4, 2}}, {{1, 2}, {2, 1}} -> {{3, 
    3}, {3, 1}, {4, 3}}, {{1, 2}, {1, 2}} -> {{3, 1}, {1, 4}, {2, 
    4}}, {{1, 2}, {1, 2}} -> {{3, 1}, {1, 4}, {4, 2}}, {{1, 2}, {1, 
    2}} -> {{3, 1}, {3, 4}, {1, 2}}, {{1, 2}, {1, 2}} -> {{3, 1}, {3, 
    4}, {1, 4}}, {{1, 2}, {1, 2}} -> {{3, 1}, {3, 4}, {2, 1}}, {{1, 
    2}, {1, 2}} -> {{3, 1}, {3, 4}, {2, 3}}, {{1, 2}, {1, 2}} -> {{3, 
    2}, {2, 4}, {1, 4}}, {{1, 2}, {1, 2}} -> {{3, 2}, {2, 4}, {4, 
    1}}, {{1, 2}, {1, 2}} -> {{3, 2}, {3, 4}, {1, 2}}, {{1, 2}, {1, 
    2}} -> {{3, 2}, {3, 4}, {1, 3}}, {{1, 2}, {1, 2}} -> {{3, 2}, {3, 
    4}, {2, 1}}, {{1, 2}, {1, 2}} -> {{3, 2}, {3, 4}, {2, 4}}, {{1, 
    2}, {1, 2}} -> {{3, 3}, {3, 4}, {1, 3}}, {{1, 2}, {1, 2}} -> {{3, 
    3}, {3, 4}, {1, 4}}, {{1, 2}, {1, 2}} -> {{3, 3}, {3, 4}, {2, 
    3}}, {{1, 2}, {1, 2}} -> {{3, 3}, {3, 4}, {2, 4}}, {{1, 2}, {1, 
    2}} -> {{3, 3}, {3, 4}, {4, 1}}, {{1, 2}, {1, 2}} -> {{3, 3}, {3, 
    4}, {4, 2}}, {{1, 2}, {2, 1}} -> {{3, 1}, {1, 4}, {2, 4}}, {{1, 
    2}, {2, 1}} -> {{3, 1}, {1, 4}, {4, 2}}, {{1, 2}, {2, 1}} -> {{3, 
    1}, {3, 4}, {1, 2}}, {{1, 2}, {2, 1}} -> {{3, 1}, {3, 4}, {1, 
    4}}, {{1, 2}, {2, 1}} -> {{3, 1}, {3, 4}, {2, 1}}, {{1, 2}, {2, 
    1}} -> {{3, 1}, {3, 4}, {2, 3}}, {{1, 2}, {2, 1}} -> {{3, 3}, {3, 
    4}, {1, 3}}, {{1, 2}, {2, 1}} -> {{3, 3}, {3, 4}, {1, 4}}, {{1, 
    2}, {2, 1}} -> {{3, 3}, {3, 4}, {4, 1}}, {{1, 2}, {1, 2}} -> {{3, 
    1}, {1, 4}, {2, 5}}, {{1, 2}, {1, 2}} -> {{3, 1}, {1, 4}, {4, 
    5}}, {{1, 2}, {1, 2}} -> {{3, 1}, {3, 4}, {1, 5}}, {{1, 2}, {1, 
    2}} -> {{3, 1}, {3, 4}, {2, 5}}, {{1, 2}, {1, 2}} -> {{3, 1}, {3, 
    4}, {3, 5}}, {{1, 2}, {1, 2}} -> {{3, 2}, {2, 4}, {1, 5}}, {{1, 
    2}, {1, 2}} -> {{3, 2}, {2, 4}, {4, 5}}, {{1, 2}, {1, 2}} -> {{3, 
    2}, {3, 4}, {1, 5}}, {{1, 2}, {1, 2}} -> {{3, 2}, {3, 4}, {2, 
    5}}, {{1, 2}, {1, 2}} -> {{3, 2}, {3, 4}, {3, 5}}, {{1, 2}, {2, 
    1}} -> {{3, 1}, {1, 4}, {2, 5}}, {{1, 2}, {2, 1}} -> {{3, 1}, {1, 
    4}, {4, 5}}, {{1, 2}, {2, 1}} -> {{3, 1}, {3, 4}, {1, 5}}, {{1, 
    2}, {2, 1}} -> {{3, 1}, {3, 4}, {2, 5}}, {{1, 2}, {2, 1}} -> {{3, 
    1}, {3, 4}, {3, 5}}, {{1, 2}, {1, 2}} -> {{3, 1}, {1, 4}, {5, 
    1}}, {{1, 2}, {1, 2}} -> {{3, 1}, {1, 4}, {5, 2}}, {{1, 2}, {1, 
    2}} -> {{3, 1}, {1, 4}, {5, 4}}, {{1, 2}, {1, 2}} -> {{3, 1}, {3, 
    4}, {5, 1}}, {{1, 2}, {1, 2}} -> {{3, 1}, {3, 4}, {5, 2}}, {{1, 
    2}, {1, 2}} -> {{3, 1}, {3, 4}, {5, 3}}, {{1, 2}, {1, 2}} -> {{3, 
    2}, {2, 4}, {5, 1}}, {{1, 2}, {1, 2}} -> {{3, 2}, {2, 4}, {5, 
    2}}, {{1, 2}, {1, 2}} -> {{3, 2}, {2, 4}, {5, 4}}, {{1, 2}, {1, 
    2}} -> {{3, 2}, {3, 4}, {5, 1}}, {{1, 2}, {1, 2}} -> {{3, 2}, {3, 
    4}, {5, 2}}, {{1, 2}, {1, 2}} -> {{3, 2}, {3, 4}, {5, 3}}, {{1, 
    2}, {2, 1}} -> {{3, 1}, {1, 4}, {5, 1}}, {{1, 2}, {2, 1}} -> {{3, 
    1}, {1, 4}, {5, 2}}, {{1, 2}, {2, 1}} -> {{3, 1}, {1, 4}, {5, 
    4}}, {{1, 2}, {2, 1}} -> {{3, 1}, {3, 4}, {5, 1}}, {{1, 2}, {2, 
    1}} -> {{3, 1}, {3, 4}, {5, 2}}, {{1, 2}, {2, 1}} -> {{3, 1}, {3, 
    4}, {5, 3}}, {{1, 2}, {1, 2}} -> {{3, 3}, {4, 3}, {1, 4}}, {{1, 
    2}, {1, 2}} -> {{3, 3}, {4, 3}, {2, 4}}, {{1, 2}, {1, 2}} -> {{3, 
    3}, {4, 3}, {4, 1}}, {{1, 2}, {1, 2}} -> {{3, 3}, {4, 3}, {4, 
    2}}, {{1, 2}, {2, 1}} -> {{3, 3}, {4, 3}, {1, 4}}, {{1, 2}, {2, 
    1}} -> {{3, 3}, {4, 3}, {4, 1}}, {{1, 2}, {1, 2}} -> {{3, 1}, {4, 
    1}, {2, 5}}, {{1, 2}, {1, 2}} -> {{3, 2}, {4, 2}, {1, 5}}, {{1, 
    2}, {2, 1}} -> {{3, 1}, {4, 1}, {2, 5}}, {{1, 2}, {1, 2}} -> {{3, 
    1}, {4, 1}, {5, 1}}, {{1, 2}, {1, 2}} -> {{3, 1}, {4, 1}, {5, 
    2}}, {{1, 2}, {1, 2}} -> {{3, 2}, {4, 2}, {5, 1}}, {{1, 2}, {1, 
    2}} -> {{3, 2}, {4, 2}, {5, 2}}, {{1, 2}, {2, 1}} -> {{3, 1}, {4, 
    1}, {5, 1}}, {{1, 2}, {2, 1}} -> {{3, 1}, {4, 1}, {5, 2}}, {{1, 
    2}, {1, 2}} -> {{3, 4}, {3, 1}, {2, 4}}, {{1, 2}, {1, 2}} -> {{3, 
    4}, {3, 1}, {4, 1}}, {{1, 2}, {1, 2}} -> {{3, 4}, {3, 1}, {4, 
    2}}, {{1, 2}, {1, 2}} -> {{3, 4}, {3, 2}, {1, 4}}, {{1, 2}, {1, 
    2}} -> {{3, 4}, {3, 2}, {4, 1}}, {{1, 2}, {1, 2}} -> {{3, 4}, {3, 
    2}, {4, 2}}, {{1, 2}, {1, 2}} -> {{3, 4}, {3, 4}, {1, 3}}, {{1, 
    2}, {1, 2}} -> {{3, 4}, {3, 4}, {1, 4}}, {{1, 2}, {1, 2}} -> {{3, 
    4}, {3, 4}, {2, 3}}, {{1, 2}, {1, 2}} -> {{3, 4}, {3, 4}, {2, 
    4}}, {{1, 2}, {1, 2}} -> {{3, 4}, {3, 4}, {3, 1}}, {{1, 2}, {1, 
    2}} -> {{3, 4}, {3, 4}, {3, 2}}, {{1, 2}, {1, 2}} -> {{3, 4}, {3, 
    4}, {4, 1}}, {{1, 2}, {1, 2}} -> {{3, 4}, {3, 4}, {4, 2}}, {{1, 
    2}, {1, 2}} -> {{3, 4}, {4, 1}, {1, 2}}, {{1, 2}, {1, 2}} -> {{3, 
    4}, {4, 1}, {2, 1}}, {{1, 2}, {1, 2}} -> {{3, 4}, {4, 2}, {1, 
    2}}, {{1, 2}, {1, 2}} -> {{3, 4}, {4, 2}, {2, 1}}, {{1, 2}, {1, 
    2}} -> {{3, 4}, {4, 3}, {1, 3}}, {{1, 2}, {1, 2}} -> {{3, 4}, {4, 
    3}, {2, 3}}, {{1, 2}, {1, 2}} -> {{3, 4}, {4, 3}, {3, 1}}, {{1, 
    2}, {1, 2}} -> {{3, 4}, {4, 3}, {3, 2}}, {{1, 2}, {2, 1}} -> {{3, 
    4}, {3, 1}, {2, 4}}, {{1, 2}, {2, 1}} -> {{3, 4}, {3, 1}, {4, 
    1}}, {{1, 2}, {2, 1}} -> {{3, 4}, {3, 1}, {4, 2}}, {{1, 2}, {2, 
    1}} -> {{3, 4}, {3, 4}, {1, 3}}, {{1, 2}, {2, 1}} -> {{3, 4}, {3, 
    4}, {1, 4}}, {{1, 2}, {2, 1}} -> {{3, 4}, {3, 4}, {3, 1}}, {{1, 
    2}, {2, 1}} -> {{3, 4}, {3, 4}, {4, 1}}, {{1, 2}, {2, 1}} -> {{3, 
    4}, {4, 1}, {1, 2}}, {{1, 2}, {2, 1}} -> {{3, 4}, {4, 1}, {2, 
    1}}, {{1, 2}, {2, 1}} -> {{3, 4}, {4, 3}, {1, 3}}, {{1, 2}, {2, 
    1}} -> {{3, 4}, {4, 3}, {3, 1}}, {{1, 2}, {1, 2}} -> {{3, 4}, {3, 
    1}, {4, 5}}, {{1, 2}, {1, 2}} -> {{3, 4}, {3, 2}, {4, 5}}, {{1, 
    2}, {1, 2}} -> {{3, 4}, {4, 1}, {1, 5}}, {{1, 2}, {1, 2}} -> {{3, 
    4}, {4, 1}, {2, 5}}, {{1, 2}, {1, 2}} -> {{3, 4}, {4, 2}, {1, 
    5}}, {{1, 2}, {1, 2}} -> {{3, 4}, {4, 2}, {2, 5}}, {{1, 2}, {2, 
    1}} -> {{3, 4}, {3, 1}, {4, 5}}, {{1, 2}, {2, 1}} -> {{3, 4}, {4, 
    1}, {1, 5}}, {{1, 2}, {2, 1}} -> {{3, 4}, {4, 1}, {2, 5}}, {{1, 
    2}, {1, 2}} -> {{3, 4}, {3, 1}, {5, 4}}, {{1, 2}, {1, 2}} -> {{3, 
    4}, {3, 2}, {5, 4}}, {{1, 2}, {1, 2}} -> {{3, 4}, {4, 1}, {5, 
    1}}, {{1, 2}, {1, 2}} -> {{3, 4}, {4, 1}, {5, 2}}, {{1, 2}, {1, 
    2}} -> {{3, 4}, {4, 1}, {5, 4}}, {{1, 2}, {1, 2}} -> {{3, 4}, {4, 
    2}, {5, 1}}, {{1, 2}, {1, 2}} -> {{3, 4}, {4, 2}, {5, 2}}, {{1, 
    2}, {1, 2}} -> {{3, 4}, {4, 2}, {5, 4}}, {{1, 2}, {2, 1}} -> {{3, 
    4}, {3, 1}, {5, 4}}, {{1, 2}, {2, 1}} -> {{3, 4}, {4, 1}, {5, 
    1}}, {{1, 2}, {2, 1}} -> {{3, 4}, {4, 1}, {5, 2}}, {{1, 2}, {2, 
    1}} -> {{3, 4}, {4, 1}, {5, 4}}, {{1, 2}, {1, 2}} -> {{3, 4}, {3, 
    5}, {1, 3}}, {{1, 2}, {1, 2}} -> {{3, 4}, {3, 5}, {1, 4}}, {{1, 
    2}, {1, 2}} -> {{3, 4}, {3, 5}, {2, 3}}, {{1, 2}, {1, 2}} -> {{3, 
    4}, {3, 5}, {2, 4}}, {{1, 2}, {1, 2}} -> {{3, 4}, {3, 5}, {4, 
    1}}, {{1, 2}, {1, 2}} -> {{3, 4}, {3, 5}, {4, 2}}, {{1, 2}, {1, 
    2}} -> {{3, 4}, {4, 5}, {1, 5}}, {{1, 2}, {1, 2}} -> {{3, 4}, {4, 
    5}, {2, 5}}, {{1, 2}, {1, 2}} -> {{3, 4}, {4, 5}, {5, 1}}, {{1, 
    2}, {1, 2}} -> {{3, 4}, {4, 5}, {5, 2}}, {{1, 2}, {2, 1}} -> {{3, 
    4}, {3, 5}, {1, 3}}, {{1, 2}, {2, 1}} -> {{3, 4}, {3, 5}, {1, 
    4}}, {{1, 2}, {2, 1}} -> {{3, 4}, {3, 5}, {4, 1}}, {{1, 2}, {2, 
    1}} -> {{3, 4}, {4, 5}, {1, 5}}, {{1, 2}, {2, 1}} -> {{3, 4}, {4, 
    5}, {5, 1}}, {{1, 2}, {1, 3}} -> {{1, 1}, {1, 1}, {1, 1}}, {{1, 
    2}, {1, 3}} -> {{1, 1}, {1, 1}, {1, 2}}, {{1, 2}, {1, 3}} -> {{1, 
    1}, {1, 1}, {2, 1}}, {{1, 2}, {1, 3}} -> {{1, 1}, {1, 1}, {2, 
    2}}, {{1, 2}, {1, 3}} -> {{1, 1}, {1, 1}, {2, 3}}, {{1, 2}, {1, 
    3}} -> {{1, 1}, {1, 2}, {1, 2}}, {{1, 2}, {1, 3}} -> {{1, 1}, {1, 
    2}, {1, 3}}, {{1, 2}, {1, 3}} -> {{1, 1}, {1, 2}, {2, 1}}, {{1, 
    2}, {1, 3}} -> {{1, 1}, {1, 2}, {2, 2}}, {{1, 2}, {1, 3}} -> {{1, 
    1}, {1, 2}, {2, 3}}, {{1, 2}, {1, 3}} -> {{1, 1}, {1, 2}, {3, 
    1}}, {{1, 2}, {1, 3}} -> {{1, 1}, {1, 2}, {3, 2}}, {{1, 2}, {1, 
    3}} -> {{1, 1}, {1, 2}, {3, 3}}, {{1, 2}, {1, 3}} -> {{1, 1}, {2, 
    1}, {2, 1}}, {{1, 2}, {1, 3}} -> {{1, 1}, {2, 1}, {2, 3}}, {{1, 
    2}, {1, 3}} -> {{1, 1}, {2, 1}, {3, 1}}, {{1, 2}, {1, 3}} -> {{1, 
    1}, {2, 1}, {3, 2}}, {{1, 2}, {1, 3}} -> {{1, 1}, {2, 1}, {3, 
    3}}, {{1, 2}, {1, 3}} -> {{1, 1}, {2, 2}, {3, 3}}, {{1, 2}, {1, 
    3}} -> {{1, 1}, {2, 3}, {2, 3}}, {{1, 2}, {1, 3}} -> {{1, 1}, {2, 
    3}, {3, 2}}, {{1, 2}, {1, 3}} -> {{1, 2}, {1, 2}, {1, 2}}, {{1, 
    2}, {1, 3}} -> {{1, 2}, {1, 2}, {1, 3}}, {{1, 2}, {1, 3}} -> {{1, 
    2}, {1, 2}, {2, 1}}, {{1, 2}, {1, 3}} -> {{1, 2}, {1, 2}, {2, 
    3}}, {{1, 2}, {1, 3}} -> {{1, 2}, {1, 2}, {3, 1}}, {{1, 2}, {1, 
    3}} -> {{1, 2}, {1, 2}, {3, 2}}, {{1, 2}, {1, 3}} -> {{1, 2}, {1, 
    3}, {2, 3}}, {{1, 2}, {1, 3}} -> {{1, 2}, {2, 1}, {1, 3}}, {{1, 
    2}, {1, 3}} -> {{1, 2}, {2, 1}, {3, 1}}, {{1, 2}, {1, 3}} -> {{1, 
    2}, {2, 3}, {3, 1}}, {{1, 2}, {1, 3}} -> {{2, 1}, {1, 2}, {2, 
    3}}, {{1, 2}, {1, 3}} -> {{2, 1}, {1, 2}, {3, 2}}, {{1, 2}, {1, 
    3}} -> {{2, 1}, {2, 1}, {1, 2}}, {{1, 2}, {1, 3}} -> {{2, 1}, {2, 
    1}, {1, 3}}, {{1, 2}, {1, 3}} -> {{2, 1}, {2, 1}, {2, 1}}, {{1, 
    2}, {1, 3}} -> {{2, 1}, {2, 1}, {2, 3}}, {{1, 2}, {1, 3}} -> {{2, 
    1}, {2, 1}, {3, 1}}, {{1, 2}, {1, 3}} -> {{2, 1}, {2, 1}, {3, 
    2}}, {{1, 2}, {1, 3}} -> {{2, 1}, {2, 3}, {1, 3}}, {{1, 2}, {1, 
    3}} -> {{2, 2}, {1, 2}, {1, 2}}, {{1, 2}, {1, 3}} -> {{2, 2}, {1, 
    2}, {1, 3}}, {{1, 2}, {1, 3}} -> {{2, 2}, {1, 2}, {3, 1}}, {{1, 
    2}, {1, 3}} -> {{2, 2}, {1, 2}, {3, 2}}, {{1, 2}, {1, 3}} -> {{2, 
    2}, {1, 2}, {3, 3}}, {{1, 2}, {1, 3}} -> {{2, 2}, {1, 3}, {1, 
    3}}, {{1, 2}, {1, 3}} -> {{2, 2}, {1, 3}, {3, 1}}, {{1, 2}, {1, 
    3}} -> {{2, 2}, {2, 1}, {1, 1}}, {{1, 2}, {1, 3}} -> {{2, 2}, {2, 
    1}, {1, 2}}, {{1, 2}, {1, 3}} -> {{2, 2}, {2, 1}, {1, 3}}, {{1, 
    2}, {1, 3}} -> {{2, 2}, {2, 1}, {2, 1}}, {{1, 2}, {1, 3}} -> {{2, 
    2}, {2, 1}, {2, 3}}, {{1, 2}, {1, 3}} -> {{2, 2}, {2, 1}, {3, 
    1}}, {{1, 2}, {1, 3}} -> {{2, 2}, {2, 1}, {3, 2}}, {{1, 2}, {1, 
    3}} -> {{2, 2}, {2, 1}, {3, 3}}, {{1, 2}, {1, 3}} -> {{2, 2}, {2, 
    2}, {1, 1}}, {{1, 2}, {1, 3}} -> {{2, 2}, {2, 2}, {1, 2}}, {{1, 
    2}, {1, 3}} -> {{2, 2}, {2, 2}, {1, 3}}, {{1, 2}, {1, 3}} -> {{2, 
    2}, {2, 2}, {2, 1}}, {{1, 2}, {1, 3}} -> {{2, 2}, {2, 2}, {2, 
    2}}, {{1, 2}, {1, 3}} -> {{2, 2}, {2, 2}, {2, 3}}, {{1, 2}, {1, 
    3}} -> {{2, 2}, {2, 2}, {3, 1}}, {{1, 2}, {1, 3}} -> {{2, 2}, {2, 
    2}, {3, 2}}, {{1, 2}, {1, 3}} -> {{2, 2}, {2, 2}, {3, 3}}, {{1, 
    2}, {1, 3}} -> {{2, 2}, {2, 3}, {1, 1}}, {{1, 2}, {1, 3}} -> {{2, 
    2}, {2, 3}, {1, 2}}, {{1, 2}, {1, 3}} -> {{2, 2}, {2, 3}, {1, 
    3}}, {{1, 2}, {1, 3}} -> {{2, 2}, {2, 3}, {2, 3}}, {{1, 2}, {1, 
    3}} -> {{2, 2}, {2, 3}, {3, 1}}, {{1, 2}, {1, 3}} -> {{2, 2}, {2, 
    3}, {3, 2}}, {{1, 2}, {1, 3}} -> {{2, 2}, {2, 3}, {3, 3}}, {{1, 
    2}, {1, 3}} -> {{2, 2}, {3, 1}, {3, 1}}, {{1, 2}, {1, 3}} -> {{2, 
    2}, {3, 2}, {1, 1}}, {{1, 2}, {1, 3}} -> {{2, 2}, {3, 2}, {1, 
    3}}, {{1, 2}, {1, 3}} -> {{2, 2}, {3, 2}, {3, 1}}, {{1, 2}, {1, 
    3}} -> {{2, 2}, {3, 2}, {3, 2}}, {{1, 2}, {1, 3}} -> {{2, 3}, {2, 
    1}, {3, 1}}, {{1, 2}, {1, 3}} -> {{2, 3}, {2, 3}, {1, 2}}, {{1, 
    2}, {1, 3}} -> {{2, 3}, {2, 3}, {1, 3}}, {{1, 2}, {1, 3}} -> {{2, 
    3}, {2, 3}, {2, 1}}, {{1, 2}, {1, 3}} -> {{2, 3}, {2, 3}, {2, 
    3}}, {{1, 2}, {1, 3}} -> {{2, 3}, {2, 3}, {3, 1}}, {{1, 2}, {1, 
    3}} -> {{2, 3}, {2, 3}, {3, 2}}, {{1, 2}, {1, 3}} -> {{2, 3}, {3, 
    2}, {1, 2}}, {{1, 2}, {1, 3}} -> {{2, 3}, {3, 2}, {2, 1}}, {{1, 
    2}, {2, 3}} -> {{1, 1}, {1, 1}, {1, 1}}, {{1, 2}, {2, 3}} -> {{1, 
    1}, {1, 1}, {1, 2}}, {{1, 2}, {2, 3}} -> {{1, 1}, {1, 1}, {1, 
    3}}, {{1, 2}, {2, 3}} -> {{1, 1}, {1, 1}, {2, 1}}, {{1, 2}, {2, 
    3}} -> {{1, 1}, {1, 1}, {2, 2}}, {{1, 2}, {2, 3}} -> {{1, 1}, {1, 
    1}, {2, 3}}, {{1, 2}, {2, 3}} -> {{1, 1}, {1, 1}, {3, 1}}, {{1, 
    2}, {2, 3}} -> {{1, 1}, {1, 1}, {3, 2}}, {{1, 2}, {2, 3}} -> {{1, 
    1}, {1, 1}, {3, 3}}, {{1, 2}, {2, 3}} -> {{1, 1}, {1, 2}, {1, 
    2}}, {{1, 2}, {2, 3}} -> {{1, 1}, {1, 2}, {1, 3}}, {{1, 2}, {2, 
    3}} -> {{1, 1}, {1, 2}, {2, 1}}, {{1, 2}, {2, 3}} -> {{1, 1}, {1, 
    2}, {2, 2}}, {{1, 2}, {2, 3}} -> {{1, 1}, {1, 2}, {2, 3}}, {{1, 
    2}, {2, 3}} -> {{1, 1}, {1, 2}, {3, 1}}, {{1, 2}, {2, 3}} -> {{1, 
    1}, {1, 2}, {3, 2}}, {{1, 2}, {2, 3}} -> {{1, 1}, {1, 2}, {3, 
    3}}, {{1, 2}, {2, 3}} -> {{1, 1}, {1, 3}, {1, 3}}, {{1, 2}, {2, 
    3}} -> {{1, 1}, {1, 3}, {2, 1}}, {{1, 2}, {2, 3}} -> {{1, 1}, {1, 
    3}, {2, 2}}, {{1, 2}, {2, 3}} -> {{1, 1}, {1, 3}, {2, 3}}, {{1, 
    2}, {2, 3}} -> {{1, 1}, {1, 3}, {3, 1}}, {{1, 2}, {2, 3}} -> {{1, 
    1}, {1, 3}, {3, 2}}, {{1, 2}, {2, 3}} -> {{1, 1}, {1, 3}, {3, 
    3}}, {{1, 2}, {2, 3}} -> {{1, 1}, {2, 1}, {2, 1}}, {{1, 2}, {2, 
    3}} -> {{1, 1}, {2, 1}, {2, 3}}, {{1, 2}, {2, 3}} -> {{1, 1}, {2, 
    1}, {3, 1}}, {{1, 2}, {2, 3}} -> {{1, 1}, {2, 1}, {3, 2}}, {{1, 
    2}, {2, 3}} -> {{1, 1}, {2, 1}, {3, 3}}, {{1, 2}, {2, 3}} -> {{1, 
    1}, {2, 2}, {3, 3}}, {{1, 2}, {2, 3}} -> {{1, 1}, {2, 3}, {2, 
    3}}, {{1, 2}, {2, 3}} -> {{1, 1}, {2, 3}, {3, 2}}, {{1, 2}, {2, 
    3}} -> {{1, 1}, {3, 1}, {2, 2}}, {{1, 2}, {2, 3}} -> {{1, 1}, {3, 
    1}, {2, 3}}, {{1, 2}, {2, 3}} -> {{1, 1}, {3, 1}, {3, 1}}, {{1, 
    2}, {2, 3}} -> {{1, 1}, {3, 1}, {3, 2}}, {{1, 2}, {2, 3}} -> {{1, 
    1}, {3, 2}, {3, 2}}, {{1, 2}, {2, 3}} -> {{1, 2}, {1, 2}, {1, 
    2}}, {{1, 2}, {2, 3}} -> {{1, 2}, {1, 2}, {1, 3}}, {{1, 2}, {2, 
    3}} -> {{1, 2}, {1, 2}, {2, 1}}, {{1, 2}, {2, 3}} -> {{1, 2}, {1, 
    2}, {2, 3}}, {{1, 2}, {2, 3}} -> {{1, 2}, {1, 2}, {3, 1}}, {{1, 
    2}, {2, 3}} -> {{1, 2}, {1, 2}, {3, 2}}, {{1, 2}, {2, 3}} -> {{1, 
    2}, {1, 3}, {2, 3}}, {{1, 2}, {2, 3}} -> {{1, 2}, {2, 1}, {1, 
    3}}, {{1, 2}, {2, 3}} -> {{1, 2}, {2, 1}, {3, 1}}, {{1, 2}, {2, 
    3}} -> {{1, 2}, {2, 3}, {3, 1}}, {{1, 2}, {2, 3}} -> {{1, 3}, {1, 
    2}, {3, 2}}, {{1, 2}, {2, 3}} -> {{1, 3}, {1, 3}, {1, 2}}, {{1, 
    2}, {2, 3}} -> {{1, 3}, {1, 3}, {1, 3}}, {{1, 2}, {2, 3}} -> {{1, 
    3}, {1, 3}, {2, 1}}, {{1, 2}, {2, 3}} -> {{1, 3}, {1, 3}, {2, 
    3}}, {{1, 2}, {2, 3}} -> {{1, 3}, {1, 3}, {3, 1}}, {{1, 2}, {2, 
    3}} -> {{1, 3}, {1, 3}, {3, 2}}, {{1, 2}, {2, 3}} -> {{1, 3}, {3, 
    1}, {1, 2}}, {{1, 2}, {2, 3}} -> {{1, 3}, {3, 1}, {2, 1}}, {{1, 
    2}, {2, 3}} -> {{1, 3}, {3, 2}, {2, 1}}, {{1, 2}, {2, 3}} -> {{2, 
    1}, {1, 2}, {2, 3}}, {{1, 2}, {2, 3}} -> {{2, 1}, {1, 2}, {3, 
    2}}, {{1, 2}, {2, 3}} -> {{2, 1}, {2, 1}, {1, 2}}, {{1, 2}, {2, 
    3}} -> {{2, 1}, {2, 1}, {1, 3}}, {{1, 2}, {2, 3}} -> {{2, 1}, {2, 
    1}, {2, 1}}, {{1, 2}, {2, 3}} -> {{2, 1}, {2, 1}, {2, 3}}, {{1, 
    2}, {2, 3}} -> {{2, 1}, {2, 1}, {3, 1}}, {{1, 2}, {2, 3}} -> {{2, 
    1}, {2, 1}, {3, 2}}, {{1, 2}, {2, 3}} -> {{2, 1}, {2, 3}, {1, 
    3}}, {{1, 2}, {2, 3}} -> {{2, 2}, {1, 2}, {1, 2}}, {{1, 2}, {2, 
    3}} -> {{2, 2}, {1, 2}, {1, 3}}, {{1, 2}, {2, 3}} -> {{2, 2}, {1, 
    2}, {3, 1}}, {{1, 2}, {2, 3}} -> {{2, 2}, {1, 2}, {3, 2}}, {{1, 
    2}, {2, 3}} -> {{2, 2}, {1, 2}, {3, 3}}, {{1, 2}, {2, 3}} -> {{2, 
    2}, {1, 3}, {1, 3}}, {{1, 2}, {2, 3}} -> {{2, 2}, {1, 3}, {3, 
    1}}, {{1, 2}, {2, 3}} -> {{2, 2}, {2, 1}, {1, 1}}, {{1, 2}, {2, 
    3}} -> {{2, 2}, {2, 1}, {1, 2}}, {{1, 2}, {2, 3}} -> {{2, 2}, {2, 
    1}, {1, 3}}, {{1, 2}, {2, 3}} -> {{2, 2}, {2, 1}, {2, 1}}, {{1, 
    2}, {2, 3}} -> {{2, 2}, {2, 1}, {2, 3}}, {{1, 2}, {2, 3}} -> {{2, 
    2}, {2, 1}, {3, 1}}, {{1, 2}, {2, 3}} -> {{2, 2}, {2, 1}, {3, 
    2}}, {{1, 2}, {2, 3}} -> {{2, 2}, {2, 1}, {3, 3}}, {{1, 2}, {2, 
    3}} -> {{2, 2}, {2, 2}, {1, 1}}, {{1, 2}, {2, 3}} -> {{2, 2}, {2, 
    2}, {1, 2}}, {{1, 2}, {2, 3}} -> {{2, 2}, {2, 2}, {1, 3}}, {{1, 
    2}, {2, 3}} -> {{2, 2}, {2, 2}, {2, 1}}, {{1, 2}, {2, 3}} -> {{2, 
    2}, {2, 2}, {2, 2}}, {{1, 2}, {2, 3}} -> {{2, 2}, {2, 2}, {2, 
    3}}, {{1, 2}, {2, 3}} -> {{2, 2}, {2, 2}, {3, 1}}, {{1, 2}, {2, 
    3}} -> {{2, 2}, {2, 2}, {3, 2}}, {{1, 2}, {2, 3}} -> {{2, 2}, {2, 
    2}, {3, 3}}, {{1, 2}, {2, 3}} -> {{2, 2}, {2, 3}, {1, 1}}, {{1, 
    2}, {2, 3}} -> {{2, 2}, {2, 3}, {1, 2}}, {{1, 2}, {2, 3}} -> {{2, 
    2}, {2, 3}, {1, 3}}, {{1, 2}, {2, 3}} -> {{2, 2}, {2, 3}, {2, 
    3}}, {{1, 2}, {2, 3}} -> {{2, 2}, {2, 3}, {3, 1}}, {{1, 2}, {2, 
    3}} -> {{2, 2}, {2, 3}, {3, 2}}, {{1, 2}, {2, 3}} -> {{2, 2}, {2, 
    3}, {3, 3}}, {{1, 2}, {2, 3}} -> {{2, 2}, {3, 1}, {3, 1}}, {{1, 
    2}, {2, 3}} -> {{2, 2}, {3, 2}, {1, 1}}, {{1, 2}, {2, 3}} -> {{2, 
    2}, {3, 2}, {1, 3}}, {{1, 2}, {2, 3}} -> {{2, 2}, {3, 2}, {3, 
    1}}, {{1, 2}, {2, 3}} -> {{2, 2}, {3, 2}, {3, 2}}, {{1, 2}, {2, 
    3}} -> {{2, 3}, {2, 1}, {3, 1}}, {{1, 2}, {2, 3}} -> {{2, 3}, {2, 
    3}, {1, 2}}, {{1, 2}, {2, 3}} -> {{2, 3}, {2, 3}, {1, 3}}, {{1, 
    2}, {2, 3}} -> {{2, 3}, {2, 3}, {2, 1}}, {{1, 2}, {2, 3}} -> {{2, 
    3}, {2, 3}, {2, 3}}, {{1, 2}, {2, 3}} -> {{2, 3}, {2, 3}, {3, 
    1}}, {{1, 2}, {2, 3}} -> {{2, 3}, {2, 3}, {3, 2}}, {{1, 2}, {2, 
    3}} -> {{2, 3}, {3, 2}, {1, 2}}, {{1, 2}, {2, 3}} -> {{2, 3}, {3, 
    2}, {2, 1}}, {{1, 2}, {2, 3}} -> {{3, 1}, {1, 3}, {2, 3}}, {{1, 
    2}, {2, 3}} -> {{3, 1}, {1, 3}, {3, 2}}, {{1, 2}, {2, 3}} -> {{3, 
    1}, {3, 1}, {1, 2}}, {{1, 2}, {2, 3}} -> {{3, 1}, {3, 1}, {1, 
    3}}, {{1, 2}, {2, 3}} -> {{3, 1}, {3, 1}, {2, 1}}, {{1, 2}, {2, 
    3}} -> {{3, 1}, {3, 1}, {2, 3}}, {{1, 2}, {2, 3}} -> {{3, 1}, {3, 
    1}, {3, 1}}, {{1, 2}, {2, 3}} -> {{3, 1}, {3, 1}, {3, 2}}, {{1, 
    2}, {2, 3}} -> {{3, 1}, {3, 2}, {1, 2}}, {{1, 2}, {2, 3}} -> {{3, 
    2}, {2, 3}, {1, 3}}, {{1, 2}, {2, 3}} -> {{3, 2}, {2, 3}, {3, 
    1}}, {{1, 2}, {2, 3}} -> {{3, 2}, {3, 1}, {2, 1}}, {{1, 2}, {2, 
    3}} -> {{3, 2}, {3, 2}, {1, 2}}, {{1, 2}, {2, 3}} -> {{3, 2}, {3, 
    2}, {1, 3}}, {{1, 2}, {2, 3}} -> {{3, 2}, {3, 2}, {2, 1}}, {{1, 
    2}, {2, 3}} -> {{3, 2}, {3, 2}, {2, 3}}, {{1, 2}, {2, 3}} -> {{3, 
    2}, {3, 2}, {3, 1}}, {{1, 2}, {2, 3}} -> {{3, 2}, {3, 2}, {3, 
    2}}, {{1, 2}, {2, 3}} -> {{3, 3}, {1, 2}, {1, 2}}, {{1, 2}, {2, 
    3}} -> {{3, 3}, {1, 2}, {2, 1}}, {{1, 2}, {2, 3}} -> {{3, 3}, {1, 
    3}, {1, 2}}, {{1, 2}, {2, 3}} -> {{3, 3}, {1, 3}, {1, 3}}, {{1, 
    2}, {2, 3}} -> {{3, 3}, {1, 3}, {2, 1}}, {{1, 2}, {2, 3}} -> {{3, 
    3}, {1, 3}, {2, 2}}, {{1, 2}, {2, 3}} -> {{3, 3}, {1, 3}, {2, 
    3}}, {{1, 2}, {2, 3}} -> {{3, 3}, {2, 1}, {2, 1}}, {{1, 2}, {2, 
    3}} -> {{3, 3}, {2, 3}, {1, 1}}, {{1, 2}, {2, 3}} -> {{3, 3}, {2, 
    3}, {1, 2}}, {{1, 2}, {2, 3}} -> {{3, 3}, {2, 3}, {2, 1}}, {{1, 
    2}, {2, 3}} -> {{3, 3}, {2, 3}, {2, 3}}, {{1, 2}, {2, 3}} -> {{3, 
    3}, {3, 1}, {1, 1}}, {{1, 2}, {2, 3}} -> {{3, 3}, {3, 1}, {1, 
    2}}, {{1, 2}, {2, 3}} -> {{3, 3}, {3, 1}, {1, 3}}, {{1, 2}, {2, 
    3}} -> {{3, 3}, {3, 1}, {2, 1}}, {{1, 2}, {2, 3}} -> {{3, 3}, {3, 
    1}, {2, 2}}, {{1, 2}, {2, 3}} -> {{3, 3}, {3, 1}, {2, 3}}, {{1, 
    2}, {2, 3}} -> {{3, 3}, {3, 1}, {3, 1}}, {{1, 2}, {2, 3}} -> {{3, 
    3}, {3, 1}, {3, 2}}, {{1, 2}, {2, 3}} -> {{3, 3}, {3, 2}, {1, 
    1}}, {{1, 2}, {2, 3}} -> {{3, 3}, {3, 2}, {1, 2}}, {{1, 2}, {2, 
    3}} -> {{3, 3}, {3, 2}, {1, 3}}, {{1, 2}, {2, 3}} -> {{3, 3}, {3, 
    2}, {2, 1}}, {{1, 2}, {2, 3}} -> {{3, 3}, {3, 2}, {2, 2}}, {{1, 
    2}, {2, 3}} -> {{3, 3}, {3, 2}, {2, 3}}, {{1, 2}, {2, 3}} -> {{3, 
    3}, {3, 2}, {3, 2}}, {{1, 2}, {2, 3}} -> {{3, 3}, {3, 3}, {1, 
    1}}, {{1, 2}, {2, 3}} -> {{3, 3}, {3, 3}, {1, 2}}, {{1, 2}, {2, 
    3}} -> {{3, 3}, {3, 3}, {1, 3}}, {{1, 2}, {2, 3}} -> {{3, 3}, {3, 
    3}, {2, 1}}, {{1, 2}, {2, 3}} -> {{3, 3}, {3, 3}, {2, 2}}, {{1, 
    2}, {2, 3}} -> {{3, 3}, {3, 3}, {2, 3}}, {{1, 2}, {2, 3}} -> {{3, 
    3}, {3, 3}, {3, 1}}, {{1, 2}, {2, 3}} -> {{3, 3}, {3, 3}, {3, 
    2}}, {{1, 2}, {2, 3}} -> {{3, 3}, {3, 3}, {3, 3}}, {{1, 2}, {1, 
    3}} -> {{1, 1}, {1, 1}, {1, 4}}, {{1, 2}, {1, 3}} -> {{1, 1}, {1, 
    1}, {2, 4}}, {{1, 2}, {1, 3}} -> {{1, 1}, {1, 2}, {1, 4}}, {{1, 
    2}, {1, 3}} -> {{1, 1}, {1, 2}, {2, 4}}, {{1, 2}, {1, 3}} -> {{1, 
    1}, {1, 2}, {3, 4}}, {{1, 2}, {1, 3}} -> {{1, 1}, {2, 1}, {2, 
    4}}, {{1, 2}, {1, 3}} -> {{1, 1}, {2, 1}, {3, 4}}, {{1, 2}, {1, 
    3}} -> {{1, 1}, {2, 2}, {3, 4}}, {{1, 2}, {1, 3}} -> {{1, 1}, {2, 
    3}, {2, 4}}, {{1, 2}, {1, 3}} -> {{1, 1}, {2, 3}, {3, 4}}, {{1, 
    2}, {1, 3}} -> {{1, 2}, {1, 2}, {1, 4}}, {{1, 2}, {1, 3}} -> {{1, 
    2}, {1, 2}, {2, 4}}, {{1, 2}, {1, 3}} -> {{1, 2}, {1, 2}, {3, 
    4}}, {{1, 2}, {1, 3}} -> {{1, 2}, {1, 3}, {1, 4}}, {{1, 2}, {1, 
    3}} -> {{1, 2}, {1, 3}, {2, 4}}, {{1, 2}, {1, 3}} -> {{1, 2}, {2, 
    1}, {1, 4}}, {{1, 2}, {1, 3}} -> {{1, 2}, {2, 1}, {3, 4}}, {{1, 
    2}, {1, 3}} -> {{1, 2}, {2, 3}, {3, 4}}, {{1, 2}, {1, 3}} -> {{2, 
    1}, {1, 2}, {2, 4}}, {{1, 2}, {1, 3}} -> {{2, 1}, {1, 3}, {3, 
    4}}, {{1, 2}, {1, 3}} -> {{2, 1}, {2, 1}, {1, 4}}, {{1, 2}, {1, 
    3}} -> {{2, 1}, {2, 1}, {2, 4}}, {{1, 2}, {1, 3}} -> {{2, 1}, {2, 
    1}, {3, 4}}, {{1, 2}, {1, 3}} -> {{2, 1}, {2, 3}, {1, 4}}, {{1, 
    2}, {1, 3}} -> {{2, 1}, {2, 3}, {2, 4}}, {{1, 2}, {1, 3}} -> {{2, 
    2}, {1, 2}, {1, 4}}, {{1, 2}, {1, 3}} -> {{2, 2}, {1, 2}, {3, 
    4}}, {{1, 2}, {1, 3}} -> {{2, 2}, {1, 3}, {1, 4}}, {{1, 2}, {1, 
    3}} -> {{2, 2}, {1, 3}, {3, 4}}, {{1, 2}, {1, 3}} -> {{2, 2}, {2, 
    1}, {1, 4}}, {{1, 2}, {1, 3}} -> {{2, 2}, {2, 1}, {2, 4}}, {{1, 
    2}, {1, 3}} -> {{2, 2}, {2, 1}, {3, 4}}, {{1, 2}, {1, 3}} -> {{2, 
    2}, {2, 2}, {1, 4}}, {{1, 2}, {1, 3}} -> {{2, 2}, {2, 2}, {2, 
    4}}, {{1, 2}, {1, 3}} -> {{2, 2}, {2, 2}, {3, 4}}, {{1, 2}, {1, 
    3}} -> {{2, 2}, {2, 3}, {1, 4}}, {{1, 2}, {1, 3}} -> {{2, 2}, {2, 
    3}, {2, 4}}, {{1, 2}, {1, 3}} -> {{2, 2}, {2, 3}, {3, 4}}, {{1, 
    2}, {1, 3}} -> {{2, 2}, {3, 1}, {1, 4}}, {{1, 2}, {1, 3}} -> {{2, 
    2}, {3, 1}, {3, 4}}, {{1, 2}, {1, 3}} -> {{2, 2}, {3, 2}, {1, 
    4}}, {{1, 2}, {1, 3}} -> {{2, 2}, {3, 2}, {3, 4}}, {{1, 2}, {1, 
    3}} -> {{2, 2}, {3, 3}, {1, 4}}, {{1, 2}, {1, 3}} -> {{2, 3}, {2, 
    1}, {3, 4}}, {{1, 2}, {1, 3}} -> {{2, 3}, {2, 3}, {1, 4}}, {{1, 
    2}, {1, 3}} -> {{2, 3}, {2, 3}, {2, 4}}, {{1, 2}, {1, 3}} -> {{2, 
    3}, {2, 3}, {3, 4}}, {{1, 2}, {1, 3}} -> {{2, 3}, {3, 1}, {1, 
    4}}, {{1, 2}, {1, 3}} -> {{2, 3}, {3, 2}, {1, 4}}, {{1, 2}, {1, 
    3}} -> {{2, 3}, {3, 2}, {2, 4}}, {{1, 2}, {2, 3}} -> {{1, 1}, {1, 
    1}, {1, 4}}, {{1, 2}, {2, 3}} -> {{1, 1}, {1, 1}, {2, 4}}, {{1, 
    2}, {2, 3}} -> {{1, 1}, {1, 1}, {3, 4}}, {{1, 2}, {2, 3}} -> {{1, 
    1}, {1, 2}, {1, 4}}, {{1, 2}, {2, 3}} -> {{1, 1}, {1, 2}, {2, 
    4}}, {{1, 2}, {2, 3}} -> {{1, 1}, {1, 2}, {3, 4}}, {{1, 2}, {2, 
    3}} -> {{1, 1}, {1, 3}, {1, 4}}, {{1, 2}, {2, 3}} -> {{1, 1}, {1, 
    3}, {2, 4}}, {{1, 2}, {2, 3}} -> {{1, 1}, {1, 3}, {3, 4}}, {{1, 
    2}, {2, 3}} -> {{1, 1}, {2, 1}, {2, 4}}, {{1, 2}, {2, 3}} -> {{1, 
    1}, {2, 1}, {3, 4}}, {{1, 2}, {2, 3}} -> {{1, 1}, {2, 2}, {3, 
    4}}, {{1, 2}, {2, 3}} -> {{1, 1}, {2, 3}, {2, 4}}, {{1, 2}, {2, 
    3}} -> {{1, 1}, {2, 3}, {3, 4}}, {{1, 2}, {2, 3}} -> {{1, 1}, {3, 
    1}, {2, 4}}, {{1, 2}, {2, 3}} -> {{1, 1}, {3, 1}, {3, 4}}, {{1, 
    2}, {2, 3}} -> {{1, 1}, {3, 2}, {2, 4}}, {{1, 2}, {2, 3}} -> {{1, 
    1}, {3, 2}, {3, 4}}, {{1, 2}, {2, 3}} -> {{1, 1}, {3, 3}, {2, 
    4}}, {{1, 2}, {2, 3}} -> {{1, 2}, {1, 2}, {1, 4}}, {{1, 2}, {2, 
    3}} -> {{1, 2}, {1, 2}, {2, 4}}, {{1, 2}, {2, 3}} -> {{1, 2}, {1, 
    2}, {3, 4}}, {{1, 2}, {2, 3}} -> {{1, 2}, {1, 3}, {1, 4}}, {{1, 
    2}, {2, 3}} -> {{1, 2}, {1, 3}, {2, 4}}, {{1, 2}, {2, 3}} -> {{1, 
    2}, {2, 1}, {1, 4}}, {{1, 2}, {2, 3}} -> {{1, 2}, {2, 1}, {3, 
    4}}, {{1, 2}, {2, 3}} -> {{1, 2}, {2, 3}, {3, 4}}, {{1, 2}, {2, 
    3}} -> {{1, 3}, {1, 2}, {3, 4}}, {{1, 2}, {2, 3}} -> {{1, 3}, {1, 
    3}, {1, 4}}, {{1, 2}, {2, 3}} -> {{1, 3}, {1, 3}, {2, 4}}, {{1, 
    2}, {2, 3}} -> {{1, 3}, {1, 3}, {3, 4}}, {{1, 2}, {2, 3}} -> {{1, 
    3}, {3, 1}, {1, 4}}, {{1, 2}, {2, 3}} -> {{1, 3}, {3, 1}, {2, 
    4}}, {{1, 2}, {2, 3}} -> {{1, 3}, {3, 2}, {2, 4}}, {{1, 2}, {2, 
    3}} -> {{2, 1}, {1, 2}, {2, 4}}, {{1, 2}, {2, 3}} -> {{2, 1}, {1, 
    3}, {3, 4}}, {{1, 2}, {2, 3}} -> {{2, 1}, {2, 1}, {1, 4}}, {{1, 
    2}, {2, 3}} -> {{2, 1}, {2, 1}, {2, 4}}, {{1, 2}, {2, 3}} -> {{2, 
    1}, {2, 1}, {3, 4}}, {{1, 2}, {2, 3}} -> {{2, 1}, {2, 3}, {1, 
    4}}, {{1, 2}, {2, 3}} -> {{2, 1}, {2, 3}, {2, 4}}, {{1, 2}, {2, 
    3}} -> {{2, 2}, {1, 2}, {1, 4}}, {{1, 2}, {2, 3}} -> {{2, 2}, {1, 
    2}, {3, 4}}, {{1, 2}, {2, 3}} -> {{2, 2}, {1, 3}, {1, 4}}, {{1, 
    2}, {2, 3}} -> {{2, 2}, {1, 3}, {3, 4}}, {{1, 2}, {2, 3}} -> {{2, 
    2}, {2, 1}, {1, 4}}, {{1, 2}, {2, 3}} -> {{2, 2}, {2, 1}, {2, 
    4}}, {{1, 2}, {2, 3}} -> {{2, 2}, {2, 1}, {3, 4}}, {{1, 2}, {2, 
    3}} -> {{2, 2}, {2, 2}, {1, 4}}, {{1, 2}, {2, 3}} -> {{2, 2}, {2, 
    2}, {2, 4}}, {{1, 2}, {2, 3}} -> {{2, 2}, {2, 2}, {3, 4}}, {{1, 
    2}, {2, 3}} -> {{2, 2}, {2, 3}, {1, 4}}, {{1, 2}, {2, 3}} -> {{2, 
    2}, {2, 3}, {2, 4}}, {{1, 2}, {2, 3}} -> {{2, 2}, {2, 3}, {3, 
    4}}, {{1, 2}, {2, 3}} -> {{2, 2}, {3, 1}, {1, 4}}, {{1, 2}, {2, 
    3}} -> {{2, 2}, {3, 1}, {3, 4}}, {{1, 2}, {2, 3}} -> {{2, 2}, {3, 
    2}, {1, 4}}, {{1, 2}, {2, 3}} -> {{2, 2}, {3, 2}, {3, 4}}, {{1, 
    2}, {2, 3}} -> {{2, 2}, {3, 3}, {1, 4}}, {{1, 2}, {2, 3}} -> {{2, 
    3}, {2, 1}, {3, 4}}, {{1, 2}, {2, 3}} -> {{2, 3}, {2, 3}, {1, 
    4}}, {{1, 2}, {2, 3}} -> {{2, 3}, {2, 3}, {2, 4}}, {{1, 2}, {2, 
    3}} -> {{2, 3}, {2, 3}, {3, 4}}, {{1, 2}, {2, 3}} -> {{2, 3}, {3, 
    1}, {1, 4}}, {{1, 2}, {2, 3}} -> {{2, 3}, {3, 2}, {1, 4}}, {{1, 
    2}, {2, 3}} -> {{2, 3}, {3, 2}, {2, 4}}, {{1, 2}, {2, 3}} -> {{3, 
    1}, {1, 2}, {2, 4}}, {{1, 2}, {2, 3}} -> {{3, 1}, {1, 3}, {3, 
    4}}, {{1, 2}, {2, 3}} -> {{3, 1}, {3, 1}, {1, 4}}, {{1, 2}, {2, 
    3}} -> {{3, 1}, {3, 1}, {2, 4}}, {{1, 2}, {2, 3}} -> {{3, 1}, {3, 
    1}, {3, 4}}, {{1, 2}, {2, 3}} -> {{3, 1}, {3, 2}, {1, 4}}, {{1, 
    2}, {2, 3}} -> {{3, 1}, {3, 2}, {3, 4}}, {{1, 2}, {2, 3}} -> {{3, 
    2}, {2, 1}, {1, 4}}, {{1, 2}, {2, 3}} -> {{3, 2}, {2, 3}, {3, 
    4}}, {{1, 2}, {2, 3}} -> {{3, 2}, {3, 1}, {2, 4}}, {{1, 2}, {2, 
    3}} -> {{3, 2}, {3, 2}, {1, 4}}, {{1, 2}, {2, 3}} -> {{3, 2}, {3, 
    2}, {2, 4}}, {{1, 2}, {2, 3}} -> {{3, 2}, {3, 2}, {3, 4}}, {{1, 
    2}, {2, 3}} -> {{3, 3}, {1, 2}, {1, 4}}, {{1, 2}, {2, 3}} -> {{3, 
    3}, {1, 2}, {2, 4}}, {{1, 2}, {2, 3}} -> {{3, 3}, {1, 3}, {1, 
    4}}, {{1, 2}, {2, 3}} -> {{3, 3}, {1, 3}, {2, 4}}, {{1, 2}, {2, 
    3}} -> {{3, 3}, {2, 1}, {1, 4}}, {{1, 2}, {2, 3}} -> {{3, 3}, {2, 
    1}, {2, 4}}, {{1, 2}, {2, 3}} -> {{3, 3}, {2, 3}, {1, 4}}, {{1, 
    2}, {2, 3}} -> {{3, 3}, {2, 3}, {2, 4}}, {{1, 2}, {2, 3}} -> {{3, 
    3}, {3, 1}, {1, 4}}, {{1, 2}, {2, 3}} -> {{3, 3}, {3, 1}, {2, 
    4}}, {{1, 2}, {2, 3}} -> {{3, 3}, {3, 1}, {3, 4}}, {{1, 2}, {2, 
    3}} -> {{3, 3}, {3, 2}, {1, 4}}, {{1, 2}, {2, 3}} -> {{3, 3}, {3, 
    2}, {2, 4}}, {{1, 2}, {2, 3}} -> {{3, 3}, {3, 2}, {3, 4}}, {{1, 
    2}, {2, 3}} -> {{3, 3}, {3, 3}, {1, 4}}, {{1, 2}, {2, 3}} -> {{3, 
    3}, {3, 3}, {2, 4}}, {{1, 2}, {2, 3}} -> {{3, 3}, {3, 3}, {3, 
    4}}, {{1, 2}, {1, 3}} -> {{1, 1}, {1, 1}, {4, 1}}, {{1, 2}, {1, 
    3}} -> {{1, 1}, {1, 1}, {4, 2}}, {{1, 2}, {1, 3}} -> {{1, 1}, {1, 
    2}, {4, 1}}, {{1, 2}, {1, 3}} -> {{1, 1}, {1, 2}, {4, 2}}, {{1, 
    2}, {1, 3}} -> {{1, 1}, {1, 2}, {4, 3}}, {{1, 2}, {1, 3}} -> {{1, 
    1}, {2, 1}, {4, 1}}, {{1, 2}, {1, 3}} -> {{1, 1}, {2, 1}, {4, 
    2}}, {{1, 2}, {1, 3}} -> {{1, 1}, {2, 1}, {4, 3}}, {{1, 2}, {1, 
    3}} -> {{1, 1}, {2, 2}, {4, 3}}, {{1, 2}, {1, 3}} -> {{1, 1}, {2, 
    3}, {4, 3}}, {{1, 2}, {1, 3}} -> {{1, 2}, {1, 2}, {4, 1}}, {{1, 
    2}, {1, 3}} -> {{1, 2}, {1, 2}, {4, 2}}, {{1, 2}, {1, 3}} -> {{1, 
    2}, {1, 2}, {4, 3}}, {{1, 2}, {1, 3}} -> {{1, 2}, {1, 3}, {4, 
    1}}, {{1, 2}, {1, 3}} -> {{1, 2}, {1, 3}, {4, 2}}, {{1, 2}, {1, 
    3}} -> {{1, 2}, {2, 1}, {4, 1}}, {{1, 2}, {1, 3}} -> {{1, 2}, {2, 
    1}, {4, 3}}, {{1, 2}, {1, 3}} -> {{1, 2}, {2, 3}, {4, 2}}, {{1, 
    2}, {1, 3}} -> {{1, 2}, {2, 3}, {4, 3}}, {{1, 2}, {1, 3}} -> {{1, 
    2}, {3, 2}, {4, 2}}, {{1, 2}, {1, 3}} -> {{2, 1}, {1, 2}, {4, 
    2}}, {{1, 2}, {1, 3}} -> {{2, 1}, {1, 3}, {4, 1}}, {{1, 2}, {1, 
    3}} -> {{2, 1}, {1, 3}, {4, 3}}, {{1, 2}, {1, 3}} -> {{2, 1}, {2, 
    1}, {4, 1}}, {{1, 2}, {1, 3}} -> {{2, 1}, {2, 1}, {4, 2}}, {{1, 
    2}, {1, 3}} -> {{2, 1}, {2, 1}, {4, 3}}, {{1, 2}, {1, 3}} -> {{2, 
    1}, {2, 3}, {4, 1}}, {{1, 2}, {1, 3}} -> {{2, 1}, {2, 3}, {4, 
    2}}, {{1, 2}, {1, 3}} -> {{2, 1}, {3, 1}, {4, 1}}, {{1, 2}, {1, 
    3}} -> {{2, 2}, {1, 2}, {4, 1}}, {{1, 2}, {1, 3}} -> {{2, 2}, {1, 
    2}, {4, 2}}, {{1, 2}, {1, 3}} -> {{2, 2}, {1, 2}, {4, 3}}, {{1, 
    2}, {1, 3}} -> {{2, 2}, {1, 3}, {4, 3}}, {{1, 2}, {1, 3}} -> {{2, 
    2}, {2, 1}, {4, 1}}, {{1, 2}, {1, 3}} -> {{2, 2}, {2, 1}, {4, 
    2}}, {{1, 2}, {1, 3}} -> {{2, 2}, {2, 1}, {4, 3}}, {{1, 2}, {1, 
    3}} -> {{2, 2}, {2, 2}, {4, 1}}, {{1, 2}, {1, 3}} -> {{2, 2}, {2, 
    2}, {4, 2}}, {{1, 2}, {1, 3}} -> {{2, 2}, {2, 2}, {4, 3}}, {{1, 
    2}, {1, 3}} -> {{2, 2}, {2, 3}, {4, 1}}, {{1, 2}, {1, 3}} -> {{2, 
    2}, {2, 3}, {4, 2}}, {{1, 2}, {1, 3}} -> {{2, 2}, {2, 3}, {4, 
    3}}, {{1, 2}, {1, 3}} -> {{2, 2}, {3, 1}, {4, 1}}, {{1, 2}, {1, 
    3}} -> {{2, 2}, {3, 2}, {4, 1}}, {{1, 2}, {1, 3}} -> {{2, 2}, {3, 
    2}, {4, 2}}, {{1, 2}, {1, 3}} -> {{2, 2}, {3, 2}, {4, 3}}, {{1, 
    2}, {1, 3}} -> {{2, 2}, {3, 3}, {4, 1}}, {{1, 2}, {1, 3}} -> {{2, 
    3}, {2, 1}, {4, 3}}, {{1, 2}, {1, 3}} -> {{2, 3}, {2, 3}, {4, 
    1}}, {{1, 2}, {1, 3}} -> {{2, 3}, {2, 3}, {4, 2}}, {{1, 2}, {1, 
    3}} -> {{2, 3}, {2, 3}, {4, 3}}, {{1, 2}, {1, 3}} -> {{2, 3}, {3, 
    1}, {4, 1}}, {{1, 2}, {1, 3}} -> {{2, 3}, {3, 1}, {4, 3}}, {{1, 
    2}, {1, 3}} -> {{2, 3}, {3, 2}, {4, 1}}, {{1, 2}, {1, 3}} -> {{2, 
    3}, {3, 2}, {4, 2}}, {{1, 2}, {2, 3}} -> {{1, 1}, {1, 1}, {4, 
    1}}, {{1, 2}, {2, 3}} -> {{1, 1}, {1, 1}, {4, 2}}, {{1, 2}, {2, 
    3}} -> {{1, 1}, {1, 1}, {4, 3}}, {{1, 2}, {2, 3}} -> {{1, 1}, {1, 
    2}, {4, 1}}, {{1, 2}, {2, 3}} -> {{1, 1}, {1, 2}, {4, 2}}, {{1, 
    2}, {2, 3}} -> {{1, 1}, {1, 2}, {4, 3}}, {{1, 2}, {2, 3}} -> {{1, 
    1}, {1, 3}, {4, 1}}, {{1, 2}, {2, 3}} -> {{1, 1}, {1, 3}, {4, 
    2}}, {{1, 2}, {2, 3}} -> {{1, 1}, {1, 3}, {4, 3}}, {{1, 2}, {2, 
    3}} -> {{1, 1}, {2, 1}, {4, 1}}, {{1, 2}, {2, 3}} -> {{1, 1}, {2, 
    1}, {4, 2}}, {{1, 2}, {2, 3}} -> {{1, 1}, {2, 1}, {4, 3}}, {{1, 
    2}, {2, 3}} -> {{1, 1}, {2, 2}, {4, 3}}, {{1, 2}, {2, 3}} -> {{1, 
    1}, {2, 3}, {4, 3}}, {{1, 2}, {2, 3}} -> {{1, 1}, {3, 1}, {4, 
    1}}, {{1, 2}, {2, 3}} -> {{1, 1}, {3, 1}, {4, 2}}, {{1, 2}, {2, 
    3}} -> {{1, 1}, {3, 1}, {4, 3}}, {{1, 2}, {2, 3}} -> {{1, 1}, {3, 
    2}, {4, 2}}, {{1, 2}, {2, 3}} -> {{1, 1}, {3, 3}, {4, 2}}, {{1, 
    2}, {2, 3}} -> {{1, 2}, {1, 2}, {4, 1}}, {{1, 2}, {2, 3}} -> {{1, 
    2}, {1, 2}, {4, 2}}, {{1, 2}, {2, 3}} -> {{1, 2}, {1, 2}, {4, 
    3}}, {{1, 2}, {2, 3}} -> {{1, 2}, {1, 3}, {4, 1}}, {{1, 2}, {2, 
    3}} -> {{1, 2}, {1, 3}, {4, 2}}, {{1, 2}, {2, 3}} -> {{1, 2}, {2, 
    1}, {4, 1}}, {{1, 2}, {2, 3}} -> {{1, 2}, {2, 1}, {4, 3}}, {{1, 
    2}, {2, 3}} -> {{1, 2}, {2, 3}, {4, 2}}, {{1, 2}, {2, 3}} -> {{1, 
    2}, {2, 3}, {4, 3}}, {{1, 2}, {2, 3}} -> {{1, 2}, {3, 2}, {4, 
    2}}, {{1, 2}, {2, 3}} -> {{1, 3}, {1, 2}, {4, 3}}, {{1, 2}, {2, 
    3}} -> {{1, 3}, {1, 3}, {4, 1}}, {{1, 2}, {2, 3}} -> {{1, 3}, {1, 
    3}, {4, 2}}, {{1, 2}, {2, 3}} -> {{1, 3}, {1, 3}, {4, 3}}, {{1, 
    2}, {2, 3}} -> {{1, 3}, {2, 3}, {4, 3}}, {{1, 2}, {2, 3}} -> {{1, 
    3}, {3, 1}, {4, 1}}, {{1, 2}, {2, 3}} -> {{1, 3}, {3, 1}, {4, 
    2}}, {{1, 2}, {2, 3}} -> {{1, 3}, {3, 2}, {4, 2}}, {{1, 2}, {2, 
    3}} -> {{1, 3}, {3, 2}, {4, 3}}, {{1, 2}, {2, 3}} -> {{2, 1}, {1, 
    2}, {4, 2}}, {{1, 2}, {2, 3}} -> {{2, 1}, {1, 3}, {4, 1}}, {{1, 
    2}, {2, 3}} -> {{2, 1}, {1, 3}, {4, 3}}, {{1, 2}, {2, 3}} -> {{2, 
    1}, {2, 1}, {4, 1}}, {{1, 2}, {2, 3}} -> {{2, 1}, {2, 1}, {4, 
    2}}, {{1, 2}, {2, 3}} -> {{2, 1}, {2, 1}, {4, 3}}, {{1, 2}, {2, 
    3}} -> {{2, 1}, {2, 3}, {4, 1}}, {{1, 2}, {2, 3}} -> {{2, 1}, {2, 
    3}, {4, 2}}, {{1, 2}, {2, 3}} -> {{2, 1}, {3, 1}, {4, 1}}, {{1, 
    2}, {2, 3}} -> {{2, 2}, {1, 2}, {4, 1}}, {{1, 2}, {2, 3}} -> {{2, 
    2}, {1, 2}, {4, 2}}, {{1, 2}, {2, 3}} -> {{2, 2}, {1, 2}, {4, 
    3}}, {{1, 2}, {2, 3}} -> {{2, 2}, {1, 3}, {4, 3}}, {{1, 2}, {2, 
    3}} -> {{2, 2}, {2, 1}, {4, 1}}, {{1, 2}, {2, 3}} -> {{2, 2}, {2, 
    1}, {4, 2}}, {{1, 2}, {2, 3}} -> {{2, 2}, {2, 1}, {4, 3}}, {{1, 
    2}, {2, 3}} -> {{2, 2}, {2, 2}, {4, 1}}, {{1, 2}, {2, 3}} -> {{2, 
    2}, {2, 2}, {4, 2}}, {{1, 2}, {2, 3}} -> {{2, 2}, {2, 2}, {4, 
    3}}, {{1, 2}, {2, 3}} -> {{2, 2}, {2, 3}, {4, 1}}, {{1, 2}, {2, 
    3}} -> {{2, 2}, {2, 3}, {4, 2}}, {{1, 2}, {2, 3}} -> {{2, 2}, {2, 
    3}, {4, 3}}, {{1, 2}, {2, 3}} -> {{2, 2}, {3, 1}, {4, 1}}, {{1, 
    2}, {2, 3}} -> {{2, 2}, {3, 2}, {4, 1}}, {{1, 2}, {2, 3}} -> {{2, 
    2}, {3, 2}, {4, 2}}, {{1, 2}, {2, 3}} -> {{2, 2}, {3, 2}, {4, 
    3}}, {{1, 2}, {2, 3}} -> {{2, 2}, {3, 3}, {4, 1}}, {{1, 2}, {2, 
    3}} -> {{2, 3}, {2, 1}, {4, 3}}, {{1, 2}, {2, 3}} -> {{2, 3}, {2, 
    3}, {4, 1}}, {{1, 2}, {2, 3}} -> {{2, 3}, {2, 3}, {4, 2}}, {{1, 
    2}, {2, 3}} -> {{2, 3}, {2, 3}, {4, 3}}, {{1, 2}, {2, 3}} -> {{2, 
    3}, {3, 1}, {4, 1}}, {{1, 2}, {2, 3}} -> {{2, 3}, {3, 1}, {4, 
    3}}, {{1, 2}, {2, 3}} -> {{2, 3}, {3, 2}, {4, 1}}, {{1, 2}, {2, 
    3}} -> {{2, 3}, {3, 2}, {4, 2}}, {{1, 2}, {2, 3}} -> {{3, 1}, {1, 
    2}, {4, 1}}, {{1, 2}, {2, 3}} -> {{3, 1}, {1, 2}, {4, 2}}, {{1, 
    2}, {2, 3}} -> {{3, 1}, {1, 3}, {4, 3}}, {{1, 2}, {2, 3}} -> {{3, 
    1}, {3, 1}, {4, 1}}, {{1, 2}, {2, 3}} -> {{3, 1}, {3, 1}, {4, 
    2}}, {{1, 2}, {2, 3}} -> {{3, 1}, {3, 1}, {4, 3}}, {{1, 2}, {2, 
    3}} -> {{3, 1}, {3, 2}, {4, 1}}, {{1, 2}, {2, 3}} -> {{3, 1}, {3, 
    2}, {4, 3}}, {{1, 2}, {2, 3}} -> {{3, 2}, {2, 1}, {4, 1}}, {{1, 
    2}, {2, 3}} -> {{3, 2}, {2, 1}, {4, 2}}, {{1, 2}, {2, 3}} -> {{3, 
    2}, {2, 3}, {4, 3}}, {{1, 2}, {2, 3}} -> {{3, 2}, {3, 1}, {4, 
    2}}, {{1, 2}, {2, 3}} -> {{3, 2}, {3, 2}, {4, 1}}, {{1, 2}, {2, 
    3}} -> {{3, 2}, {3, 2}, {4, 2}}, {{1, 2}, {2, 3}} -> {{3, 2}, {3, 
    2}, {4, 3}}, {{1, 2}, {2, 3}} -> {{3, 3}, {1, 2}, {4, 2}}, {{1, 
    2}, {2, 3}} -> {{3, 3}, {1, 3}, {4, 1}}, {{1, 2}, {2, 3}} -> {{3, 
    3}, {1, 3}, {4, 2}}, {{1, 2}, {2, 3}} -> {{3, 3}, {1, 3}, {4, 
    3}}, {{1, 2}, {2, 3}} -> {{3, 3}, {2, 1}, {4, 1}}, {{1, 2}, {2, 
    3}} -> {{3, 3}, {2, 3}, {4, 1}}, {{1, 2}, {2, 3}} -> {{3, 3}, {2, 
… (continued machine-generated enumeration of candidate rules that rewrite two relations into three, each entry of the form {{1, 2}, {1, 3}} -> {…} or {{1, 2}, {2, 3}} -> {…} with right-hand sides involving up to six elements) …
    3}} -> {{4, 3}, {3, 1}, {2, 5}}, {{1, 2}, {2, 3}} -> {{4, 3}, {3, 
    2}, {1, 5}}, {{1, 2}, {2, 3}} -> {{4, 3}, {3, 2}, {2, 5}}, {{1, 
    2}, {2, 3}} -> {{4, 3}, {3, 4}, {4, 5}}, {{1, 2}, {2, 3}} -> {{4, 
    3}, {4, 1}, {3, 5}}, {{1, 2}, {2, 3}} -> {{4, 3}, {4, 2}, {3, 
    5}}, {{1, 2}, {2, 3}} -> {{4, 3}, {4, 3}, {1, 5}}, {{1, 2}, {2, 
    3}} -> {{4, 3}, {4, 3}, {2, 5}}, {{1, 2}, {2, 3}} -> {{4, 3}, {4, 
    3}, {3, 5}}, {{1, 2}, {2, 3}} -> {{4, 3}, {4, 3}, {4, 5}}, {{1, 
    2}, {2, 3}} -> {{4, 4}, {1, 4}, {1, 5}}, {{1, 2}, {2, 3}} -> {{4, 
    4}, {1, 4}, {2, 5}}, {{1, 2}, {2, 3}} -> {{4, 4}, {1, 4}, {3, 
    5}}, {{1, 2}, {2, 3}} -> {{4, 4}, {2, 4}, {1, 5}}, {{1, 2}, {2, 
    3}} -> {{4, 4}, {2, 4}, {2, 5}}, {{1, 2}, {2, 3}} -> {{4, 4}, {2, 
    4}, {3, 5}}, {{1, 2}, {2, 3}} -> {{4, 4}, {3, 4}, {1, 5}}, {{1, 
    2}, {2, 3}} -> {{4, 4}, {3, 4}, {2, 5}}, {{1, 2}, {2, 3}} -> {{4, 
    4}, {3, 4}, {3, 5}}, {{1, 2}, {2, 3}} -> {{4, 4}, {4, 1}, {1, 
    5}}, {{1, 2}, {2, 3}} -> {{4, 4}, {4, 1}, {2, 5}}, {{1, 2}, {2, 
    3}} -> {{4, 4}, {4, 1}, {3, 5}}, {{1, 2}, {2, 3}} -> {{4, 4}, {4, 
    1}, {4, 5}}, {{1, 2}, {2, 3}} -> {{4, 4}, {4, 2}, {1, 5}}, {{1, 
    2}, {2, 3}} -> {{4, 4}, {4, 2}, {2, 5}}, {{1, 2}, {2, 3}} -> {{4, 
    4}, {4, 2}, {3, 5}}, {{1, 2}, {2, 3}} -> {{4, 4}, {4, 2}, {4, 
    5}}, {{1, 2}, {2, 3}} -> {{4, 4}, {4, 3}, {1, 5}}, {{1, 2}, {2, 
    3}} -> {{4, 4}, {4, 3}, {2, 5}}, {{1, 2}, {2, 3}} -> {{4, 4}, {4, 
    3}, {3, 5}}, {{1, 2}, {2, 3}} -> {{4, 4}, {4, 3}, {4, 5}}, {{1, 
    2}, {1, 3}} -> {{4, 1}, {1, 2}, {5, 1}}, {{1, 2}, {1, 3}} -> {{4, 
    1}, {1, 2}, {5, 2}}, {{1, 2}, {1, 3}} -> {{4, 1}, {1, 2}, {5, 
    3}}, {{1, 2}, {1, 3}} -> {{4, 1}, {1, 4}, {5, 4}}, {{1, 2}, {1, 
    3}} -> {{4, 1}, {4, 1}, {5, 1}}, {{1, 2}, {1, 3}} -> {{4, 1}, {4, 
    1}, {5, 2}}, {{1, 2}, {1, 3}} -> {{4, 1}, {4, 1}, {5, 4}}, {{1, 
    2}, {1, 3}} -> {{4, 1}, {4, 2}, {5, 1}}, {{1, 2}, {1, 3}} -> {{4, 
    1}, {4, 2}, {5, 3}}, {{1, 2}, {1, 3}} -> {{4, 1}, {4, 2}, {5, 
    4}}, {{1, 2}, {1, 3}} -> {{4, 2}, {2, 1}, {5, 1}}, {{1, 2}, {1, 
    3}} -> {{4, 2}, {2, 1}, {5, 2}}, {{1, 2}, {1, 3}} -> {{4, 2}, {2, 
    1}, {5, 3}}, {{1, 2}, {1, 3}} -> {{4, 2}, {2, 3}, {5, 1}}, {{1, 
    2}, {1, 3}} -> {{4, 2}, {2, 3}, {5, 2}}, {{1, 2}, {1, 3}} -> {{4, 
    2}, {2, 3}, {5, 3}}, {{1, 2}, {1, 3}} -> {{4, 2}, {2, 4}, {5, 
    4}}, {{1, 2}, {1, 3}} -> {{4, 2}, {4, 1}, {5, 2}}, {{1, 2}, {1, 
    3}} -> {{4, 2}, {4, 2}, {5, 1}}, {{1, 2}, {1, 3}} -> {{4, 2}, {4, 
    2}, {5, 2}}, {{1, 2}, {1, 3}} -> {{4, 2}, {4, 2}, {5, 3}}, {{1, 
    2}, {1, 3}} -> {{4, 2}, {4, 2}, {5, 4}}, {{1, 2}, {1, 3}} -> {{4, 
    2}, {4, 3}, {5, 1}}, {{1, 2}, {1, 3}} -> {{4, 2}, {4, 3}, {5, 
    2}}, {{1, 2}, {1, 3}} -> {{4, 2}, {4, 3}, {5, 4}}, {{1, 2}, {1, 
    3}} -> {{4, 4}, {1, 4}, {5, 1}}, {{1, 2}, {1, 3}} -> {{4, 4}, {1, 
    4}, {5, 2}}, {{1, 2}, {1, 3}} -> {{4, 4}, {1, 4}, {5, 4}}, {{1, 
    2}, {1, 3}} -> {{4, 4}, {2, 4}, {5, 1}}, {{1, 2}, {1, 3}} -> {{4, 
    4}, {2, 4}, {5, 2}}, {{1, 2}, {1, 3}} -> {{4, 4}, {2, 4}, {5, 
    3}}, {{1, 2}, {1, 3}} -> {{4, 4}, {2, 4}, {5, 4}}, {{1, 2}, {1, 
    3}} -> {{4, 4}, {4, 1}, {5, 1}}, {{1, 2}, {1, 3}} -> {{4, 4}, {4, 
    1}, {5, 2}}, {{1, 2}, {1, 3}} -> {{4, 4}, {4, 1}, {5, 4}}, {{1, 
    2}, {1, 3}} -> {{4, 4}, {4, 2}, {5, 1}}, {{1, 2}, {1, 3}} -> {{4, 
    4}, {4, 2}, {5, 2}}, {{1, 2}, {1, 3}} -> {{4, 4}, {4, 2}, {5, 
    3}}, {{1, 2}, {1, 3}} -> {{4, 4}, {4, 2}, {5, 4}}, {{1, 2}, {2, 
    3}} -> {{4, 1}, {1, 2}, {5, 1}}, {{1, 2}, {2, 3}} -> {{4, 1}, {1, 
    2}, {5, 2}}, {{1, 2}, {2, 3}} -> {{4, 1}, {1, 2}, {5, 3}}, {{1, 
    2}, {2, 3}} -> {{4, 1}, {1, 3}, {5, 1}}, {{1, 2}, {2, 3}} -> {{4, 
    1}, {1, 3}, {5, 2}}, {{1, 2}, {2, 3}} -> {{4, 1}, {1, 3}, {5, 
    3}}, {{1, 2}, {2, 3}} -> {{4, 1}, {1, 4}, {5, 4}}, {{1, 2}, {2, 
    3}} -> {{4, 1}, {4, 1}, {5, 1}}, {{1, 2}, {2, 3}} -> {{4, 1}, {4, 
    1}, {5, 2}}, {{1, 2}, {2, 3}} -> {{4, 1}, {4, 1}, {5, 3}}, {{1, 
    2}, {2, 3}} -> {{4, 1}, {4, 1}, {5, 4}}, {{1, 2}, {2, 3}} -> {{4, 
    1}, {4, 2}, {5, 1}}, {{1, 2}, {2, 3}} -> {{4, 1}, {4, 2}, {5, 
    3}}, {{1, 2}, {2, 3}} -> {{4, 1}, {4, 2}, {5, 4}}, {{1, 2}, {2, 
    3}} -> {{4, 1}, {4, 3}, {5, 1}}, {{1, 2}, {2, 3}} -> {{4, 1}, {4, 
    3}, {5, 2}}, {{1, 2}, {2, 3}} -> {{4, 1}, {4, 3}, {5, 4}}, {{1, 
    2}, {2, 3}} -> {{4, 2}, {2, 1}, {5, 1}}, {{1, 2}, {2, 3}} -> {{4, 
    2}, {2, 1}, {5, 2}}, {{1, 2}, {2, 3}} -> {{4, 2}, {2, 1}, {5, 
    3}}, {{1, 2}, {2, 3}} -> {{4, 2}, {2, 3}, {5, 1}}, {{1, 2}, {2, 
    3}} -> {{4, 2}, {2, 3}, {5, 2}}, {{1, 2}, {2, 3}} -> {{4, 2}, {2, 
    3}, {5, 3}}, {{1, 2}, {2, 3}} -> {{4, 2}, {2, 4}, {5, 4}}, {{1, 
    2}, {2, 3}} -> {{4, 2}, {4, 1}, {5, 2}}, {{1, 2}, {2, 3}} -> {{4, 
    2}, {4, 2}, {5, 1}}, {{1, 2}, {2, 3}} -> {{4, 2}, {4, 2}, {5, 
    2}}, {{1, 2}, {2, 3}} -> {{4, 2}, {4, 2}, {5, 3}}, {{1, 2}, {2, 
    3}} -> {{4, 2}, {4, 2}, {5, 4}}, {{1, 2}, {2, 3}} -> {{4, 2}, {4, 
    3}, {5, 1}}, {{1, 2}, {2, 3}} -> {{4, 2}, {4, 3}, {5, 2}}, {{1, 
    2}, {2, 3}} -> {{4, 2}, {4, 3}, {5, 4}}, {{1, 2}, {2, 3}} -> {{4, 
    3}, {3, 1}, {5, 1}}, {{1, 2}, {2, 3}} -> {{4, 3}, {3, 1}, {5, 
    2}}, {{1, 2}, {2, 3}} -> {{4, 3}, {3, 1}, {5, 3}}, {{1, 2}, {2, 
    3}} -> {{4, 3}, {3, 2}, {5, 1}}, {{1, 2}, {2, 3}} -> {{4, 3}, {3, 
    2}, {5, 2}}, {{1, 2}, {2, 3}} -> {{4, 3}, {3, 2}, {5, 3}}, {{1, 
    2}, {2, 3}} -> {{4, 3}, {3, 4}, {5, 4}}, {{1, 2}, {2, 3}} -> {{4, 
    3}, {4, 1}, {5, 3}}, {{1, 2}, {2, 3}} -> {{4, 3}, {4, 2}, {5, 
    3}}, {{1, 2}, {2, 3}} -> {{4, 3}, {4, 3}, {5, 1}}, {{1, 2}, {2, 
    3}} -> {{4, 3}, {4, 3}, {5, 2}}, {{1, 2}, {2, 3}} -> {{4, 3}, {4, 
    3}, {5, 3}}, {{1, 2}, {2, 3}} -> {{4, 3}, {4, 3}, {5, 4}}, {{1, 
    2}, {2, 3}} -> {{4, 4}, {1, 4}, {5, 1}}, {{1, 2}, {2, 3}} -> {{4, 
    4}, {1, 4}, {5, 2}}, {{1, 2}, {2, 3}} -> {{4, 4}, {1, 4}, {5, 
    3}}, {{1, 2}, {2, 3}} -> {{4, 4}, {1, 4}, {5, 4}}, {{1, 2}, {2, 
    3}} -> {{4, 4}, {2, 4}, {5, 1}}, {{1, 2}, {2, 3}} -> {{4, 4}, {2, 
    4}, {5, 2}}, {{1, 2}, {2, 3}} -> {{4, 4}, {2, 4}, {5, 3}}, {{1, 
    2}, {2, 3}} -> {{4, 4}, {2, 4}, {5, 4}}, {{1, 2}, {2, 3}} -> {{4, 
    4}, {3, 4}, {5, 1}}, {{1, 2}, {2, 3}} -> {{4, 4}, {3, 4}, {5, 
    2}}, {{1, 2}, {2, 3}} -> {{4, 4}, {3, 4}, {5, 3}}, {{1, 2}, {2, 
    3}} -> {{4, 4}, {3, 4}, {5, 4}}, {{1, 2}, {2, 3}} -> {{4, 4}, {4, 
    1}, {5, 1}}, {{1, 2}, {2, 3}} -> {{4, 4}, {4, 1}, {5, 2}}, {{1, 
    2}, {2, 3}} -> {{4, 4}, {4, 1}, {5, 3}}, {{1, 2}, {2, 3}} -> {{4, 
    4}, {4, 1}, {5, 4}}, {{1, 2}, {2, 3}} -> {{4, 4}, {4, 2}, {5, 
    1}}, {{1, 2}, {2, 3}} -> {{4, 4}, {4, 2}, {5, 2}}, {{1, 2}, {2, 
    3}} -> {{4, 4}, {4, 2}, {5, 3}}, {{1, 2}, {2, 3}} -> {{4, 4}, {4, 
    2}, {5, 4}}, {{1, 2}, {2, 3}} -> {{4, 4}, {4, 3}, {5, 1}}, {{1, 
    2}, {2, 3}} -> {{4, 4}, {4, 3}, {5, 2}}, {{1, 2}, {2, 3}} -> {{4, 
    4}, {4, 3}, {5, 3}}, {{1, 2}, {2, 3}} -> {{4, 4}, {4, 3}, {5, 
    4}}, {{1, 2}, {1, 3}} -> {{4, 1}, {1, 5}, {2, 3}}, {{1, 2}, {1, 
    3}} -> {{4, 1}, {1, 5}, {2, 5}}, {{1, 2}, {1, 3}} -> {{4, 1}, {1, 
    5}, {5, 2}}, {{1, 2}, {1, 3}} -> {{4, 1}, {4, 5}, {1, 2}}, {{1, 
    2}, {1, 3}} -> {{4, 1}, {4, 5}, {1, 5}}, {{1, 2}, {1, 3}} -> {{4, 
    1}, {4, 5}, {2, 1}}, {{1, 2}, {1, 3}} -> {{4, 1}, {4, 5}, {2, 
    3}}, {{1, 2}, {1, 3}} -> {{4, 1}, {4, 5}, {2, 4}}, {{1, 2}, {1, 
    3}} -> {{4, 2}, {2, 5}, {1, 3}}, {{1, 2}, {1, 3}} -> {{4, 2}, {2, 
    5}, {1, 5}}, {{1, 2}, {1, 3}} -> {{4, 2}, {2, 5}, {3, 1}}, {{1, 
    2}, {1, 3}} -> {{4, 2}, {2, 5}, {3, 5}}, {{1, 2}, {1, 3}} -> {{4, 
    2}, {2, 5}, {5, 1}}, {{1, 2}, {1, 3}} -> {{4, 2}, {2, 5}, {5, 
    3}}, {{1, 2}, {1, 3}} -> {{4, 2}, {4, 5}, {1, 2}}, {{1, 2}, {1, 
    3}} -> {{4, 2}, {4, 5}, {1, 3}}, {{1, 2}, {1, 3}} -> {{4, 2}, {4, 
    5}, {1, 4}}, {{1, 2}, {1, 3}} -> {{4, 2}, {4, 5}, {2, 1}}, {{1, 
    2}, {1, 3}} -> {{4, 2}, {4, 5}, {2, 3}}, {{1, 2}, {1, 3}} -> {{4, 
    2}, {4, 5}, {2, 5}}, {{1, 2}, {1, 3}} -> {{4, 2}, {4, 5}, {3, 
    1}}, {{1, 2}, {1, 3}} -> {{4, 2}, {4, 5}, {3, 2}}, {{1, 2}, {1, 
    3}} -> {{4, 2}, {4, 5}, {3, 4}}, {{1, 2}, {1, 3}} -> {{4, 4}, {4, 
    5}, {1, 4}}, {{1, 2}, {1, 3}} -> {{4, 4}, {4, 5}, {1, 5}}, {{1, 
    2}, {1, 3}} -> {{4, 4}, {4, 5}, {2, 4}}, {{1, 2}, {1, 3}} -> {{4, 
    4}, {4, 5}, {2, 5}}, {{1, 2}, {1, 3}} -> {{4, 4}, {4, 5}, {5, 
    1}}, {{1, 2}, {1, 3}} -> {{4, 4}, {4, 5}, {5, 2}}, {{1, 2}, {2, 
    3}} -> {{4, 1}, {1, 5}, {2, 3}}, {{1, 2}, {2, 3}} -> {{4, 1}, {1, 
    5}, {2, 5}}, {{1, 2}, {2, 3}} -> {{4, 1}, {1, 5}, {3, 2}}, {{1, 
    2}, {2, 3}} -> {{4, 1}, {1, 5}, {3, 5}}, {{1, 2}, {2, 3}} -> {{4, 
    1}, {1, 5}, {5, 2}}, {{1, 2}, {2, 3}} -> {{4, 1}, {1, 5}, {5, 
    3}}, {{1, 2}, {2, 3}} -> {{4, 1}, {4, 5}, {1, 2}}, {{1, 2}, {2, 
    3}} -> {{4, 1}, {4, 5}, {1, 3}}, {{1, 2}, {2, 3}} -> {{4, 1}, {4, 
    5}, {1, 5}}, {{1, 2}, {2, 3}} -> {{4, 1}, {4, 5}, {2, 1}}, {{1, 
    2}, {2, 3}} -> {{4, 1}, {4, 5}, {2, 3}}, {{1, 2}, {2, 3}} -> {{4, 
    1}, {4, 5}, {2, 4}}, {{1, 2}, {2, 3}} -> {{4, 1}, {4, 5}, {3, 
    1}}, {{1, 2}, {2, 3}} -> {{4, 1}, {4, 5}, {3, 2}}, {{1, 2}, {2, 
    3}} -> {{4, 1}, {4, 5}, {3, 4}}, {{1, 2}, {2, 3}} -> {{4, 2}, {2, 
    5}, {1, 3}}, {{1, 2}, {2, 3}} -> {{4, 2}, {2, 5}, {1, 5}}, {{1, 
    2}, {2, 3}} -> {{4, 2}, {2, 5}, {3, 1}}, {{1, 2}, {2, 3}} -> {{4, 
    2}, {2, 5}, {3, 5}}, {{1, 2}, {2, 3}} -> {{4, 2}, {2, 5}, {5, 
    1}}, {{1, 2}, {2, 3}} -> {{4, 2}, {2, 5}, {5, 3}}, {{1, 2}, {2, 
    3}} -> {{4, 2}, {4, 5}, {1, 2}}, {{1, 2}, {2, 3}} -> {{4, 2}, {4, 
    5}, {1, 3}}, {{1, 2}, {2, 3}} -> {{4, 2}, {4, 5}, {1, 4}}, {{1, 
    2}, {2, 3}} -> {{4, 2}, {4, 5}, {2, 1}}, {{1, 2}, {2, 3}} -> {{4, 
    2}, {4, 5}, {2, 3}}, {{1, 2}, {2, 3}} -> {{4, 2}, {4, 5}, {2, 
    5}}, {{1, 2}, {2, 3}} -> {{4, 2}, {4, 5}, {3, 1}}, {{1, 2}, {2, 
    3}} -> {{4, 2}, {4, 5}, {3, 2}}, {{1, 2}, {2, 3}} -> {{4, 2}, {4, 
    5}, {3, 4}}, {{1, 2}, {2, 3}} -> {{4, 3}, {3, 5}, {1, 2}}, {{1, 
    2}, {2, 3}} -> {{4, 3}, {3, 5}, {1, 5}}, {{1, 2}, {2, 3}} -> {{4, 
    3}, {3, 5}, {2, 1}}, {{1, 2}, {2, 3}} -> {{4, 3}, {3, 5}, {2, 
    5}}, {{1, 2}, {2, 3}} -> {{4, 3}, {3, 5}, {5, 1}}, {{1, 2}, {2, 
    3}} -> {{4, 3}, {3, 5}, {5, 2}}, {{1, 2}, {2, 3}} -> {{4, 3}, {4, 
    5}, {1, 2}}, {{1, 2}, {2, 3}} -> {{4, 3}, {4, 5}, {1, 3}}, {{1, 
    2}, {2, 3}} -> {{4, 3}, {4, 5}, {1, 4}}, {{1, 2}, {2, 3}} -> {{4, 
    3}, {4, 5}, {2, 1}}, {{1, 2}, {2, 3}} -> {{4, 3}, {4, 5}, {2, 
    3}}, {{1, 2}, {2, 3}} -> {{4, 3}, {4, 5}, {2, 4}}, {{1, 2}, {2, 
    3}} -> {{4, 3}, {4, 5}, {3, 1}}, {{1, 2}, {2, 3}} -> {{4, 3}, {4, 
    5}, {3, 2}}, {{1, 2}, {2, 3}} -> {{4, 3}, {4, 5}, {3, 5}}, {{1, 
    2}, {2, 3}} -> {{4, 4}, {4, 5}, {1, 4}}, {{1, 2}, {2, 3}} -> {{4, 
    4}, {4, 5}, {1, 5}}, {{1, 2}, {2, 3}} -> {{4, 4}, {4, 5}, {2, 
    4}}, {{1, 2}, {2, 3}} -> {{4, 4}, {4, 5}, {2, 5}}, {{1, 2}, {2, 
    3}} -> {{4, 4}, {4, 5}, {3, 4}}, {{1, 2}, {2, 3}} -> {{4, 4}, {4, 
    5}, {3, 5}}, {{1, 2}, {2, 3}} -> {{4, 4}, {4, 5}, {5, 1}}, {{1, 
    2}, {2, 3}} -> {{4, 4}, {4, 5}, {5, 2}}, {{1, 2}, {2, 3}} -> {{4, 
    4}, {4, 5}, {5, 3}}, {{1, 2}, {1, 3}} -> {{4, 1}, {1, 5}, {2, 
    6}}, {{1, 2}, {1, 3}} -> {{4, 1}, {1, 5}, {5, 6}}, {{1, 2}, {1, 
    3}} -> {{4, 1}, {4, 5}, {1, 6}}, {{1, 2}, {1, 3}} -> {{4, 1}, {4, 
    5}, {2, 6}}, {{1, 2}, {1, 3}} -> {{4, 1}, {4, 5}, {4, 6}}, {{1, 
    2}, {1, 3}} -> {{4, 2}, {2, 5}, {1, 6}}, {{1, 2}, {1, 3}} -> {{4, 
    2}, {2, 5}, {3, 6}}, {{1, 2}, {1, 3}} -> {{4, 2}, {2, 5}, {5, 
    6}}, {{1, 2}, {1, 3}} -> {{4, 2}, {4, 5}, {1, 6}}, {{1, 2}, {1, 
    3}} -> {{4, 2}, {4, 5}, {2, 6}}, {{1, 2}, {1, 3}} -> {{4, 2}, {4, 
    5}, {3, 6}}, {{1, 2}, {1, 3}} -> {{4, 2}, {4, 5}, {4, 6}}, {{1, 
    2}, {2, 3}} -> {{4, 1}, {1, 5}, {2, 6}}, {{1, 2}, {2, 3}} -> {{4, 
    1}, {1, 5}, {3, 6}}, {{1, 2}, {2, 3}} -> {{4, 1}, {1, 5}, {5, 
    6}}, {{1, 2}, {2, 3}} -> {{4, 1}, {4, 5}, {1, 6}}, {{1, 2}, {2, 
    3}} -> {{4, 1}, {4, 5}, {2, 6}}, {{1, 2}, {2, 3}} -> {{4, 1}, {4, 
    5}, {3, 6}}, {{1, 2}, {2, 3}} -> {{4, 1}, {4, 5}, {4, 6}}, {{1, 
    2}, {2, 3}} -> {{4, 2}, {2, 5}, {1, 6}}, {{1, 2}, {2, 3}} -> {{4, 
    2}, {2, 5}, {3, 6}}, {{1, 2}, {2, 3}} -> {{4, 2}, {2, 5}, {5, 
    6}}, {{1, 2}, {2, 3}} -> {{4, 2}, {4, 5}, {1, 6}}, {{1, 2}, {2, 
    3}} -> {{4, 2}, {4, 5}, {2, 6}}, {{1, 2}, {2, 3}} -> {{4, 2}, {4, 
    5}, {3, 6}}, {{1, 2}, {2, 3}} -> {{4, 2}, {4, 5}, {4, 6}}, {{1, 
    2}, {2, 3}} -> {{4, 3}, {3, 5}, {1, 6}}, {{1, 2}, {2, 3}} -> {{4, 
    3}, {3, 5}, {2, 6}}, {{1, 2}, {2, 3}} -> {{4, 3}, {3, 5}, {5, 
    6}}, {{1, 2}, {2, 3}} -> {{4, 3}, {4, 5}, {1, 6}}, {{1, 2}, {2, 
    3}} -> {{4, 3}, {4, 5}, {2, 6}}, {{1, 2}, {2, 3}} -> {{4, 3}, {4, 
    5}, {3, 6}}, {{1, 2}, {2, 3}} -> {{4, 3}, {4, 5}, {4, 6}}, {{1, 
    2}, {1, 3}} -> {{4, 1}, {1, 5}, {6, 1}}, {{1, 2}, {1, 3}} -> {{4, 
    1}, {1, 5}, {6, 2}}, {{1, 2}, {1, 3}} -> {{4, 1}, {1, 5}, {6, 
    5}}, {{1, 2}, {1, 3}} -> {{4, 1}, {4, 5}, {6, 1}}, {{1, 2}, {1, 
    3}} -> {{4, 1}, {4, 5}, {6, 2}}, {{1, 2}, {1, 3}} -> {{4, 1}, {4, 
    5}, {6, 4}}, {{1, 2}, {1, 3}} -> {{4, 2}, {2, 5}, {6, 1}}, {{1, 
    2}, {1, 3}} -> {{4, 2}, {2, 5}, {6, 2}}, {{1, 2}, {1, 3}} -> {{4, 
    2}, {2, 5}, {6, 3}}, {{1, 2}, {1, 3}} -> {{4, 2}, {2, 5}, {6, 
    5}}, {{1, 2}, {1, 3}} -> {{4, 2}, {4, 5}, {6, 1}}, {{1, 2}, {1, 
    3}} -> {{4, 2}, {4, 5}, {6, 2}}, {{1, 2}, {1, 3}} -> {{4, 2}, {4, 
    5}, {6, 3}}, {{1, 2}, {1, 3}} -> {{4, 2}, {4, 5}, {6, 4}}, {{1, 
    2}, {2, 3}} -> {{4, 1}, {1, 5}, {6, 1}}, {{1, 2}, {2, 3}} -> {{4, 
    1}, {1, 5}, {6, 2}}, {{1, 2}, {2, 3}} -> {{4, 1}, {1, 5}, {6, 
    3}}, {{1, 2}, {2, 3}} -> {{4, 1}, {1, 5}, {6, 5}}, {{1, 2}, {2, 
    3}} -> {{4, 1}, {4, 5}, {6, 1}}, {{1, 2}, {2, 3}} -> {{4, 1}, {4, 
    5}, {6, 2}}, {{1, 2}, {2, 3}} -> {{4, 1}, {4, 5}, {6, 3}}, {{1, 
    2}, {2, 3}} -> {{4, 1}, {4, 5}, {6, 4}}, {{1, 2}, {2, 3}} -> {{4, 
    2}, {2, 5}, {6, 1}}, {{1, 2}, {2, 3}} -> {{4, 2}, {2, 5}, {6, 
    2}}, {{1, 2}, {2, 3}} -> {{4, 2}, {2, 5}, {6, 3}}, {{1, 2}, {2, 
    3}} -> {{4, 2}, {2, 5}, {6, 5}}, {{1, 2}, {2, 3}} -> {{4, 2}, {4, 
    5}, {6, 1}}, {{1, 2}, {2, 3}} -> {{4, 2}, {4, 5}, {6, 2}}, {{1, 
    2}, {2, 3}} -> {{4, 2}, {4, 5}, {6, 3}}, {{1, 2}, {2, 3}} -> {{4, 
    2}, {4, 5}, {6, 4}}, {{1, 2}, {2, 3}} -> {{4, 3}, {3, 5}, {6, 
    1}}, {{1, 2}, {2, 3}} -> {{4, 3}, {3, 5}, {6, 2}}, {{1, 2}, {2, 
    3}} -> {{4, 3}, {3, 5}, {6, 3}}, {{1, 2}, {2, 3}} -> {{4, 3}, {3, 
    5}, {6, 5}}, {{1, 2}, {2, 3}} -> {{4, 3}, {4, 5}, {6, 1}}, {{1, 
    2}, {2, 3}} -> {{4, 3}, {4, 5}, {6, 2}}, {{1, 2}, {2, 3}} -> {{4, 
    3}, {4, 5}, {6, 3}}, {{1, 2}, {2, 3}} -> {{4, 3}, {4, 5}, {6, 
    4}}, {{1, 2}, {1, 3}} -> {{4, 1}, {5, 1}, {2, 3}}, {{1, 2}, {1, 
    3}} -> {{4, 2}, {5, 2}, {1, 3}}, {{1, 2}, {1, 3}} -> {{4, 2}, {5, 
    2}, {3, 1}}, {{1, 2}, {1, 3}} -> {{4, 4}, {5, 4}, {1, 5}}, {{1, 
    2}, {1, 3}} -> {{4, 4}, {5, 4}, {2, 5}}, {{1, 2}, {1, 3}} -> {{4, 
    4}, {5, 4}, {5, 1}}, {{1, 2}, {1, 3}} -> {{4, 4}, {5, 4}, {5, 
    2}}, {{1, 2}, {2, 3}} -> {{4, 1}, {5, 1}, {2, 3}}, {{1, 2}, {2, 
    3}} -> {{4, 1}, {5, 1}, {3, 2}}, {{1, 2}, {2, 3}} -> {{4, 2}, {5, 
    2}, {1, 3}}, {{1, 2}, {2, 3}} -> {{4, 2}, {5, 2}, {3, 1}}, {{1, 
    2}, {2, 3}} -> {{4, 3}, {5, 3}, {1, 2}}, {{1, 2}, {2, 3}} -> {{4, 
    3}, {5, 3}, {2, 1}}, {{1, 2}, {2, 3}} -> {{4, 4}, {5, 4}, {1, 
    5}}, {{1, 2}, {2, 3}} -> {{4, 4}, {5, 4}, {2, 5}}, {{1, 2}, {2, 
    3}} -> {{4, 4}, {5, 4}, {3, 5}}, {{1, 2}, {2, 3}} -> {{4, 4}, {5, 
    4}, {5, 1}}, {{1, 2}, {2, 3}} -> {{4, 4}, {5, 4}, {5, 2}}, {{1, 
    2}, {2, 3}} -> {{4, 4}, {5, 4}, {5, 3}}, {{1, 2}, {1, 3}} -> {{4, 
    1}, {5, 1}, {2, 6}}, {{1, 2}, {1, 3}} -> {{4, 2}, {5, 2}, {1, 
    6}}, {{1, 2}, {1, 3}} -> {{4, 2}, {5, 2}, {3, 6}}, {{1, 2}, {2, 
    3}} -> {{4, 1}, {5, 1}, {2, 6}}, {{1, 2}, {2, 3}} -> {{4, 1}, {5, 
    1}, {3, 6}}, {{1, 2}, {2, 3}} -> {{4, 2}, {5, 2}, {1, 6}}, {{1, 
    2}, {2, 3}} -> {{4, 2}, {5, 2}, {3, 6}}, {{1, 2}, {2, 3}} -> {{4, 
    3}, {5, 3}, {1, 6}}, {{1, 2}, {2, 3}} -> {{4, 3}, {5, 3}, {2, 
    6}}, {{1, 2}, {1, 3}} -> {{4, 1}, {5, 1}, {6, 1}}, {{1, 2}, {1, 
    3}} -> {{4, 1}, {5, 1}, {6, 2}}, {{1, 2}, {1, 3}} -> {{4, 1}, {5, 
    2}, {6, 3}}, {{1, 2}, {1, 3}} -> {{4, 2}, {5, 2}, {6, 1}}, {{1, 
    2}, {1, 3}} -> {{4, 2}, {5, 2}, {6, 2}}, {{1, 2}, {1, 3}} -> {{4, 
    2}, {5, 2}, {6, 3}}, {{1, 2}, {2, 3}} -> {{4, 1}, {5, 1}, {6, 
    1}}, {{1, 2}, {2, 3}} -> {{4, 1}, {5, 1}, {6, 2}}, {{1, 2}, {2, 
    3}} -> {{4, 1}, {5, 1}, {6, 3}}, {{1, 2}, {2, 3}} -> {{4, 1}, {5, 
    2}, {6, 3}}, {{1, 2}, {2, 3}} -> {{4, 2}, {5, 2}, {6, 1}}, {{1, 
    2}, {2, 3}} -> {{4, 2}, {5, 2}, {6, 2}}, {{1, 2}, {2, 3}} -> {{4, 
    2}, {5, 2}, {6, 3}}, {{1, 2}, {2, 3}} -> {{4, 3}, {5, 3}, {6, 
    1}}, {{1, 2}, {2, 3}} -> {{4, 3}, {5, 3}, {6, 2}}, {{1, 2}, {2, 
    3}} -> {{4, 3}, {5, 3}, {6, 3}}, {{1, 2}, {1, 3}} -> {{4, 5}, {4, 
    1}, {2, 5}}, {{1, 2}, {1, 3}} -> {{4, 5}, {4, 1}, {5, 1}}, {{1, 
    2}, {1, 3}} -> {{4, 5}, {4, 1}, {5, 2}}, {{1, 2}, {1, 3}} -> {{4, 
    5}, {4, 2}, {1, 5}}, {{1, 2}, {1, 3}} -> {{4, 5}, {4, 2}, {3, 
    5}}, {{1, 2}, {1, 3}} -> {{4, 5}, {4, 2}, {5, 1}}, {{1, 2}, {1, 
    3}} -> {{4, 5}, {4, 2}, {5, 2}}, {{1, 2}, {1, 3}} -> {{4, 5}, {4, 
    2}, {5, 3}}, {{1, 2}, {1, 3}} -> {{4, 5}, {4, 5}, {1, 4}}, {{1, 
    2}, {1, 3}} -> {{4, 5}, {4, 5}, {1, 5}}, {{1, 2}, {1, 3}} -> {{4, 
    5}, {4, 5}, {2, 4}}, {{1, 2}, {1, 3}} -> {{4, 5}, {4, 5}, {2, 
    5}}, {{1, 2}, {1, 3}} -> {{4, 5}, {4, 5}, {4, 1}}, {{1, 2}, {1, 
    3}} -> {{4, 5}, {4, 5}, {4, 2}}, {{1, 2}, {1, 3}} -> {{4, 5}, {4, 
    5}, {5, 1}}, {{1, 2}, {1, 3}} -> {{4, 5}, {4, 5}, {5, 2}}, {{1, 
    2}, {1, 3}} -> {{4, 5}, {5, 1}, {1, 2}}, {{1, 2}, {1, 3}} -> {{4, 
    5}, {5, 1}, {2, 1}}, {{1, 2}, {1, 3}} -> {{4, 5}, {5, 1}, {2, 
    3}}, {{1, 2}, {1, 3}} -> {{4, 5}, {5, 2}, {1, 2}}, {{1, 2}, {1, 
    3}} -> {{4, 5}, {5, 2}, {1, 3}}, {{1, 2}, {1, 3}} -> {{4, 5}, {5, 
    2}, {2, 1}}, {{1, 2}, {1, 3}} -> {{4, 5}, {5, 2}, {2, 3}}, {{1, 
    2}, {1, 3}} -> {{4, 5}, {5, 2}, {3, 1}}, {{1, 2}, {1, 3}} -> {{4, 
    5}, {5, 2}, {3, 2}}, {{1, 2}, {1, 3}} -> {{4, 5}, {5, 4}, {1, 
    4}}, {{1, 2}, {1, 3}} -> {{4, 5}, {5, 4}, {2, 4}}, {{1, 2}, {1, 
    3}} -> {{4, 5}, {5, 4}, {4, 1}}, {{1, 2}, {1, 3}} -> {{4, 5}, {5, 
    4}, {4, 2}}, {{1, 2}, {2, 3}} -> {{4, 5}, {4, 1}, {2, 5}}, {{1, 
    2}, {2, 3}} -> {{4, 5}, {4, 1}, {3, 5}}, {{1, 2}, {2, 3}} -> {{4, 
    5}, {4, 1}, {5, 1}}, {{1, 2}, {2, 3}} -> {{4, 5}, {4, 1}, {5, 
    2}}, {{1, 2}, {2, 3}} -> {{4, 5}, {4, 1}, {5, 3}}, {{1, 2}, {2, 
    3}} -> {{4, 5}, {4, 2}, {1, 5}}, {{1, 2}, {2, 3}} -> {{4, 5}, {4, 
    2}, {3, 5}}, {{1, 2}, {2, 3}} -> {{4, 5}, {4, 2}, {5, 1}}, {{1, 
    2}, {2, 3}} -> {{4, 5}, {4, 2}, {5, 2}}, {{1, 2}, {2, 3}} -> {{4, 
    5}, {4, 2}, {5, 3}}, {{1, 2}, {2, 3}} -> {{4, 5}, {4, 3}, {1, 
    5}}, {{1, 2}, {2, 3}} -> {{4, 5}, {4, 3}, {2, 5}}, {{1, 2}, {2, 
    3}} -> {{4, 5}, {4, 3}, {5, 1}}, {{1, 2}, {2, 3}} -> {{4, 5}, {4, 
    3}, {5, 2}}, {{1, 2}, {2, 3}} -> {{4, 5}, {4, 3}, {5, 3}}, {{1, 
    2}, {2, 3}} -> {{4, 5}, {4, 5}, {1, 4}}, {{1, 2}, {2, 3}} -> {{4, 
    5}, {4, 5}, {1, 5}}, {{1, 2}, {2, 3}} -> {{4, 5}, {4, 5}, {2, 
    4}}, {{1, 2}, {2, 3}} -> {{4, 5}, {4, 5}, {2, 5}}, {{1, 2}, {2, 
    3}} -> {{4, 5}, {4, 5}, {3, 4}}, {{1, 2}, {2, 3}} -> {{4, 5}, {4, 
    5}, {3, 5}}, {{1, 2}, {2, 3}} -> {{4, 5}, {4, 5}, {4, 1}}, {{1, 
    2}, {2, 3}} -> {{4, 5}, {4, 5}, {4, 2}}, {{1, 2}, {2, 3}} -> {{4, 
    5}, {4, 5}, {4, 3}}, {{1, 2}, {2, 3}} -> {{4, 5}, {4, 5}, {5, 
    1}}, {{1, 2}, {2, 3}} -> {{4, 5}, {4, 5}, {5, 2}}, {{1, 2}, {2, 
    3}} -> {{4, 5}, {4, 5}, {5, 3}}, {{1, 2}, {2, 3}} -> {{4, 5}, {5, 
    1}, {1, 2}}, {{1, 2}, {2, 3}} -> {{4, 5}, {5, 1}, {1, 3}}, {{1, 
    2}, {2, 3}} -> {{4, 5}, {5, 1}, {2, 1}}, {{1, 2}, {2, 3}} -> {{4, 
    5}, {5, 1}, {2, 3}}, {{1, 2}, {2, 3}} -> {{4, 5}, {5, 1}, {3, 
    1}}, {{1, 2}, {2, 3}} -> {{4, 5}, {5, 1}, {3, 2}}, {{1, 2}, {2, 
    3}} -> {{4, 5}, {5, 2}, {1, 2}}, {{1, 2}, {2, 3}} -> {{4, 5}, {5, 
    2}, {1, 3}}, {{1, 2}, {2, 3}} -> {{4, 5}, {5, 2}, {2, 1}}, {{1, 
    2}, {2, 3}} -> {{4, 5}, {5, 2}, {2, 3}}, {{1, 2}, {2, 3}} -> {{4, 
    5}, {5, 2}, {3, 1}}, {{1, 2}, {2, 3}} -> {{4, 5}, {5, 2}, {3, 
    2}}, {{1, 2}, {2, 3}} -> {{4, 5}, {5, 3}, {1, 2}}, {{1, 2}, {2, 
    3}} -> {{4, 5}, {5, 3}, {1, 3}}, {{1, 2}, {2, 3}} -> {{4, 5}, {5, 
    3}, {2, 1}}, {{1, 2}, {2, 3}} -> {{4, 5}, {5, 3}, {2, 3}}, {{1, 
    2}, {2, 3}} -> {{4, 5}, {5, 3}, {3, 1}}, {{1, 2}, {2, 3}} -> {{4, 
    5}, {5, 3}, {3, 2}}, {{1, 2}, {2, 3}} -> {{4, 5}, {5, 4}, {1, 
    4}}, {{1, 2}, {2, 3}} -> {{4, 5}, {5, 4}, {2, 4}}, {{1, 2}, {2, 
    3}} -> {{4, 5}, {5, 4}, {3, 4}}, {{1, 2}, {2, 3}} -> {{4, 5}, {5, 
    4}, {4, 1}}, {{1, 2}, {2, 3}} -> {{4, 5}, {5, 4}, {4, 2}}, {{1, 
    2}, {2, 3}} -> {{4, 5}, {5, 4}, {4, 3}}, {{1, 2}, {1, 3}} -> {{4, 
    5}, {4, 1}, {5, 6}}, {{1, 2}, {1, 3}} -> {{4, 5}, {4, 2}, {5, 
    6}}, {{1, 2}, {1, 3}} -> {{4, 5}, {5, 1}, {1, 6}}, {{1, 2}, {1, 
    3}} -> {{4, 5}, {5, 1}, {2, 6}}, {{1, 2}, {1, 3}} -> {{4, 5}, {5, 
    2}, {1, 6}}, {{1, 2}, {1, 3}} -> {{4, 5}, {5, 2}, {2, 6}}, {{1, 
    2}, {1, 3}} -> {{4, 5}, {5, 2}, {3, 6}}, {{1, 2}, {2, 3}} -> {{4, 
    5}, {4, 1}, {5, 6}}, {{1, 2}, {2, 3}} -> {{4, 5}, {4, 2}, {5, 
    6}}, {{1, 2}, {2, 3}} -> {{4, 5}, {4, 3}, {5, 6}}, {{1, 2}, {2, 
    3}} -> {{4, 5}, {5, 1}, {1, 6}}, {{1, 2}, {2, 3}} -> {{4, 5}, {5, 
    1}, {2, 6}}, {{1, 2}, {2, 3}} -> {{4, 5}, {5, 1}, {3, 6}}, {{1, 
    2}, {2, 3}} -> {{4, 5}, {5, 2}, {1, 6}}, {{1, 2}, {2, 3}} -> {{4, 
    5}, {5, 2}, {2, 6}}, {{1, 2}, {2, 3}} -> {{4, 5}, {5, 2}, {3, 
    6}}, {{1, 2}, {2, 3}} -> {{4, 5}, {5, 3}, {1, 6}}, {{1, 2}, {2, 
    3}} -> {{4, 5}, {5, 3}, {2, 6}}, {{1, 2}, {2, 3}} -> {{4, 5}, {5, 
    3}, {3, 6}}, {{1, 2}, {1, 3}} -> {{4, 5}, {4, 1}, {6, 5}}, {{1, 
    2}, {1, 3}} -> {{4, 5}, {4, 2}, {6, 5}}, {{1, 2}, {1, 3}} -> {{4, 
    5}, {5, 1}, {6, 1}}, {{1, 2}, {1, 3}} -> {{4, 5}, {5, 1}, {6, 
    2}}, {{1, 2}, {1, 3}} -> {{4, 5}, {5, 1}, {6, 5}}, {{1, 2}, {1, 
    3}} -> {{4, 5}, {5, 2}, {6, 1}}, {{1, 2}, {1, 3}} -> {{4, 5}, {5, 
    2}, {6, 2}}, {{1, 2}, {1, 3}} -> {{4, 5}, {5, 2}, {6, 3}}, {{1, 
    2}, {1, 3}} -> {{4, 5}, {5, 2}, {6, 5}}, {{1, 2}, {2, 3}} -> {{4, 
    5}, {4, 1}, {6, 5}}, {{1, 2}, {2, 3}} -> {{4, 5}, {4, 2}, {6, 
    5}}, {{1, 2}, {2, 3}} -> {{4, 5}, {4, 3}, {6, 5}}, {{1, 2}, {2, 
    3}} -> {{4, 5}, {5, 1}, {6, 1}}, {{1, 2}, {2, 3}} -> {{4, 5}, {5, 
    1}, {6, 2}}, {{1, 2}, {2, 3}} -> {{4, 5}, {5, 1}, {6, 3}}, {{1, 
    2}, {2, 3}} -> {{4, 5}, {5, 1}, {6, 5}}, {{1, 2}, {2, 3}} -> {{4, 
    5}, {5, 2}, {6, 1}}, {{1, 2}, {2, 3}} -> {{4, 5}, {5, 2}, {6, 
    2}}, {{1, 2}, {2, 3}} -> {{4, 5}, {5, 2}, {6, 3}}, {{1, 2}, {2, 
    3}} -> {{4, 5}, {5, 2}, {6, 5}}, {{1, 2}, {2, 3}} -> {{4, 5}, {5, 
    3}, {6, 1}}, {{1, 2}, {2, 3}} -> {{4, 5}, {5, 3}, {6, 2}}, {{1, 
    2}, {2, 3}} -> {{4, 5}, {5, 3}, {6, 3}}, {{1, 2}, {2, 3}} -> {{4, 
    5}, {5, 3}, {6, 5}}, {{1, 2}, {1, 3}} -> {{4, 5}, {4, 6}, {1, 
    4}}, {{1, 2}, {1, 3}} -> {{4, 5}, {4, 6}, {1, 5}}, {{1, 2}, {1, 
    3}} -> {{4, 5}, {4, 6}, {2, 4}}, {{1, 2}, {1, 3}} -> {{4, 5}, {4, 
    6}, {2, 5}}, {{1, 2}, {1, 3}} -> {{4, 5}, {4, 6}, {5, 1}}, {{1, 
    2}, {1, 3}} -> {{4, 5}, {4, 6}, {5, 2}}, {{1, 2}, {1, 3}} -> {{4, 
    5}, {5, 6}, {1, 6}}, {{1, 2}, {1, 3}} -> {{4, 5}, {5, 6}, {2, 
    6}}, {{1, 2}, {1, 3}} -> {{4, 5}, {5, 6}, {6, 1}}, {{1, 2}, {1, 
    3}} -> {{4, 5}, {5, 6}, {6, 2}}, {{1, 2}, {2, 3}} -> {{4, 5}, {4, 
    6}, {1, 4}}, {{1, 2}, {2, 3}} -> {{4, 5}, {4, 6}, {1, 5}}, {{1, 
    2}, {2, 3}} -> {{4, 5}, {4, 6}, {2, 4}}, {{1, 2}, {2, 3}} -> {{4, 
    5}, {4, 6}, {2, 5}}, {{1, 2}, {2, 3}} -> {{4, 5}, {4, 6}, {3, 
    4}}, {{1, 2}, {2, 3}} -> {{4, 5}, {4, 6}, {3, 5}}, {{1, 2}, {2, 
    3}} -> {{4, 5}, {4, 6}, {5, 1}}, {{1, 2}, {2, 3}} -> {{4, 5}, {4, 
    6}, {5, 2}}, {{1, 2}, {2, 3}} -> {{4, 5}, {4, 6}, {5, 3}}, {{1, 
    2}, {2, 3}} -> {{4, 5}, {5, 6}, {1, 6}}, {{1, 2}, {2, 3}} -> {{4, 
    5}, {5, 6}, {2, 6}}, {{1, 2}, {2, 3}} -> {{4, 5}, {5, 6}, {3, 
    6}}, {{1, 2}, {2, 3}} -> {{4, 5}, {5, 6}, {6, 1}}, {{1, 2}, {2, 
    3}} -> {{4, 5}, {5, 6}, {6, 2}}, {{1, 2}, {2, 3}} -> {{4, 5}, {5, 
    6}, {6, 3}}, {{1, 2}, {3, 2}} -> {{1, 1}, {1, 1}, {1, 1}}, {{1, 
    2}, {3, 2}} -> {{1, 1}, {1, 1}, {1, 2}}, {{1, 2}, {3, 2}} -> {{1, 
    1}, {1, 1}, {1, 3}}, {{1, 2}, {3, 2}} -> {{1, 1}, {1, 1}, {2, 
    1}}, {{1, 2}, {3, 2}} -> {{1, 1}, {1, 1}, {2, 2}}, {{1, 2}, {3, 
    2}} -> {{1, 1}, {1, 1}, {2, 3}}, {{1, 2}, {3, 2}} -> {{1, 1}, {1, 
    1}, {3, 1}}, {{1, 2}, {3, 2}} -> {{1, 1}, {1, 1}, {3, 2}}, {{1, 
    2}, {3, 2}} -> {{1, 1}, {1, 1}, {3, 3}}, {{1, 2}, {3, 2}} -> {{1, 
    1}, {1, 2}, {1, 2}}, {{1, 2}, {3, 2}} -> {{1, 1}, {1, 2}, {1, 
    3}}, {{1, 2}, {3, 2}} -> {{1, 1}, {1, 2}, {2, 1}}, {{1, 2}, {3, 
    2}} -> {{1, 1}, {1, 2}, {2, 2}}, {{1, 2}, {3, 2}} -> {{1, 1}, {1, 
    2}, {2, 3}}, {{1, 2}, {3, 2}} -> {{1, 1}, {1, 2}, {3, 1}}, {{1, 
    2}, {3, 2}} -> {{1, 1}, {1, 2}, {3, 2}}, {{1, 2}, {3, 2}} -> {{1, 
    1}, {1, 2}, {3, 3}}, {{1, 2}, {3, 2}} -> {{1, 1}, {1, 3}, {1, 
    3}}, {{1, 2}, {3, 2}} -> {{1, 1}, {1, 3}, {2, 1}}, {{1, 2}, {3, 
    2}} -> {{1, 1}, {1, 3}, {2, 2}}, {{1, 2}, {3, 2}} -> {{1, 1}, {1, 
    3}, {2, 3}}, {{1, 2}, {3, 2}} -> {{1, 1}, {1, 3}, {3, 1}}, {{1, 
    2}, {3, 2}} -> {{1, 1}, {1, 3}, {3, 2}}, {{1, 2}, {3, 2}} -> {{1, 
    1}, {1, 3}, {3, 3}}, {{1, 2}, {3, 2}} -> {{1, 1}, {2, 1}, {2, 
    1}}, {{1, 2}, {3, 2}} -> {{1, 1}, {2, 1}, {2, 3}}, {{1, 2}, {3, 
    2}} -> {{1, 1}, {2, 1}, {3, 1}}, {{1, 2}, {3, 2}} -> {{1, 1}, {2, 
    1}, {3, 2}}, {{1, 2}, {3, 2}} -> {{1, 1}, {2, 1}, {3, 3}}, {{1, 
    2}, {3, 2}} -> {{1, 1}, {2, 2}, {3, 3}}, {{1, 2}, {3, 2}} -> {{1, 
    1}, {2, 3}, {2, 3}}, {{1, 2}, {3, 2}} -> {{1, 1}, {2, 3}, {3, 
    2}}, {{1, 2}, {3, 2}} -> {{1, 1}, {3, 1}, {2, 2}}, {{1, 2}, {3, 
    2}} -> {{1, 1}, {3, 1}, {2, 3}}, {{1, 2}, {3, 2}} -> {{1, 1}, {3, 
    1}, {3, 1}}, {{1, 2}, {3, 2}} -> {{1, 1}, {3, 1}, {3, 2}}, {{1, 
    2}, {3, 2}} -> {{1, 1}, {3, 2}, {3, 2}}, {{1, 2}, {3, 2}} -> {{1, 
    2}, {1, 2}, {1, 2}}, {{1, 2}, {3, 2}} -> {{1, 2}, {1, 2}, {1, 
    3}}, {{1, 2}, {3, 2}} -> {{1, 2}, {1, 2}, {2, 1}}, {{1, 2}, {3, 
    2}} -> {{1, 2}, {1, 2}, {2, 3}}, {{1, 2}, {3, 2}} -> {{1, 2}, {1, 
    2}, {3, 1}}, {{1, 2}, {3, 2}} -> {{1, 2}, {1, 2}, {3, 2}}, {{1, 
    2}, {3, 2}} -> {{1, 2}, {1, 3}, {2, 3}}, {{1, 2}, {3, 2}} -> {{1, 
    2}, {2, 1}, {1, 3}}, {{1, 2}, {3, 2}} -> {{1, 2}, {2, 1}, {3, 
    1}}, {{1, 2}, {3, 2}} -> {{1, 2}, {2, 3}, {3, 1}}, {{1, 2}, {3, 
    2}} -> {{1, 3}, {1, 2}, {3, 2}}, {{1, 2}, {3, 2}} -> {{1, 3}, {1, 
    3}, {1, 2}}, {{1, 2}, {3, 2}} -> {{1, 3}, {1, 3}, {1, 3}}, {{1, 
    2}, {3, 2}} -> {{1, 3}, {1, 3}, {2, 1}}, {{1, 2}, {3, 2}} -> {{1, 
    3}, {1, 3}, {2, 3}}, {{1, 2}, {3, 2}} -> {{1, 3}, {1, 3}, {3, 
    1}}, {{1, 2}, {3, 2}} -> {{1, 3}, {1, 3}, {3, 2}}, {{1, 2}, {3, 
    2}} -> {{1, 3}, {3, 1}, {1, 2}}, {{1, 2}, {3, 2}} -> {{1, 3}, {3, 
    1}, {2, 1}}, {{1, 2}, {3, 2}} -> {{2, 1}, {1, 2}, {2, 3}}, {{1, 
    2}, {3, 2}} -> {{2, 1}, {1, 2}, {3, 2}}, {{1, 2}, {3, 2}} -> {{2, 
    1}, {2, 1}, {1, 2}}, {{1, 2}, {3, 2}} -> {{2, 1}, {2, 1}, {1, 
    3}}, {{1, 2}, {3, 2}} -> {{2, 1}, {2, 1}, {2, 1}}, {{1, 2}, {3, 
    2}} -> {{2, 1}, {2, 1}, {2, 3}}, {{1, 2}, {3, 2}} -> {{2, 1}, {2, 
    1}, {3, 1}}, {{1, 2}, {3, 2}} -> {{2, 1}, {2, 1}, {3, 2}}, {{1, 
    2}, {3, 2}} -> {{2, 1}, {2, 3}, {1, 3}}, {{1, 2}, {3, 2}} -> {{2, 
    2}, {1, 2}, {1, 2}}, {{1, 2}, {3, 2}} -> {{2, 2}, {1, 2}, {1, 
    3}}, {{1, 2}, {3, 2}} -> {{2, 2}, {1, 2}, {3, 1}}, {{1, 2}, {3, 
    2}} -> {{2, 2}, {1, 2}, {3, 2}}, {{1, 2}, {3, 2}} -> {{2, 2}, {1, 
    2}, {3, 3}}, {{1, 2}, {3, 2}} -> {{2, 2}, {1, 3}, {1, 3}}, {{1, 
    2}, {3, 2}} -> {{2, 2}, {1, 3}, {3, 1}}, {{1, 2}, {3, 2}} -> {{2, 
    2}, {2, 1}, {1, 1}}, {{1, 2}, {3, 2}} -> {{2, 2}, {2, 1}, {1, 
    2}}, {{1, 2}, {3, 2}} -> {{2, 2}, {2, 1}, {1, 3}}, {{1, 2}, {3, 
    2}} -> {{2, 2}, {2, 1}, {2, 1}}, {{1, 2}, {3, 2}} -> {{2, 2}, {2, 
    1}, {2, 3}}, {{1, 2}, {3, 2}} -> {{2, 2}, {2, 1}, {3, 1}}, {{1, 
    2}, {3, 2}} -> {{2, 2}, {2, 1}, {3, 2}}, {{1, 2}, {3, 2}} -> {{2, 
    2}, {2, 1}, {3, 3}}, {{1, 2}, {3, 2}} -> {{2, 2}, {2, 2}, {1, 
    1}}, {{1, 2}, {3, 2}} -> {{2, 2}, {2, 2}, {1, 2}}, {{1, 2}, {3, 
    2}} -> {{2, 2}, {2, 2}, {1, 3}}, {{1, 2}, {3, 2}} -> {{2, 2}, {2, 
    2}, {2, 1}}, {{1, 2}, {3, 2}} -> {{2, 2}, {2, 2}, {2, 2}}, {{1, 
    2}, {3, 2}} -> {{1, 1}, {1, 1}, {1, 4}}, {{1, 2}, {3, 2}} -> {{1, 
    1}, {1, 1}, {2, 4}}, {{1, 2}, {3, 2}} -> {{1, 1}, {1, 1}, {3, 
    4}}, {{1, 2}, {3, 2}} -> {{1, 1}, {1, 2}, {1, 4}}, {{1, 2}, {3, 
    2}} -> {{1, 1}, {1, 2}, {2, 4}}, {{1, 2}, {3, 2}} -> {{1, 1}, {1, 
    2}, {3, 4}}, {{1, 2}, {3, 2}} -> {{1, 1}, {1, 3}, {1, 4}}, {{1, 
    2}, {3, 2}} -> {{1, 1}, {1, 3}, {2, 4}}, {{1, 2}, {3, 2}} -> {{1, 
    1}, {1, 3}, {3, 4}}, {{1, 2}, {3, 2}} -> {{1, 1}, {2, 1}, {2, 
    4}}, {{1, 2}, {3, 2}} -> {{1, 1}, {2, 1}, {3, 4}}, {{1, 2}, {3, 
    2}} -> {{1, 1}, {2, 2}, {3, 4}}, {{1, 2}, {3, 2}} -> {{1, 1}, {2, 
    3}, {2, 4}}, {{1, 2}, {3, 2}} -> {{1, 1}, {2, 3}, {3, 4}}, {{1, 
    2}, {3, 2}} -> {{1, 1}, {3, 1}, {2, 4}}, {{1, 2}, {3, 2}} -> {{1, 
    1}, {3, 1}, {3, 4}}, {{1, 2}, {3, 2}} -> {{1, 1}, {3, 2}, {2, 
    4}}, {{1, 2}, {3, 2}} -> {{1, 1}, {3, 2}, {3, 4}}, {{1, 2}, {3, 
    2}} -> {{1, 1}, {3, 3}, {2, 4}}, {{1, 2}, {3, 2}} -> {{1, 2}, {1, 
    2}, {1, 4}}, {{1, 2}, {3, 2}} -> {{1, 2}, {1, 2}, {2, 4}}, {{1, 
    2}, {3, 2}} -> {{1, 2}, {1, 2}, {3, 4}}, {{1, 2}, {3, 2}} -> {{1, 
    2}, {1, 3}, {1, 4}}, {{1, 2}, {3, 2}} -> {{1, 2}, {1, 3}, {2, 
    4}}, {{1, 2}, {3, 2}} -> {{1, 2}, {2, 1}, {1, 4}}, {{1, 2}, {3, 
    2}} -> {{1, 2}, {2, 1}, {3, 4}}, {{1, 2}, {3, 2}} -> {{1, 2}, {2, 
    3}, {3, 4}}, {{1, 2}, {3, 2}} -> {{1, 3}, {1, 2}, {3, 4}}, {{1, 
    2}, {3, 2}} -> {{1, 3}, {1, 3}, {1, 4}}, {{1, 2}, {3, 2}} -> {{1, 
    3}, {1, 3}, {2, 4}}, {{1, 2}, {3, 2}} -> {{1, 3}, {1, 3}, {3, 
    4}}, {{1, 2}, {3, 2}} -> {{1, 3}, {3, 1}, {1, 4}}, {{1, 2}, {3, 
    2}} -> {{1, 3}, {3, 1}, {2, 4}}, {{1, 2}, {3, 2}} -> {{1, 3}, {3, 
    2}, {2, 4}}, {{1, 2}, {3, 2}} -> {{2, 1}, {1, 2}, {2, 4}}, {{1, 
    2}, {3, 2}} -> {{2, 1}, {1, 3}, {3, 4}}, {{1, 2}, {3, 2}} -> {{2, 
    1}, {2, 1}, {1, 4}}, {{1, 2}, {3, 2}} -> {{2, 1}, {2, 1}, {2, 
    4}}, {{1, 2}, {3, 2}} -> {{2, 1}, {2, 1}, {3, 4}}, {{1, 2}, {3, 
    2}} -> {{2, 1}, {2, 3}, {1, 4}}, {{1, 2}, {3, 2}} -> {{2, 1}, {2, 
    3}, {2, 4}}, {{1, 2}, {3, 2}} -> {{2, 2}, {1, 2}, {1, 4}}, {{1, 
    2}, {3, 2}} -> {{2, 2}, {1, 2}, {3, 4}}, {{1, 2}, {3, 2}} -> {{2, 
    2}, {1, 3}, {1, 4}}, {{1, 2}, {3, 2}} -> {{2, 2}, {1, 3}, {3, 
    4}}, {{1, 2}, {3, 2}} -> {{2, 2}, {2, 1}, {1, 4}}, {{1, 2}, {3, 
    2}} -> {{2, 2}, {2, 1}, {2, 4}}, {{1, 2}, {3, 2}} -> {{2, 2}, {2, 
    1}, {3, 4}}, {{1, 2}, {3, 2}} -> {{2, 2}, {2, 2}, {1, 4}}, {{1, 
    2}, {3, 2}} -> {{2, 2}, {2, 2}, {2, 4}}, {{1, 2}, {3, 2}} -> {{1, 
    1}, {1, 1}, {4, 1}}, {{1, 2}, {3, 2}} -> {{1, 1}, {1, 1}, {4, 
    2}}, {{1, 2}, {3, 2}} -> {{1, 1}, {1, 1}, {4, 3}}, {{1, 2}, {3, 
    2}} -> {{1, 1}, {1, 2}, {4, 1}}, {{1, 2}, {3, 2}} -> {{1, 1}, {1, 
    2}, {4, 2}}, {{1, 2}, {3, 2}} -> {{1, 1}, {1, 2}, {4, 3}}, {{1, 
    2}, {3, 2}} -> {{1, 1}, {1, 3}, {4, 1}}, {{1, 2}, {3, 2}} -> {{1, 
    1}, {1, 3}, {4, 2}}, {{1, 2}, {3, 2}} -> {{1, 1}, {1, 3}, {4, 
    3}}, {{1, 2}, {3, 2}} -> {{1, 1}, {2, 1}, {4, 1}}, {{1, 2}, {3, 
    2}} -> {{1, 1}, {2, 1}, {4, 2}}, {{1, 2}, {3, 2}} -> {{1, 1}, {2, 
    1}, {4, 3}}, {{1, 2}, {3, 2}} -> {{1, 1}, {2, 2}, {4, 3}}, {{1, 
    2}, {3, 2}} -> {{1, 1}, {2, 3}, {4, 3}}, {{1, 2}, {3, 2}} -> {{1, 
    1}, {3, 1}, {4, 1}}, {{1, 2}, {3, 2}} -> {{1, 1}, {3, 1}, {4, 
    2}}, {{1, 2}, {3, 2}} -> {{1, 1}, {3, 1}, {4, 3}}, {{1, 2}, {3, 
    2}} -> {{1, 1}, {3, 2}, {4, 2}}, {{1, 2}, {3, 2}} -> {{1, 1}, {3, 
    3}, {4, 2}}, {{1, 2}, {3, 2}} -> {{1, 2}, {1, 2}, {4, 1}}, {{1, 
    2}, {3, 2}} -> {{1, 2}, {1, 2}, {4, 2}}, {{1, 2}, {3, 2}} -> {{1, 
    2}, {1, 2}, {4, 3}}, {{1, 2}, {3, 2}} -> {{1, 2}, {1, 3}, {4, 
    1}}, {{1, 2}, {3, 2}} -> {{1, 2}, {1, 3}, {4, 2}}, {{1, 2}, {3, 
    2}} -> {{1, 2}, {2, 1}, {4, 1}}, {{1, 2}, {3, 2}} -> {{1, 2}, {2, 
    1}, {4, 3}}, {{1, 2}, {3, 2}} -> {{1, 2}, {2, 3}, {4, 2}}, {{1, 
    2}, {3, 2}} -> {{1, 2}, {2, 3}, {4, 3}}, {{1, 2}, {3, 2}} -> {{1, 
    2}, {3, 2}, {4, 2}}, {{1, 2}, {3, 2}} -> {{1, 3}, {1, 2}, {4, 
    3}}, {{1, 2}, {3, 2}} -> {{1, 3}, {1, 3}, {4, 1}}, {{1, 2}, {3, 
    2}} -> {{1, 3}, {1, 3}, {4, 2}}, {{1, 2}, {3, 2}} -> {{1, 3}, {1, 
    3}, {4, 3}}, {{1, 2}, {3, 2}} -> {{1, 3}, {2, 3}, {4, 3}}, {{1, 
    2}, {3, 2}} -> {{1, 3}, {3, 1}, {4, 1}}, {{1, 2}, {3, 2}} -> {{1, 
    3}, {3, 1}, {4, 2}}, {{1, 2}, {3, 2}} -> {{1, 3}, {3, 2}, {4, 
    2}}, {{1, 2}, {3, 2}} -> {{1, 3}, {3, 2}, {4, 3}}, {{1, 2}, {3, 
    2}} -> {{2, 1}, {1, 2}, {4, 2}}, {{1, 2}, {3, 2}} -> {{2, 1}, {1, 
    3}, {4, 1}}, {{1, 2}, {3, 2}} -> {{2, 1}, {1, 3}, {4, 3}}, {{1, 
    2}, {3, 2}} -> {{2, 1}, {2, 1}, {4, 1}}, {{1, 2}, {3, 2}} -> {{2, 
    1}, {2, 1}, {4, 2}}, {{1, 2}, {3, 2}} -> {{2, 1}, {2, 1}, {4, 
    3}}, {{1, 2}, {3, 2}} -> {{2, 1}, {2, 3}, {4, 1}}, {{1, 2}, {3, 
    2}} -> {{2, 1}, {2, 3}, {4, 2}}, {{1, 2}, {3, 2}} -> {{2, 2}, {1, 
    2}, {4, 1}}, {{1, 2}, {3, 2}} -> {{2, 2}, {1, 2}, {4, 2}}, {{1, 
    2}, {3, 2}} -> {{2, 2}, {1, 2}, {4, 3}}, {{1, 2}, {3, 2}} -> {{2, 
    2}, {1, 3}, {4, 3}}, {{1, 2}, {3, 2}} -> {{2, 2}, {2, 1}, {4, 
    1}}, {{1, 2}, {3, 2}} -> {{2, 2}, {2, 1}, {4, 2}}, {{1, 2}, {3, 
    2}} -> {{2, 2}, {2, 1}, {4, 3}}, {{1, 2}, {3, 2}} -> {{2, 2}, {2, 
    2}, {4, 1}}, {{1, 2}, {3, 2}} -> {{2, 2}, {2, 2}, {4, 2}}, {{1, 
    2}, {3, 2}} -> {{1, 1}, {1, 4}, {1, 4}}, {{1, 2}, {3, 2}} -> {{1, 
    1}, {1, 4}, {2, 1}}, {{1, 2}, {3, 2}} -> {{1, 1}, {1, 4}, {2, 
    2}}, {{1, 2}, {3, 2}} -> {{1, 1}, {1, 4}, {2, 3}}, {{1, 2}, {3, 
    2}} -> {{1, 1}, {1, 4}, {2, 4}}, {{1, 2}, {3, 2}} -> {{1, 1}, {1, 
    4}, {3, 1}}, {{1, 2}, {3, 2}} -> {{1, 1}, {1, 4}, {3, 2}}, {{1, 
    2}, {3, 2}} -> {{1, 1}, {1, 4}, {3, 3}}, {{1, 2}, {3, 2}} -> {{1, 
    1}, {1, 4}, {3, 4}}, {{1, 2}, {3, 2}} -> {{1, 1}, {1, 4}, {4, 
    1}}, {{1, 2}, {3, 2}} -> {{1, 1}, {1, 4}, {4, 2}}, {{1, 2}, {3, 
    2}} -> {{1, 1}, {1, 4}, {4, 3}}, {{1, 2}, {3, 2}} -> {{1, 1}, {1, 
    4}, {4, 4}}, {{1, 2}, {3, 2}} -> {{1, 1}, {2, 4}, {2, 4}}, {{1, 
    2}, {3, 2}} -> {{1, 1}, {2, 4}, {3, 4}}, {{1, 2}, {3, 2}} -> {{1, 
    1}, {2, 4}, {4, 2}}, {{1, 2}, {3, 2}} -> {{1, 1}, {2, 4}, {4, 
    3}}, {{1, 2}, {3, 2}} -> {{1, 1}, {3, 4}, {3, 4}}, {{1, 2}, {3, 
    2}} -> {{1, 1}, {3, 4}, {4, 2}}, {{1, 2}, {3, 2}} -> {{1, 1}, {3, 
    4}, {4, 3}}, {{1, 2}, {3, 2}} -> {{1, 2}, {1, 4}, {2, 3}}, {{1, 
    2}, {3, 2}} -> {{1, 2}, {1, 4}, {2, 4}}, {{1, 2}, {3, 2}} -> {{1, 
    2}, {1, 4}, {3, 1}}, {{1, 2}, {3, 2}} -> {{1, 2}, {1, 4}, {3, 
    2}}, {{1, 2}, {3, 2}} -> {{1, 2}, {2, 4}, {3, 2}}, {{1, 2}, {3, 
    2}} -> {{1, 2}, {2, 4}, {3, 4}}, {{1, 2}, {3, 2}} -> {{1, 2}, {2, 
    4}, {4, 1}}, {{1, 2}, {3, 2}} -> {{1, 2}, {2, 4}, {4, 3}}, {{1, 
    2}, {3, 2}} -> {{1, 3}, {1, 4}, {2, 1}}, {{1, 2}, {3, 2}} -> {{1, 
    3}, {1, 4}, {2, 3}}, {{1, 2}, {3, 2}} -> {{1, 3}, {1, 4}, {3, 
    2}}, {{1, 2}, {3, 2}} -> {{1, 3}, {1, 4}, {3, 4}}, {{1, 2}, {3, 
    2}} -> {{1, 3}, {3, 4}, {2, 3}}, {{1, 2}, {3, 2}} -> {{1, 3}, {3, 
    4}, {2, 4}}, {{1, 2}, {3, 2}} -> {{1, 3}, {3, 4}, {4, 1}}, {{1, 
    2}, {3, 2}} -> {{1, 3}, {3, 4}, {4, 2}}, {{1, 2}, {3, 2}} -> {{2, 
    1}, {1, 4}, {3, 4}}, {{1, 2}, {3, 2}} -> {{2, 1}, {1, 4}, {4, 
    3}}, {{1, 2}, {3, 2}} -> {{2, 1}, {2, 4}, {1, 3}}, {{1, 2}, {3, 
    2}} -> {{2, 1}, {2, 4}, {1, 4}}, {{1, 2}, {3, 2}} -> {{2, 1}, {2, 
    4}, {3, 1}}, {{1, 2}, {3, 2}} -> {{2, 1}, {2, 4}, {3, 2}}, {{1, 
    2}, {3, 2}} -> {{2, 2}, {1, 4}, {1, 4}}, {{1, 2}, {3, 2}} -> {{2, 
    2}, {1, 4}, {3, 4}}, {{1, 2}, {3, 2}} -> {{2, 2}, {1, 4}, {4, 
    1}}, {{1, 2}, {3, 2}} -> {{2, 2}, {1, 4}, {4, 3}}, {{1, 2}, {3, 
    2}} -> {{2, 2}, {2, 4}, {1, 1}}, {{1, 2}, {3, 2}} -> {{2, 2}, {2, 
    4}, {1, 2}}, {{1, 2}, {3, 2}} -> {{2, 2}, {2, 4}, {1, 3}}, {{1, 
    2}, {3, 2}} -> {{2, 2}, {2, 4}, {1, 4}}, {{1, 2}, {3, 2}} -> {{2, 
    2}, {2, 4}, {2, 4}}, {{1, 2}, {3, 2}} -> {{2, 2}, {2, 4}, {4, 
    1}}, {{1, 2}, {3, 2}} -> {{2, 2}, {2, 4}, {4, 2}}, {{1, 2}, {3, 
    2}} -> {{2, 2}, {2, 4}, {4, 4}}, {{1, 2}, {3, 2}} -> {{1, 1}, {1, 
    4}, {1, 5}}, {{1, 2}, {3, 2}} -> {{1, 1}, {1, 4}, {2, 5}}, {{1, 
    2}, {3, 2}} -> {{1, 1}, {1, 4}, {3, 5}}, {{1, 2}, {3, 2}} -> {{1, 
    1}, {1, 4}, {4, 5}}, {{1, 2}, {3, 2}} -> {{1, 1}, {2, 4}, {2, 
    5}}, {{1, 2}, {3, 2}} -> {{1, 1}, {2, 4}, {3, 5}}, {{1, 2}, {3, 
    2}} -> {{1, 1}, {2, 4}, {4, 5}}, {{1, 2}, {3, 2}} -> {{1, 1}, {3, 
    4}, {3, 5}}, {{1, 2}, {3, 2}} -> {{1, 1}, {3, 4}, {4, 5}}, {{1, 
    2}, {3, 2}} -> {{1, 2}, {1, 4}, {1, 5}}, {{1, 2}, {3, 2}} -> {{1, 
    2}, {1, 4}, {2, 5}}, {{1, 2}, {3, 2}} -> {{1, 2}, {1, 4}, {3, 
    5}}, {{1, 2}, {3, 2}} -> {{1, 2}, {2, 4}, {3, 5}}, {{1, 2}, {3, 
    2}} -> {{1, 2}, {2, 4}, {4, 5}}, {{1, 2}, {3, 2}} -> {{1, 3}, {1, 
    4}, {1, 5}}, {{1, 2}, {3, 2}} -> {{1, 3}, {1, 4}, {2, 5}}, {{1, 
    2}, {3, 2}} -> {{1, 3}, {1, 4}, {3, 5}}, {{1, 2}, {3, 2}} -> {{1, 
    3}, {3, 4}, {2, 5}}, {{1, 2}, {3, 2}} -> {{1, 3}, {3, 4}, {4, 
    5}}, {{1, 2}, {3, 2}} -> {{2, 1}, {1, 4}, {3, 5}}, {{1, 2}, {3, 
    2}} -> {{2, 1}, {1, 4}, {4, 5}}, {{1, 2}, {3, 2}} -> {{2, 1}, {2, 
    4}, {1, 5}}, {{1, 2}, {3, 2}} -> {{2, 1}, {2, 4}, {2, 5}}, {{1, 
    2}, {3, 2}} -> {{2, 1}, {2, 4}, {3, 5}}, {{1, 2}, {3, 2}} -> {{2, 
    2}, {1, 4}, {1, 5}}, {{1, 2}, {3, 2}} -> {{2, 2}, {1, 4}, {3, 
    5}}, {{1, 2}, {3, 2}} -> {{2, 2}, {1, 4}, {4, 5}}, {{1, 2}, {3, 
    2}} -> {{2, 2}, {2, 4}, {1, 5}}, {{1, 2}, {3, 2}} -> {{2, 2}, {2, 
    4}, {2, 5}}, {{1, 2}, {3, 2}} -> {{2, 2}, {2, 4}, {4, 5}}, {{1, 
    2}, {3, 2}} -> {{1, 1}, {1, 4}, {5, 1}}, {{1, 2}, {3, 2}} -> {{1, 
    1}, {1, 4}, {5, 2}}, {{1, 2}, {3, 2}} -> {{1, 1}, {1, 4}, {5, 
    3}}, {{1, 2}, {3, 2}} -> {{1, 1}, {1, 4}, {5, 4}}, {{1, 2}, {3, 
    2}} -> {{1, 1}, {2, 4}, {5, 3}}, {{1, 2}, {3, 2}} -> {{1, 1}, {2, 
    4}, {5, 4}}, {{1, 2}, {3, 2}} -> {{1, 1}, {3, 4}, {5, 2}}, {{1, 
    2}, {3, 2}} -> {{1, 1}, {3, 4}, {5, 4}}, {{1, 2}, {3, 2}} -> {{1, 
    2}, {1, 4}, {5, 1}}, {{1, 2}, {3, 2}} -> {{1, 2}, {1, 4}, {5, 
    2}}, {{1, 2}, {3, 2}} -> {{1, 2}, {1, 4}, {5, 3}}, {{1, 2}, {3, 
    2}} -> {{1, 2}, {2, 4}, {5, 2}}, {{1, 2}, {3, 2}} -> {{1, 2}, {2, 
    4}, {5, 3}}, {{1, 2}, {3, 2}} -> {{1, 2}, {2, 4}, {5, 4}}, {{1, 
    2}, {3, 2}} -> {{1, 3}, {1, 4}, {5, 1}}, {{1, 2}, {3, 2}} -> {{1, 
    3}, {1, 4}, {5, 2}}, {{1, 2}, {3, 2}} -> {{1, 3}, {1, 4}, {5, 
    3}}, {{1, 2}, {3, 2}} -> {{1, 3}, {3, 4}, {5, 2}}, {{1, 2}, {3, 
    2}} -> {{1, 3}, {3, 4}, {5, 3}}, {{1, 2}, {3, 2}} -> {{1, 3}, {3, 
    4}, {5, 4}}, {{1, 2}, {3, 2}} -> {{2, 1}, {1, 4}, {5, 1}}, {{1, 
    2}, {3, 2}} -> {{2, 1}, {1, 4}, {5, 3}}, {{1, 2}, {3, 2}} -> {{2, 
    1}, {1, 4}, {5, 4}}, {{1, 2}, {3, 2}} -> {{2, 1}, {2, 4}, {5, 
    1}}, {{1, 2}, {3, 2}} -> {{2, 1}, {2, 4}, {5, 2}}, {{1, 2}, {3, 
    2}} -> {{2, 1}, {2, 4}, {5, 3}}, {{1, 2}, {3, 2}} -> {{2, 2}, {1, 
    4}, {5, 3}}, {{1, 2}, {3, 2}} -> {{2, 2}, {1, 4}, {5, 4}}, {{1, 
    2}, {3, 2}} -> {{2, 2}, {2, 4}, {5, 1}}, {{1, 2}, {3, 2}} -> {{2, 
    2}, {2, 4}, {5, 2}}, {{1, 2}, {3, 2}} -> {{2, 2}, {2, 4}, {5, 
    4}}, {{1, 2}, {3, 2}} -> {{1, 1}, {4, 1}, {2, 2}}, {{1, 2}, {3, 
    2}} -> {{1, 1}, {4, 1}, {2, 3}}, {{1, 2}, {3, 2}} -> {{1, 1}, {4, 
    1}, {2, 4}}, {{1, 2}, {3, 2}} -> {{1, 1}, {4, 1}, {3, 2}}, {{1, 
    2}, {3, 2}} -> {{1, 1}, {4, 1}, {3, 3}}, {{1, 2}, {3, 2}} -> {{1, 
    1}, {4, 1}, {3, 4}}, {{1, 2}, {3, 2}} -> {{1, 1}, {4, 1}, {4, 
    1}}, {{1, 2}, {3, 2}} -> {{1, 1}, {4, 1}, {4, 2}}, {{1, 2}, {3, 
    2}} -> {{1, 1}, {4, 1}, {4, 3}}, {{1, 2}, {3, 2}} -> {{1, 1}, {4, 
    2}, {2, 3}}, {{1, 2}, {3, 2}} -> {{1, 1}, {4, 2}, {4, 2}}, {{1, 
    2}, {3, 2}} -> {{1, 1}, {4, 2}, {4, 3}}, {{1, 2}, {3, 2}} -> {{1, 
    1}, {4, 3}, {3, 2}}, {{1, 2}, {3, 2}} -> {{1, 1}, {4, 3}, {4, 
    3}}, {{1, 2}, {3, 2}} -> {{2, 2}, {4, 1}, {1, 3}}, {{1, 2}, {3, 
    2}} -> {{2, 2}, {4, 1}, {4, 1}}, {{1, 2}, {3, 2}} -> {{2, 2}, {4, 
    1}, {4, 3}}, {{1, 2}, {3, 2}} -> {{2, 2}, {4, 2}, {1, 1}}, {{1, 
    2}, {3, 2}} -> {{2, 2}, {4, 2}, {1, 3}}, {{1, 2}, {3, 2}} -> {{2, 
    2}, {4, 2}, {1, 4}}, {{1, 2}, {3, 2}} -> {{2, 2}, {4, 2}, {4, 
    1}}, {{1, 2}, {3, 2}} -> {{2, 2}, {4, 2}, {4, 2}}, {{1, 2}, {3, 
    2}} -> {{1, 1}, {4, 1}, {2, 5}}, {{1, 2}, {3, 2}} -> {{1, 1}, {4, 
    1}, {3, 5}}, {{1, 2}, {3, 2}} -> {{1, 1}, {4, 1}, {4, 5}}, {{1, 
    2}, {3, 2}} -> {{1, 1}, {4, 2}, {2, 5}}, {{1, 2}, {3, 2}} -> {{1, 
    1}, {4, 2}, {4, 5}}, {{1, 2}, {3, 2}} -> {{1, 1}, {4, 3}, {3, 
    5}}, {{1, 2}, {3, 2}} -> {{1, 1}, {4, 3}, {4, 5}}, {{1, 2}, {3, 
    2}} -> {{1, 2}, {4, 2}, {3, 5}}, {{1, 2}, {3, 2}} -> {{1, 3}, {4, 
    3}, {2, 5}}, {{1, 2}, {3, 2}} -> {{2, 1}, {4, 1}, {3, 5}}, {{1, 
    2}, {3, 2}} -> {{2, 2}, {4, 1}, {1, 5}}, {{1, 2}, {3, 2}} -> {{2, 
    2}, {4, 1}, {4, 5}}, {{1, 2}, {3, 2}} -> {{2, 2}, {4, 2}, {1, 
    5}}, {{1, 2}, {3, 2}} -> {{2, 2}, {4, 2}, {4, 5}}, {{1, 2}, {3, 
    2}} -> {{1, 1}, {4, 1}, {5, 1}}, {{1, 2}, {3, 2}} -> {{1, 1}, {4, 
    1}, {5, 2}}, {{1, 2}, {3, 2}} -> {{1, 1}, {4, 1}, {5, 3}}, {{1, 
    2}, {3, 2}} -> {{1, 1}, {4, 1}, {5, 4}}, {{1, 2}, {3, 2}} -> {{1, 
    1}, {4, 2}, {5, 2}}, {{1, 2}, {3, 2}} -> {{1, 1}, {4, 2}, {5, 
    3}}, {{1, 2}, {3, 2}} -> {{1, 1}, {4, 3}, {5, 3}}, {{1, 2}, {3, 
    2}} -> {{1, 2}, {4, 2}, {5, 2}}, {{1, 2}, {3, 2}} -> {{1, 2}, {4, 
    2}, {5, 3}}, {{1, 2}, {3, 2}} -> {{1, 3}, {4, 3}, {5, 2}}, {{1, 
    2}, {3, 2}} -> {{1, 3}, {4, 3}, {5, 3}}, {{1, 2}, {3, 2}} -> {{2, 
    1}, {4, 1}, {5, 1}}, {{1, 2}, {3, 2}} -> {{2, 1}, {4, 1}, {5, 
    3}}, {{1, 2}, {3, 2}} -> {{2, 2}, {4, 1}, {5, 1}}, {{1, 2}, {3, 
    2}} -> {{2, 2}, {4, 1}, {5, 3}}, {{1, 2}, {3, 2}} -> {{2, 2}, {4, 
    2}, {5, 1}}, {{1, 2}, {3, 2}} -> {{2, 2}, {4, 2}, {5, 2}}, {{1, 
    2}, {3, 2}} -> {{2, 2}, {4, 2}, {5, 4}}, {{1, 2}, {3, 2}} -> {{1, 
    1}, {4, 5}, {5, 2}}, {{1, 2}, {3, 2}} -> {{1, 1}, {4, 5}, {5, 
    3}}, {{1, 2}, {3, 2}} -> {{2, 2}, {4, 5}, {5, 1}}, {{1, 2}, {3, 
    2}} -> {{1, 4}, {1, 2}, {3, 4}}, {{1, 2}, {3, 2}} -> {{1, 4}, {1, 
    2}, {4, 2}}, {{1, 2}, {3, 2}} -> {{1, 4}, {1, 2}, {4, 3}}, {{1, 
    2}, {3, 2}} -> {{1, 4}, {1, 3}, {2, 4}}, {{1, 2}, {3, 2}} -> {{1, 
    4}, {1, 3}, {4, 2}}, {{1, 2}, {3, 2}} -> {{1, 4}, {1, 3}, {4, 
    3}}, {{1, 2}, {3, 2}} -> {{1, 4}, {1, 4}, {1, 2}}, {{1, 2}, {3, 
    2}} -> {{1, 4}, {1, 4}, {1, 3}}, {{1, 2}, {3, 2}} -> {{1, 4}, {1, 
    4}, {1, 4}}, {{1, 2}, {3, 2}} -> {{1, 4}, {1, 4}, {2, 1}}, {{1, 
    2}, {3, 2}} -> {{1, 4}, {1, 4}, {2, 3}}, {{1, 2}, {3, 2}} -> {{1, 
    4}, {1, 4}, {2, 4}}, {{1, 2}, {3, 2}} -> {{1, 4}, {1, 4}, {3, 
    1}}, {{1, 2}, {3, 2}} -> {{1, 4}, {1, 4}, {3, 2}}, {{1, 2}, {3, 
    2}} -> {{1, 4}, {1, 4}, {3, 4}}, {{1, 2}, {3, 2}} -> {{1, 4}, {1, 
    4}, {4, 1}}, {{1, 2}, {3, 2}} -> {{1, 4}, {1, 4}, {4, 2}}, {{1, 
    2}, {3, 2}} -> {{1, 4}, {1, 4}, {4, 3}}, {{1, 2}, {3, 2}} -> {{1, 
    4}, {2, 4}, {3, 4}}, {{1, 2}, {3, 2}} -> {{1, 4}, {4, 1}, {1, 
    2}}, {{1, 2}, {3, 2}} -> {{1, 4}, {4, 1}, {1, 3}}, {{1, 2}, {3, 
    2}} -> {{1, 4}, {4, 1}, {2, 1}}, {{1, 2}, {3, 2}} -> {{1, 4}, {4, 
    1}, {2, 3}}, {{1, 2}, {3, 2}} -> {{1, 4}, {4, 1}, {3, 1}}, {{1, 
    2}, {3, 2}} -> {{1, 4}, {4, 1}, {3, 2}}, {{1, 2}, {3, 2}} -> {{1, 
    4}, {4, 2}, {2, 1}}, {{1, 2}, {3, 2}} -> {{1, 4}, {4, 2}, {2, 
    3}}, {{1, 2}, {3, 2}} -> {{1, 4}, {4, 2}, {3, 2}}, {{1, 2}, {3, 
    2}} -> {{1, 4}, {4, 2}, {3, 4}}, {{1, 2}, {3, 2}} -> {{1, 4}, {4, 
    3}, {2, 3}}, {{1, 2}, {3, 2}} -> {{1, 4}, {4, 3}, {2, 4}}, {{1, 
    2}, {3, 2}} -> {{1, 4}, {4, 3}, {3, 2}}, {{1, 2}, {3, 2}} -> {{2, 
    4}, {2, 1}, {3, 4}}, {{1, 2}, {3, 2}} -> {{2, 4}, {2, 1}, {4, 
    1}}, {{1, 2}, {3, 2}} -> {{2, 4}, {2, 1}, {4, 3}}, {{1, 2}, {3, 
    2}} -> {{2, 4}, {2, 4}, {1, 2}}, {{1, 2}, {3, 2}} -> {{2, 4}, {2, 
    4}, {1, 3}}, {{1, 2}, {3, 2}} -> {{2, 4}, {2, 4}, {1, 4}}, {{1, 
    2}, {3, 2}} -> {{2, 4}, {2, 4}, {2, 1}}, {{1, 2}, {3, 2}} -> {{2, 
    4}, {2, 4}, {2, 4}}, {{1, 2}, {3, 2}} -> {{2, 4}, {2, 4}, {4, 
    1}}, {{1, 2}, {3, 2}} -> {{2, 4}, {2, 4}, {4, 2}}, {{1, 2}, {3, 
    2}} -> {{2, 4}, {4, 1}, {1, 3}}, {{1, 2}, {3, 2}} -> {{2, 4}, {4, 
    1}, {3, 1}}, {{1, 2}, {3, 2}} -> {{2, 4}, {4, 2}, {1, 2}}, {{1, 
    2}, {3, 2}} -> {{2, 4}, {4, 2}, {1, 3}}, {{1, 2}, {3, 2}} -> {{2, 
    4}, {4, 2}, {2, 1}}, {{1, 2}, {3, 2}} -> {{1, 4}, {1, 2}, {4, 
    5}}, {{1, 2}, {3, 2}} -> {{1, 4}, {1, 3}, {4, 5}}, {{1, 2}, {3, 
    2}} -> {{1, 4}, {1, 4}, {1, 5}}, {{1, 2}, {3, 2}} -> {{1, 4}, {1, 
    4}, {2, 5}}, {{1, 2}, {3, 2}} -> {{1, 4}, {1, 4}, {3, 5}}, {{1, 
    2}, {3, 2}} -> {{1, 4}, {1, 4}, {4, 5}}, {{1, 2}, {3, 2}} -> {{1, 
    4}, {2, 4}, {3, 5}}, {{1, 2}, {3, 2}} -> {{1, 4}, {3, 4}, {2, 
    5}}, {{1, 2}, {3, 2}} -> {{1, 4}, {4, 1}, {1, 5}}, {{1, 2}, {3, 
    2}} -> {{1, 4}, {4, 1}, {2, 5}}, {{1, 2}, {3, 2}} -> {{1, 4}, {4, 
    1}, {3, 5}}, {{1, 2}, {3, 2}} -> {{1, 4}, {4, 2}, {2, 5}}, {{1, 
    2}, {3, 2}} -> {{1, 4}, {4, 2}, {3, 5}}, {{1, 2}, {3, 2}} -> {{1, 
    4}, {4, 3}, {2, 5}}, {{1, 2}, {3, 2}} -> {{1, 4}, {4, 3}, {3, 
    5}}, {{1, 2}, {3, 2}} -> {{2, 4}, {2, 1}, {4, 5}}, {{1, 2}, {3, 
    2}} -> {{2, 4}, {2, 4}, {1, 5}}, {{1, 2}, {3, 2}} -> {{2, 4}, {2, 
    4}, {2, 5}}, {{1, 2}, {3, 2}} -> {{2, 4}, {2, 4}, {4, 5}}, {{1, 
    2}, {3, 2}} -> {{2, 4}, {4, 1}, {1, 5}}, {{1, 2}, {3, 2}} -> {{2, 
    4}, {4, 1}, {3, 5}}, {{1, 2}, {3, 2}} -> {{2, 4}, {4, 2}, {1, 
    5}}, {{1, 2}, {3, 2}} -> {{2, 4}, {4, 2}, {2, 5}}, {{1, 2}, {3, 
    2}} -> {{1, 4}, {1, 2}, {5, 4}}, {{1, 2}, {3, 2}} -> {{1, 4}, {1, 
    3}, {5, 4}}, {{1, 2}, {3, 2}} -> {{1, 4}, {1, 4}, {5, 1}}, {{1, 
    2}, {3, 2}} -> {{1, 4}, {1, 4}, {5, 2}}, {{1, 2}, {3, 2}} -> {{1, 
    4}, {1, 4}, {5, 3}}, {{1, 2}, {3, 2}} -> {{1, 4}, {1, 4}, {5, 
    4}}, {{1, 2}, {3, 2}} -> {{1, 4}, {2, 4}, {5, 3}}, {{1, 2}, {3, 
    2}} -> {{1, 4}, {2, 4}, {5, 4}}, {{1, 2}, {3, 2}} -> {{1, 4}, {3, 
    4}, {5, 2}}, {{1, 2}, {3, 2}} -> {{1, 4}, {3, 4}, {5, 4}}, {{1, 
    2}, {3, 2}} -> {{1, 4}, {4, 1}, {5, 1}}, {{1, 2}, {3, 2}} -> {{1, 
    4}, {4, 1}, {5, 2}}, {{1, 2}, {3, 2}} -> {{1, 4}, {4, 1}, {5, 
    3}}, {{1, 2}, {3, 2}} -> {{1, 4}, {4, 2}, {5, 2}}, {{1, 2}, {3, 
    2}} -> {{1, 4}, {4, 2}, {5, 3}}, {{1, 2}, {3, 2}} -> {{1, 4}, {4, 
    2}, {5, 4}}, {{1, 2}, {3, 2}} -> {{1, 4}, {4, 3}, {5, 2}}, {{1, 
    2}, {3, 2}} -> {{1, 4}, {4, 3}, {5, 3}}, {{1, 2}, {3, 2}} -> {{1, 
    4}, {4, 3}, {5, 4}}, {{1, 2}, {3, 2}} -> {{2, 4}, {2, 1}, {5, 
    4}}, {{1, 2}, {3, 2}} -> {{2, 4}, {2, 4}, {5, 1}}, {{1, 2}, {3, 
    2}} -> {{2, 4}, {2, 4}, {5, 2}}, {{1, 2}, {3, 2}} -> {{2, 4}, {2, 
    4}, {5, 4}}, {{1, 2}, {3, 2}} -> {{2, 4}, {4, 1}, {5, 1}}, {{1, 
    2}, {3, 2}} -> {{2, 4}, {4, 1}, {5, 3}}, {{1, 2}, {3, 2}} -> {{2, 
    4}, {4, 1}, {5, 4}}, {{1, 2}, {3, 2}} -> {{2, 4}, {4, 2}, {5, 
    1}}, {{1, 2}, {3, 2}} -> {{2, 4}, {4, 2}, {5, 2}}, {{1, 2}, {3, 
    2}} -> {{1, 4}, {1, 5}, {2, 1}}, {{1, 2}, {3, 2}} -> {{1, 4}, {1, 
    5}, {2, 3}}, {{1, 2}, {3, 2}} -> {{1, 4}, {1, 5}, {2, 4}}, {{1, 
    2}, {3, 2}} -> {{1, 4}, {1, 5}, {3, 1}}, {{1, 2}, {3, 2}} -> {{1, 
    4}, {1, 5}, {3, 2}}, {{1, 2}, {3, 2}} -> {{1, 4}, {1, 5}, {3, 
    4}}, {{1, 2}, {3, 2}} -> {{1, 4}, {1, 5}, {4, 2}}, {{1, 2}, {3, 
    2}} -> {{1, 4}, {1, 5}, {4, 3}}, {{1, 2}, {3, 2}} -> {{1, 4}, {1, 
    5}, {4, 5}}, {{1, 2}, {3, 2}} -> {{1, 4}, {4, 5}, {2, 3}}, {{1, 
    2}, {3, 2}} -> {{1, 4}, {4, 5}, {2, 4}}, {{1, 2}, {3, 2}} -> {{1, 
    4}, {4, 5}, {2, 5}}, {{1, 2}, {3, 2}} -> {{1, 4}, {4, 5}, {3, 
    2}}, {{1, 2}, {3, 2}} -> {{1, 4}, {4, 5}, {3, 4}}, {{1, 2}, {3, 
    2}} -> {{1, 4}, {4, 5}, {3, 5}}, {{1, 2}, {3, 2}} -> {{1, 4}, {4, 
    5}, {5, 1}}, {{1, 2}, {3, 2}} -> {{1, 4}, {4, 5}, {5, 2}}, {{1, 
    2}, {3, 2}} -> {{1, 4}, {4, 5}, {5, 3}}, {{1, 2}, {3, 2}} -> {{2, 
    4}, {2, 5}, {1, 2}}, {{1, 2}, {3, 2}} -> {{2, 4}, {2, 5}, {1, 
    3}}, {{1, 2}, {3, 2}} -> {{2, 4}, {2, 5}, {1, 4}}, {{1, 2}, {3, 
    2}} -> {{2, 4}, {2, 5}, {4, 1}}, {{1, 2}, {3, 2}} -> {{2, 4}, {2, 
    5}, {4, 5}}, {{1, 2}, {3, 2}} -> {{2, 4}, {4, 5}, {1, 3}}, {{1, 
    2}, {3, 2}} -> {{2, 4}, {4, 5}, {1, 5}}, {{1, 2}, {3, 2}} -> {{2, 
    4}, {4, 5}, {5, 1}}, {{1, 2}, {3, 2}} -> {{2, 4}, {4, 5}, {5, 
    2}}, {{1, 2}, {3, 2}} -> {{1, 4}, {1, 5}, {1, 6}}, {{1, 2}, {3, 
    2}} -> {{1, 4}, {1, 5}, {2, 6}}, {{1, 2}, {3, 2}} -> {{1, 4}, {1, 
    5}, {3, 6}}, {{1, 2}, {3, 2}} -> {{1, 4}, {1, 5}, {4, 6}}, {{1, 
    2}, {3, 2}} -> {{1, 4}, {2, 5}, {3, 6}}, {{1, 2}, {3, 2}} -> {{1, 
    4}, {4, 5}, {2, 6}}, {{1, 2}, {3, 2}} -> {{1, 4}, {4, 5}, {3, 
    6}}, {{1, 2}, {3, 2}} -> {{1, 4}, {4, 5}, {5, 6}}, {{1, 2}, {3, 
    2}} -> {{2, 4}, {2, 5}, {1, 6}}, {{1, 2}, {3, 2}} -> {{2, 4}, {2, 
    5}, {2, 6}}, {{1, 2}, {3, 2}} -> {{2, 4}, {2, 5}, {4, 6}}, {{1, 
    2}, {3, 2}} -> {{2, 4}, {4, 5}, {1, 6}}, {{1, 2}, {3, 2}} -> {{2, 
    4}, {4, 5}, {5, 6}}, {{1, 2}, {3, 2}} -> {{1, 4}, {1, 5}, {6, 
    1}}, {{1, 2}, {3, 2}} -> {{1, 4}, {1, 5}, {6, 2}}, {{1, 2}, {3, 
    2}} -> {{1, 4}, {1, 5}, {6, 3}}, {{1, 2}, {3, 2}} -> {{1, 4}, {1, 
    5}, {6, 4}}, {{1, 2}, {3, 2}} -> {{1, 4}, {2, 5}, {6, 3}}, {{1, 
    2}, {3, 2}} -> {{1, 4}, {3, 5}, {6, 2}}, {{1, 2}, {3, 2}} -> {{1, 
    4}, {4, 5}, {6, 2}}, {{1, 2}, {3, 2}} -> {{1, 4}, {4, 5}, {6, 
    3}}, {{1, 2}, {3, 2}} -> {{1, 4}, {4, 5}, {6, 4}}, {{1, 2}, {3, 
    2}} -> {{1, 4}, {4, 5}, {6, 5}}, {{1, 2}, {3, 2}} -> {{2, 4}, {2, 
    5}, {6, 1}}, {{1, 2}, {3, 2}} -> {{2, 4}, {2, 5}, {6, 2}}, {{1, 
    2}, {3, 2}} -> {{2, 4}, {2, 5}, {6, 4}}, {{1, 2}, {3, 2}} -> {{2, 
    4}, {4, 5}, {6, 1}}, {{1, 2}, {3, 2}} -> {{2, 4}, {4, 5}, {6, 
    4}}, {{1, 2}, {3, 2}} -> {{2, 4}, {4, 5}, {6, 5}}, {{1, 2}, {3, 
    2}} -> {{1, 4}, {5, 4}, {2, 3}}, {{1, 2}, {3, 2}} -> {{1, 4}, {5, 
    4}, {3, 2}}, {{1, 2}, {3, 2}} -> {{2, 4}, {5, 4}, {1, 3}}, {{1, 
    2}, {3, 2}} -> {{1, 4}, {5, 4}, {2, 6}}, {{1, 2}, {3, 2}} -> {{1, 
    4}, {5, 4}, {3, 6}}, {{1, 2}, {3, 2}} -> {{2, 4}, {5, 4}, {1, 
    6}}, {{1, 2}, {3, 2}} -> {{1, 4}, {5, 2}, {6, 3}}, {{1, 2}, {3, 
    2}} -> {{1, 4}, {5, 4}, {6, 2}}, {{1, 2}, {3, 2}} -> {{1, 4}, {5, 
    4}, {6, 3}}, {{1, 2}, {3, 2}} -> {{1, 4}, {5, 4}, {6, 4}}, {{1, 
    2}, {3, 2}} -> {{2, 4}, {5, 1}, {6, 3}}, {{1, 2}, {3, 2}} -> {{2, 
    4}, {5, 4}, {6, 1}}, {{1, 2}, {3, 2}} -> {{2, 4}, {5, 4}, {6, 
    4}}, {{1, 2}, {3, 2}} -> {{4, 1}, {1, 2}, {2, 3}}, {{1, 2}, {3, 
    2}} -> {{4, 1}, {1, 2}, {3, 2}}, {{1, 2}, {3, 2}} -> {{4, 1}, {1, 
    3}, {2, 3}}, {{1, 2}, {3, 2}} -> {{4, 1}, {1, 3}, {3, 2}}, {{1, 
    2}, {3, 2}} -> {{4, 1}, {1, 4}, {2, 4}}, {{1, 2}, {3, 2}} -> {{4, 
    1}, {1, 4}, {3, 4}}, {{1, 2}, {3, 2}} -> {{4, 1}, {1, 4}, {4, 
    2}}, {{1, 2}, {3, 2}} -> {{4, 1}, {1, 4}, {4, 3}}, {{1, 2}, {3, 
    2}} -> {{4, 1}, {4, 1}, {1, 2}}, {{1, 2}, {3, 2}} -> {{4, 1}, {4, 
    1}, {1, 3}}, {{1, 2}, {3, 2}} -> {{4, 1}, {4, 1}, {1, 4}}, {{1, 
    2}, {3, 2}} -> {{4, 1}, {4, 1}, {2, 1}}, {{1, 2}, {3, 2}} -> {{4, 
    1}, {4, 1}, {2, 3}}, {{1, 2}, {3, 2}} -> {{4, 1}, {4, 1}, {2, 
    4}}, {{1, 2}, {3, 2}} -> {{4, 1}, {4, 1}, {3, 1}}, {{1, 2}, {3, 
    2}} -> {{4, 1}, {4, 1}, {3, 2}}, {{1, 2}, {3, 2}} -> {{4, 1}, {4, 
    1}, {3, 4}}, {{1, 2}, {3, 2}} -> {{4, 1}, {4, 1}, {4, 1}}, {{1, 
    2}, {3, 2}} -> {{4, 1}, {4, 1}, {4, 2}}, {{1, 2}, {3, 2}} -> {{4, 
    1}, {4, 1}, {4, 3}}, {{1, 2}, {3, 2}} -> {{4, 1}, {4, 2}, {1, 
    2}}, {{1, 2}, {3, 2}} -> {{4, 1}, {4, 2}, {1, 3}}, {{1, 2}, {3, 
    2}} -> {{4, 1}, {4, 2}, {3, 1}}, {{1, 2}, {3, 2}} -> {{4, 1}, {4, 
    2}, {3, 4}}, {{1, 2}, {3, 2}} -> {{4, 1}, {4, 2}, {4, 3}}, {{1, 
    2}, {3, 2}} -> {{4, 1}, {4, 3}, {1, 2}}, {{1, 2}, {3, 2}} -> {{4, 
    1}, {4, 3}, {1, 3}}, {{1, 2}, {3, 2}} -> {{4, 1}, {4, 3}, {2, 
    1}}, {{1, 2}, {3, 2}} -> {{4, 1}, {4, 3}, {2, 4}}, {{1, 2}, {3, 
    2}} -> {{4, 2}, {2, 1}, {1, 3}}, {{1, 2}, {3, 2}} -> {{4, 2}, {2, 
    1}, {3, 1}}, {{1, 2}, {3, 2}} -> {{4, 2}, {2, 4}, {1, 4}}, {{1, 
    2}, {3, 2}} -> {{4, 2}, {2, 4}, {4, 1}}, {{1, 2}, {3, 2}} -> {{4, 
    2}, {4, 1}, {2, 1}}, {{1, 2}, {3, 2}} -> {{4, 2}, {4, 1}, {2, 
    3}}, {{1, 2}, {3, 2}} -> {{4, 2}, {4, 1}, {3, 2}}, {{1, 2}, {3, 
    2}} -> {{4, 2}, {4, 2}, {1, 2}}, {{1, 2}, {3, 2}} -> {{4, 2}, {4, 
    2}, {1, 3}}, {{1, 2}, {3, 2}} -> {{4, 2}, {4, 2}, {1, 4}}, {{1, 
    2}, {3, 2}} -> {{4, 2}, {4, 2}, {2, 1}}, {{1, 2}, {3, 2}} -> {{4, 
    2}, {4, 2}, {2, 4}}, {{1, 2}, {3, 2}} -> {{4, 2}, {4, 2}, {4, 
    1}}, {{1, 2}, {3, 2}} -> {{4, 2}, {4, 2}, {4, 2}}, {{1, 2}, {3, 
    2}} -> {{4, 4}, {1, 4}, {1, 2}}, {{1, 2}, {3, 2}} -> {{4, 4}, {1, 
    4}, {1, 3}}, {{1, 2}, {3, 2}} -> {{4, 4}, {1, 4}, {1, 4}}, {{1, 
    2}, {3, 2}} -> {{4, 4}, {1, 4}, {2, 1}}, {{1, 2}, {3, 2}} -> {{4, 
    4}, {1, 4}, {2, 2}}, {{1, 2}, {3, 2}} -> {{4, 4}, {1, 4}, {2, 
    3}}, {{1, 2}, {3, 2}} -> {{4, 4}, {1, 4}, {2, 4}}, {{1, 2}, {3, 
    2}} -> {{4, 4}, {1, 4}, {3, 1}}, {{1, 2}, {3, 2}} -> {{4, 4}, {1, 
    4}, {3, 2}}, {{1, 2}, {3, 2}} -> {{4, 4}, {1, 4}, {3, 3}}, {{1, 
    2}, {3, 2}} -> {{4, 4}, {1, 4}, {3, 4}}, {{1, 2}, {3, 2}} -> {{4, 
    4}, {2, 4}, {1, 1}}, {{1, 2}, {3, 2}} -> {{4, 4}, {2, 4}, {1, 
    2}}, {{1, 2}, {3, 2}} -> {{4, 4}, {2, 4}, {1, 3}}, {{1, 2}, {3, 
    2}} -> {{4, 4}, {2, 4}, {2, 1}}, {{1, 2}, {3, 2}} -> {{4, 4}, {2, 
    4}, {2, 4}}, {{1, 2}, {3, 2}} -> {{4, 4}, {4, 1}, {1, 1}}, {{1, 
    2}, {3, 2}} -> {{4, 4}, {4, 1}, {1, 2}}, {{1, 2}, {3, 2}} -> {{4, 
    4}, {4, 1}, {1, 3}}, {{1, 2}, {3, 2}} -> {{4, 4}, {4, 1}, {1, 
    4}}, {{1, 2}, {3, 2}} -> {{4, 4}, {4, 1}, {2, 1}}, {{1, 2}, {3, 
    2}} -> {{4, 4}, {4, 1}, {2, 2}}, {{1, 2}, {3, 2}} -> {{4, 4}, {4, 
    1}, {2, 3}}, {{1, 2}, {3, 2}} -> {{4, 4}, {4, 1}, {2, 4}}, {{1, 
    2}, {3, 2}} -> {{4, 4}, {4, 1}, {3, 1}}, {{1, 2}, {3, 2}} -> {{4, 
    4}, {4, 1}, {3, 2}}, {{1, 2}, {3, 2}} -> {{4, 4}, {4, 1}, {3, 
    3}}, {{1, 2}, {3, 2}} -> {{4, 4}, {4, 1}, {3, 4}}, {{1, 2}, {3, 
    2}} -> {{4, 4}, {4, 1}, {4, 1}}, {{1, 2}, {3, 2}} -> {{4, 4}, {4, 
    1}, {4, 2}}, {{1, 2}, {3, 2}} -> {{4, 4}, {4, 1}, {4, 3}}, {{1, 
    2}, {3, 2}} -> {{4, 4}, {4, 2}, {1, 1}}, {{1, 2}, {3, 2}} -> {{4, 
    4}, {4, 2}, {1, 2}}, {{1, 2}, {3, 2}} -> {{4, 4}, {4, 2}, {1, 
    3}}, {{1, 2}, {3, 2}} -> {{4, 4}, {4, 2}, {1, 4}}, {{1, 2}, {3, 
    2}} -> {{4, 4}, {4, 2}, {2, 1}}, {{1, 2}, {3, 2}} -> {{4, 4}, {4, 
    2}, {2, 2}}, {{1, 2}, {3, 2}} -> {{4, 4}, {4, 2}, {2, 4}}, {{1, 
    2}, {3, 2}} -> {{4, 4}, {4, 2}, {4, 2}}, {{1, 2}, {3, 2}} -> {{4, 
    4}, {4, 4}, {1, 4}}, {{1, 2}, {3, 2}} -> {{4, 4}, {4, 4}, {2, 
    4}}, {{1, 2}, {3, 2}} -> {{4, 4}, {4, 4}, {4, 1}}, {{1, 2}, {3, 
    2}} -> {{4, 4}, {4, 4}, {4, 2}}, {{1, 2}, {3, 2}} -> {{4, 1}, {1, 
    2}, {2, 5}}, {{1, 2}, {3, 2}} -> {{4, 1}, {1, 2}, {3, 5}}, {{1, 
    2}, {3, 2}} -> {{4, 1}, {1, 3}, {2, 5}}, {{1, 2}, {3, 2}} -> {{4, 
    1}, {1, 3}, {3, 5}}, {{1, 2}, {3, 2}} -> {{4, 1}, {1, 4}, {4, 
    5}}, {{1, 2}, {3, 2}} -> {{4, 1}, {4, 1}, {1, 5}}, {{1, 2}, {3, 
    2}} -> {{4, 1}, {4, 1}, {2, 5}}, {{1, 2}, {3, 2}} -> {{4, 1}, {4, 
    1}, {3, 5}}, {{1, 2}, {3, 2}} -> {{4, 1}, {4, 1}, {4, 5}}, {{1, 
    2}, {3, 2}} -> {{4, 1}, {4, 2}, {1, 5}}, {{1, 2}, {3, 2}} -> {{4, 
    1}, {4, 2}, {3, 5}}, {{1, 2}, {3, 2}} -> {{4, 1}, {4, 2}, {4, 
    5}}, {{1, 2}, {3, 2}} -> {{4, 1}, {4, 3}, {1, 5}}, {{1, 2}, {3, 
    2}} -> {{4, 1}, {4, 3}, {2, 5}}, {{1, 2}, {3, 2}} -> {{4, 1}, {4, 
    3}, {4, 5}}, {{1, 2}, {3, 2}} -> {{4, 2}, {2, 1}, {1, 5}}, {{1, 
    2}, {3, 2}} -> {{4, 2}, {2, 1}, {3, 5}}, {{1, 2}, {3, 2}} -> {{4, 
    2}, {2, 4}, {4, 5}}, {{1, 2}, {3, 2}} -> {{4, 2}, {4, 1}, {2, 
    5}}, {{1, 2}, {3, 2}} -> {{4, 2}, {4, 2}, {1, 5}}, {{1, 2}, {3, 
    2}} -> {{4, 2}, {4, 2}, {2, 5}}, {{1, 2}, {3, 2}} -> {{4, 2}, {4, 
    2}, {4, 5}}, {{1, 2}, {3, 2}} -> {{4, 4}, {1, 4}, {1, 5}}, {{1, 
    2}, {3, 2}} -> {{4, 4}, {1, 4}, {2, 5}}, {{1, 2}, {3, 2}} -> {{4, 
    4}, {1, 4}, {3, 5}}, {{1, 2}, {3, 2}} -> {{4, 4}, {2, 4}, {1, 
    5}}, {{1, 2}, {3, 2}} -> {{4, 4}, {2, 4}, {2, 5}}, {{1, 2}, {3, 
    2}} -> {{4, 4}, {4, 1}, {1, 5}}, {{1, 2}, {3, 2}} -> {{4, 4}, {4, 
    1}, {2, 5}}, {{1, 2}, {3, 2}} -> {{4, 4}, {4, 1}, {3, 5}}, {{1, 
    2}, {3, 2}} -> {{4, 4}, {4, 1}, {4, 5}}, {{1, 2}, {3, 2}} -> {{4, 
    4}, {4, 2}, {1, 5}}, {{1, 2}, {3, 2}} -> {{4, 4}, {4, 2}, {2, 
    5}}, {{1, 2}, {3, 2}} -> {{4, 4}, {4, 2}, {4, 5}}, {{1, 2}, {3, 
    2}} -> {{4, 1}, {1, 2}, {5, 1}}, {{1, 2}, {3, 2}} -> {{4, 1}, {1, 
    2}, {5, 2}}, {{1, 2}, {3, 2}} -> {{4, 1}, {1, 2}, {5, 3}}, {{1, 
    2}, {3, 2}} -> {{4, 1}, {1, 3}, {5, 1}}, {{1, 2}, {3, 2}} -> {{4, 
    1}, {1, 3}, {5, 2}}, {{1, 2}, {3, 2}} -> {{4, 1}, {1, 3}, {5, 
    3}}, {{1, 2}, {3, 2}} -> {{4, 1}, {1, 4}, {5, 4}}, {{1, 2}, {3, 
    2}} -> {{4, 1}, {4, 1}, {5, 1}}, {{1, 2}, {3, 2}} -> {{4, 1}, {4, 
    1}, {5, 2}}, {{1, 2}, {3, 2}} -> {{4, 1}, {4, 1}, {5, 3}}, {{1, 
    2}, {3, 2}} -> {{4, 1}, {4, 1}, {5, 4}}, {{1, 2}, {3, 2}} -> {{4, 
    1}, {4, 2}, {5, 1}}, {{1, 2}, {3, 2}} -> {{4, 1}, {4, 2}, {5, 
    3}}, {{1, 2}, {3, 2}} -> {{4, 1}, {4, 2}, {5, 4}}, {{1, 2}, {3, 
    2}} -> {{4, 1}, {4, 3}, {5, 1}}, {{1, 2}, {3, 2}} -> {{4, 1}, {4, 
    3}, {5, 2}}, {{1, 2}, {3, 2}} -> {{4, 1}, {4, 3}, {5, 4}}, {{1, 
    2}, {3, 2}} -> {{4, 2}, {2, 1}, {5, 1}}, {{1, 2}, {3, 2}} -> {{4, 
    2}, {2, 1}, {5, 2}}, {{1, 2}, {3, 2}} -> {{4, 2}, {2, 1}, {5, 
    3}}, {{1, 2}, {3, 2}} -> {{4, 2}, {2, 4}, {5, 4}}, {{1, 2}, {3, 
    2}} -> {{4, 2}, {4, 1}, {5, 2}}, {{1, 2}, {3, 2}} -> {{4, 2}, {4, 
    2}, {5, 1}}, {{1, 2}, {3, 2}} -> {{4, 2}, {4, 2}, {5, 2}}, {{1, 
    2}, {3, 2}} -> {{4, 2}, {4, 2}, {5, 4}}, {{1, 2}, {3, 2}} -> {{4, 
    4}, {1, 4}, {5, 1}}, {{1, 2}, {3, 2}} -> {{4, 4}, {1, 4}, {5, 
    2}}, {{1, 2}, {3, 2}} -> {{4, 4}, {1, 4}, {5, 3}}, {{1, 2}, {3, 
    2}} -> {{4, 4}, {1, 4}, {5, 4}}, {{1, 2}, {3, 2}} -> {{4, 4}, {2, 
    4}, {5, 1}}, {{1, 2}, {3, 2}} -> {{4, 4}, {2, 4}, {5, 2}}, {{1, 
    2}, {3, 2}} -> {{4, 4}, {2, 4}, {5, 4}}, {{1, 2}, {3, 2}} -> {{4, 
    4}, {4, 1}, {5, 1}}, {{1, 2}, {3, 2}} -> {{4, 4}, {4, 1}, {5, 
    2}}, {{1, 2}, {3, 2}} -> {{4, 4}, {4, 1}, {5, 3}}, {{1, 2}, {3, 
    2}} -> {{4, 4}, {4, 1}, {5, 4}}, {{1, 2}, {3, 2}} -> {{4, 4}, {4, 
    2}, {5, 1}}, {{1, 2}, {3, 2}} -> {{4, 4}, {4, 2}, {5, 2}}, {{1, 
    2}, {3, 2}} -> {{4, 4}, {4, 2}, {5, 4}}, {{1, 2}, {3, 2}} -> {{4, 
    1}, {1, 5}, {2, 3}}, {{1, 2}, {3, 2}} -> {{4, 1}, {1, 5}, {2, 
    5}}, {{1, 2}, {3, 2}} -> {{4, 1}, {1, 5}, {3, 2}}, {{1, 2}, {3, 
    2}} -> {{4, 1}, {1, 5}, {3, 5}}, {{1, 2}, {3, 2}} -> {{4, 1}, {1, 
    5}, {5, 2}}, {{1, 2}, {3, 2}} -> {{4, 1}, {1, 5}, {5, 3}}, {{1, 
    2}, {3, 2}} -> {{4, 1}, {4, 5}, {1, 2}}, {{1, 2}, {3, 2}} -> {{4, 
    1}, {4, 5}, {1, 3}}, {{1, 2}, {3, 2}} -> {{4, 1}, {4, 5}, {1, 
    5}}, {{1, 2}, {3, 2}} -> {{4, 1}, {4, 5}, {2, 1}}, {{1, 2}, {3, 
    2}} -> {{4, 1}, {4, 5}, {2, 3}}, {{1, 2}, {3, 2}} -> {{4, 1}, {4, 
    5}, {2, 4}}, {{1, 2}, {3, 2}} -> {{4, 1}, {4, 5}, {3, 1}}, {{1, 
    2}, {3, 2}} -> {{4, 1}, {4, 5}, {3, 2}}, {{1, 2}, {3, 2}} -> {{4, 
    1}, {4, 5}, {3, 4}}, {{1, 2}, {3, 2}} -> {{4, 2}, {2, 5}, {1, 
    3}}, {{1, 2}, {3, 2}} -> {{4, 2}, {2, 5}, {1, 5}}, {{1, 2}, {3, 
    2}} -> {{4, 2}, {2, 5}, {5, 1}}, {{1, 2}, {3, 2}} -> {{4, 2}, {4, 
    5}, {1, 2}}, {{1, 2}, {3, 2}} -> {{4, 2}, {4, 5}, {1, 3}}, {{1, 
    2}, {3, 2}} -> {{4, 2}, {4, 5}, {1, 4}}, {{1, 2}, {3, 2}} -> {{4, 
    2}, {4, 5}, {2, 1}}, {{1, 2}, {3, 2}} -> {{4, 2}, {4, 5}, {2, 
    5}}, {{1, 2}, {3, 2}} -> {{4, 4}, {4, 5}, {1, 4}}, {{1, 2}, {3, 
    2}} -> {{4, 4}, {4, 5}, {1, 5}}, {{1, 2}, {3, 2}} -> {{4, 4}, {4, 
    5}, {2, 4}}, {{1, 2}, {3, 2}} -> {{4, 4}, {4, 5}, {2, 5}}, {{1, 
    2}, {3, 2}} -> {{4, 4}, {4, 5}, {5, 1}}, {{1, 2}, {3, 2}} -> {{4, 
    4}, {4, 5}, {5, 2}}, {{1, 2}, {3, 2}} -> {{4, 1}, {1, 5}, {2, 
    6}}, {{1, 2}, {3, 2}} -> {{4, 1}, {1, 5}, {3, 6}}, {{1, 2}, {3, 
    2}} -> {{4, 1}, {1, 5}, {5, 6}}, {{1, 2}, {3, 2}} -> {{4, 1}, {4, 
    5}, {1, 6}}, {{1, 2}, {3, 2}} -> {{4, 1}, {4, 5}, {2, 6}}, {{1, 
    2}, {3, 2}} -> {{4, 1}, {4, 5}, {3, 6}}, {{1, 2}, {3, 2}} -> {{4, 
    1}, {4, 5}, {4, 6}}, {{1, 2}, {3, 2}} -> {{4, 2}, {2, 5}, {1, 
    6}}, {{1, 2}, {3, 2}} -> {{4, 2}, {2, 5}, {5, 6}}, {{1, 2}, {3, 
    2}} -> {{4, 2}, {4, 5}, {1, 6}}, {{1, 2}, {3, 2}} -> {{4, 2}, {4, 
    5}, {2, 6}}, {{1, 2}, {3, 2}} -> {{4, 2}, {4, 5}, {4, 6}}, {{1, 
    2}, {3, 2}} -> {{4, 1}, {1, 5}, {6, 1}}, {{1, 2}, {3, 2}} -> {{4, 
    1}, {1, 5}, {6, 2}}, {{1, 2}, {3, 2}} -> {{4, 1}, {1, 5}, {6, 
    3}}, {{1, 2}, {3, 2}} -> {{4, 1}, {1, 5}, {6, 5}}, {{1, 2}, {3, 
    2}} -> {{4, 1}, {4, 5}, {6, 1}}, {{1, 2}, {3, 2}} -> {{4, 1}, {4, 
    5}, {6, 2}}, {{1, 2}, {3, 2}} -> {{4, 1}, {4, 5}, {6, 3}}, {{1, 
    2}, {3, 2}} -> {{4, 1}, {4, 5}, {6, 4}}, {{1, 2}, {3, 2}} -> {{4, 
    2}, {2, 5}, {6, 1}}, {{1, 2}, {3, 2}} -> {{4, 2}, {2, 5}, {6, 
    2}}, {{1, 2}, {3, 2}} -> {{4, 2}, {2, 5}, {6, 5}}, {{1, 2}, {3, 
    2}} -> {{4, 2}, {4, 5}, {6, 1}}, {{1, 2}, {3, 2}} -> {{4, 2}, {4, 
    5}, {6, 2}}, {{1, 2}, {3, 2}} -> {{4, 2}, {4, 5}, {6, 4}}, {{1, 
    2}, {3, 2}} -> {{4, 1}, {5, 1}, {2, 3}}, {{1, 2}, {3, 2}} -> {{4, 
    1}, {5, 1}, {3, 2}}, {{1, 2}, {3, 2}} -> {{4, 2}, {5, 2}, {1, 
    3}}, {{1, 2}, {3, 2}} -> {{4, 4}, {5, 4}, {1, 5}}, {{1, 2}, {3, 
    2}} -> {{4, 4}, {5, 4}, {2, 5}}, {{1, 2}, {3, 2}} -> {{4, 4}, {5, 
    4}, {5, 1}}, {{1, 2}, {3, 2}} -> {{4, 4}, {5, 4}, {5, 2}}, {{1, 
    2}, {3, 2}} -> {{4, 1}, {5, 1}, {2, 6}}, {{1, 2}, {3, 2}} -> {{4, 
    1}, {5, 1}, {3, 6}}, {{1, 2}, {3, 2}} -> {{4, 2}, {5, 2}, {1, 
    6}}, {{1, 2}, {3, 2}} -> {{4, 1}, {5, 1}, {6, 1}}, {{1, 2}, {3, 
    2}} -> {{4, 1}, {5, 1}, {6, 2}}, {{1, 2}, {3, 2}} -> {{4, 1}, {5, 
    1}, {6, 3}}, {{1, 2}, {3, 2}} -> {{4, 1}, {5, 2}, {6, 3}}, {{1, 
    2}, {3, 2}} -> {{4, 2}, {5, 2}, {6, 1}}, {{1, 2}, {3, 2}} -> {{4, 
    2}, {5, 2}, {6, 2}}, {{1, 2}, {3, 2}} -> {{4, 5}, {4, 1}, {2, 
    5}}, {{1, 2}, {3, 2}} -> {{4, 5}, {4, 1}, {3, 5}}, {{1, 2}, {3, 
    2}} -> {{4, 5}, {4, 1}, {5, 1}}, {{1, 2}, {3, 2}} -> {{4, 5}, {4, 
    1}, {5, 2}}, {{1, 2}, {3, 2}} -> {{4, 5}, {4, 1}, {5, 3}}, {{1, 
    2}, {3, 2}} -> {{4, 5}, {4, 2}, {1, 5}}, {{1, 2}, {3, 2}} -> {{4, 
    5}, {4, 2}, {5, 1}}, {{1, 2}, {3, 2}} -> {{4, 5}, {4, 2}, {5, 
    2}}, {{1, 2}, {3, 2}} -> {{4, 5}, {4, 5}, {1, 4}}, {{1, 2}, {3, 
    2}} -> {{4, 5}, {4, 5}, {1, 5}}, {{1, 2}, {3, 2}} -> {{4, 5}, {4, 
    5}, {2, 4}}, {{1, 2}, {3, 2}} -> {{4, 5}, {4, 5}, {2, 5}}, {{1, 
    2}, {3, 2}} -> {{4, 5}, {4, 5}, {4, 1}}, {{1, 2}, {3, 2}} -> {{4, 
    5}, {4, 5}, {4, 2}}, {{1, 2}, {3, 2}} -> {{4, 5}, {4, 5}, {5, 
    1}}, {{1, 2}, {3, 2}} -> {{4, 5}, {4, 5}, {5, 2}}, {{1, 2}, {3, 
    2}} -> {{4, 5}, {5, 1}, {1, 2}}, {{1, 2}, {3, 2}} -> {{4, 5}, {5, 
    1}, {1, 3}}, {{1, 2}, {3, 2}} -> {{4, 5}, {5, 1}, {2, 1}}, {{1, 
    2}, {3, 2}} -> {{4, 5}, {5, 1}, {2, 3}}, {{1, 2}, {3, 2}} -> {{4, 
    5}, {5, 1}, {3, 1}}, {{1, 2}, {3, 2}} -> {{4, 5}, {5, 1}, {3, 
    2}}, {{1, 2}, {3, 2}} -> {{4, 5}, {5, 2}, {1, 2}}, {{1, 2}, {3, 
    2}} -> {{4, 5}, {5, 2}, {1, 3}}, {{1, 2}, {3, 2}} -> {{4, 5}, {5, 
    2}, {2, 1}}, {{1, 2}, {3, 2}} -> {{4, 5}, {5, 4}, {1, 4}}, {{1, 
    2}, {3, 2}} -> {{4, 5}, {5, 4}, {2, 4}}, {{1, 2}, {3, 2}} -> {{4, 
    5}, {5, 4}, {4, 1}}, {{1, 2}, {3, 2}} -> {{4, 5}, {5, 4}, {4, 
    2}}, {{1, 2}, {3, 2}} -> {{4, 5}, {4, 1}, {5, 6}}, {{1, 2}, {3, 
    2}} -> {{4, 5}, {4, 2}, {5, 6}}, {{1, 2}, {3, 2}} -> {{4, 5}, {5, 
    1}, {1, 6}}, {{1, 2}, {3, 2}} -> {{4, 5}, {5, 1}, {2, 6}}, {{1, 
    2}, {3, 2}} -> {{4, 5}, {5, 1}, {3, 6}}, {{1, 2}, {3, 2}} -> {{4, 
    5}, {5, 2}, {1, 6}}, {{1, 2}, {3, 2}} -> {{4, 5}, {5, 2}, {2, 
    6}}, {{1, 2}, {3, 2}} -> {{4, 5}, {4, 1}, {6, 5}}, {{1, 2}, {3, 
    2}} -> {{4, 5}, {4, 2}, {6, 5}}, {{1, 2}, {3, 2}} -> {{4, 5}, {5, 
    1}, {6, 1}}, {{1, 2}, {3, 2}} -> {{4, 5}, {5, 1}, {6, 2}}, {{1, 
    2}, {3, 2}} -> {{4, 5}, {5, 1}, {6, 3}}, {{1, 2}, {3, 2}} -> {{4, 
    5}, {5, 1}, {6, 5}}, {{1, 2}, {3, 2}} -> {{4, 5}, {5, 2}, {6, 
    1}}, {{1, 2}, {3, 2}} -> {{4, 5}, {5, 2}, {6, 2}}, {{1, 2}, {3, 
    2}} -> {{4, 5}, {5, 2}, {6, 5}}, {{1, 2}, {3, 2}} -> {{4, 5}, {4, 
    6}, {1, 4}}, {{1, 2}, {3, 2}} -> {{4, 5}, {4, 6}, {1, 5}}, {{1, 
    2}, {3, 2}} -> {{4, 5}, {4, 6}, {2, 4}}, {{1, 2}, {3, 2}} -> {{4, 
    5}, {4, 6}, {2, 5}}, {{1, 2}, {3, 2}} -> {{4, 5}, {4, 6}, {5, 
    1}}, {{1, 2}, {3, 2}} -> {{4, 5}, {4, 6}, {5, 2}}, {{1, 2}, {3, 
    2}} -> {{4, 5}, {5, 6}, {1, 6}}, {{1, 2}, {3, 2}} -> {{4, 5}, {5, 
    6}, {2, 6}}, {{1, 2}, {3, 2}} -> {{4, 5}, {5, 6}, {6, 1}}, {{1, 
    2}, {3, 2}} -> {{4, 5}, {5, 6}, {6, 2}}};
gres = {#[[1]], CanonicalGraph[#[[2]]]} & /@ 
   ResourceFunction["ParallelMapMonitored"][
    Function[ru, 
     TimeConstrained[
      With[{gs = 
         ResourceFunction["CausalConnectionGraph"][
          ResourceFunction["WolframModel"][ru, Automatic, #, 
            "LayeredCausalGraph"] &, 5, 10]}, {ru, IndexGraph[gs]}], 
      5]], allrules32];
Text /@ Counts[
  Framed[Graph[Last[#], ImageSize -> {80, 80}, 
      EdgeStyle -> 
       Directive[RGBColor[0.68, 0.3, 0.], Thickness[Large]], 
      VertexStyle -> RGBColor[0.853, 0.65, 0.3], VertexSize -> .15], 
     FrameStyle -> LightGray] & /@ gres]

Nearly 40% of these rules give causal graphs that terminate before 10 steps, corresponding to “universe-scale” spacelike singularities. Of the remainder, 76% produce no event horizons (at least at the steps we’re measuring). The next most common behavior is to generate “subuniverses” separated by cosmological event horizons. And then there are black holes, which in aggregate are produced in about 3% of cases. The most common configuration is a single black hole, followed by a black hole combined with a subuniverse, two black holes, etc.
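As a cross-check on these percentages, here is a minimal sketch (assuming the gres list computed above; not part of the original) that tallies how many rules yield each distinct canonical causal connection graph, as plain counts and fractions rather than framed pictures:

(* tally the distinct canonical causal connection graphs among the enumerated rules *)
ReverseSort[Counts[Last /@ gres]]
(* the same tallies, as fractions of the total number of rules *)
N[#/Length[gres]] & /@ ReverseSort[Counts[Last /@ gres]]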

There are also more exotic cases, like the last case shown. The rule here is:

RulePlot
&#10005
RulePlot[ResourceFunction[
   "WolframModel"][{{1, 2}, {3, 2}} -> {{4, 1}, {4, 3}, {1, 2}}]]

And here is the causal graph:

ResourceFunction
&#10005
ResourceFunction[
  "WolframModel"][{{1, 2}, {3, 2}} -> {{4, 1}, {4, 3}, {1, 
    2}}, Automatic, 17, "LayeredCausalGraph"]

This is how the causal connection graph develops at successive steps:

Table
&#10005
Table[Graph[
  ResourceFunction["CausalConnectionGraph"][
   ResourceFunction[
      "WolframModel"][{{1, 2}, {3, 2}} -> {{4, 1}, {4, 3}, {1, 2}}, 
     Automatic, #, "LayeredCausalGraph"] &, t, 20], VertexSize -> .1, 
  ImageSize -> 60], {t, 8}]

It takes a few steps for “black holes” to form. But in the end, we get what we can interpret as a nested set of black holes. The causally equivalent events corresponding to the node at the top have future light cones that eventually cover the universe. But events associated with “lower” nodes yield progressively more localized future light cones:

With
&#10005
With[{gg = 
     Graph[ResourceFunction[
        "WolframModel"][{{1, 2}, {3, 2}} -> {{4, 1}, {4, 3}, {1, 2}}, 
       Automatic, 14, "LayeredCausalGraph"]]}, 
   HighlightGraph[
    gg, {Style[Subgraph[gg, VertexOutComponent[gg, #, 20]], Red, 
      Thick]}, ImageSize -> 200, AspectRatio -> 1/2]] & /@ {12, 13, 
  14, 15}

Here’s what’s going on in the spatial hypergraph:

ResourceFunction
&#10005
ResourceFunction["WolframModelPlot"][#, ImageSize -> Tiny] & /@ 
 ResourceFunction[
   "WolframModel"][{{1, 2}, {3, 2}} -> {{4, 1}, {4, 3}, {1, 2}}, 
  Automatic, 15, "StatesList"]

Once again, it’s at first pretty obscure where the “black holes” are. What seems to be happening is that the “exterior of the universe” is the main loop; information can get propagated into the “long hair”, but then can’t get out again. It’s mildly helpful to look at the individual events:

Show
&#10005
Show[#, ImageSize -> {50, 50}] & /@ 
 ResourceFunction[
   "WolframModel"][{{1, 2}, {3, 2}} -> {{4, 1}, {4, 3}, {1, 2}}, 
  Automatic, 12, "EventsStatesPlotsList"]

What is the analog of all this in ordinary continuum general relativity? Presumably it’s some kind of nested black hole phenomenon. Inside the event horizon of a black hole other black holes form. Perhaps all these black holes in some sense eventually merge (as the singularity theorem might seem to suggest), or perhaps there’s some structure with multiple levels of event horizons, as roughly happens in the Kerr solution.

Properties of Black Holes

When there’s a black hole in our models, what can one tell about what’s inside it? Causal edges go into the event horizon, but none come out. However, that does not mean that the structure of the spatial hypergraph doesn’t reflect what’s inside the event horizon. After all, every causal edge that crosses the event horizon originated in an event outside the event horizon. And that event represented some update in the spatial hypergraph. Or, in other words, while causal edges are “lost” into the event horizon, the “memory” of what they did is progressively imprinted in the spatial hypergraph outside the event horizon.

In ordinary general relativity, there are “no-hair” theorems that say that the gravitational effects of a black hole depend only on a few parameters, such as its overall mass and overall angular momentum. In our models, the overall mass is essentially just determined by the number of causal edges that end up crossing the event horizon. (Angular momentum is related to a kind of vorticity in the causal graph.) So the no-hair theorem for mass says that when there is an event horizon, none of the details of these causal edges matter; only their total number. It’s not clear why this would be true, but it seems conceivable that it could be an essentially purely graph theoretic result.
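To make the edge counting concrete, here is a hedged sketch (an illustration, not the actual mass computation): take the black-hole-like causal graph used earlier, treat the future light cone of one event (event 12, the same rule and event used in the highlighted light-cone plots above) as the “interior”, and count the causal edges that enter it from outside:

(* a sketch: count causal edges that cross from outside into the future light cone of event 12 *)
With[{gg = 
   Graph[ResourceFunction["WolframModel"][{{1, 2}, {3, 2}} -> {{4, 1}, {4, 3}, {1, 2}}, 
     Automatic, 14, "LayeredCausalGraph"]]}, 
 With[{interior = VertexOutComponent[gg, 12]}, 
  Count[EdgeList[gg], 
   DirectedEdge[a_, b_] /; ! MemberQ[interior, a] && MemberQ[interior, b]]]]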

There’s another potential effect, however. Consider the following causal graph:

With
&#10005
With[{gg = 
     Graph[ResourceFunction[
        "WolframModel"][{{1, 2}, {1, 3}} -> {{1, 2}, {2, 4}, {3, 4}}, 
       Automatic, 13, "LayeredCausalGraph"]]}, 
   HighlightGraph[
    gg, {Style[Subgraph[gg, VertexOutComponent[gg, #, 20]], Red, 
      Thick], Style[Subgraph[gg, {8, 11, 13}], Purple, 
      Thickness[.025]]}, ImageSize -> 300]] &@13

The edges highlighted in red correspond in effect to the interior of an event horizon. The edges highlighted in purple form a pair. The one on the left will cross the event horizon, and “fall into the black hole”. The one on the right will escape to affect the rest of the universe.

Later on, we will see how similar phenomena occur in the multiway causal graph, and we will discuss how this relates to quantum phenomena such as Hawking radiation. For now, though, everything we’re discussing is purely classical. Recall that while the flux of causal edges through spacelike surfaces corresponds to energy, the flux of causal edges through timelike surfaces corresponds to momentum. On average, the net momentum associated with a pair of causal edges should always cancel out. But if one of the edges crosses inside an event horizon, then whatever momentum it carries will just be aggregated into the total momentum of what’s inside the event horizon, effectively leaving the momentum associated with the other edge uncanceled.

What does this mean? It needs a more detailed analysis. But there seems to be some reason to think that these uncanceled causal edges should lead to a net momentum flux away from the black hole—a kind of “black hole wind”. The effect will be small, because it’ll essentially be proportional to the elementary length divided by the black hole radius (or, alternatively, the black hole surface area times the elementary length divided by the black hole volume). But conceivably in some circumstances—notably with small black holes—it could have measurable consequences.

By the way, it’s worth understanding the “physical origin” of this “wind”. Essentially it’s the result of a classical analog of vacuum polarization. Whereas in ordinary general relativity spacetime is a continuum, in our models it consists of discrete components, and this discreteness causes there to be “fluctuations” that are sensitive to the presence of the event horizon.

More Complicated Cases

If one starts looking at causal graphs for our models, there are plenty of complex things one sees, even with very simple underlying rules. Here are a few examples with various forms of overall causal structure:

Labeled
&#10005
Labeled[ResourceFunction["WolframModel"][#, Automatic, 30, 
    "LayeredCausalGraph"], 
   RulePlot[ResourceFunction["WolframModel"][#]]] &[{{1, 2}, {3, 
    2}} -> {{1, 3}, {1, 2}, {4, 3}}]
Labeled
&#10005
Labeled[ResourceFunction["WolframModel"][#, Automatic, 35, 
    "LayeredCausalGraph"], 
   RulePlot[ResourceFunction["WolframModel"][#]]] &[{{1, 1}, {1, 
    2}} -> {{2, 2}, {1, 2}, {1, 2}, {1, 3}}]
Labeled
&#10005
Labeled[ResourceFunction["WolframModel"][#, Automatic, 30, 
    "LayeredCausalGraph"], 
   RulePlot[ResourceFunction["WolframModel"][#]]] &[{{1, 2}, {3, 
    2}} -> {{2, 2}, {4, 1}, {1, 3}}]

Other Exotic Phenomena in Spacetime, etc.?

Event horizons and singularities can be surprising and confusing in the Einstein equations, but the mathematical structure of the equations at least in principle allows them to be described there. But might there be other kinds of exotic phenomena in spacetime that just can’t be described using this mathematical structure? Our models definitely suggest so.

One example we already encountered is disconnection in the spatial hypergraph. Another major class of phenomena involves what amounts to dynamical change of dimension. In the timelike singularities we saw above, parts of the spatial hypergraph effectively degenerate to being zero-dimensional.

But in general, the whole or part of the spatial hypergraph can change its effective dimension. I’ve discussed elsewhere the possibility that the early universe might have had a larger effective dimension than the current universe. But what would happen if there was just a region of higher dimension in our current universe?

It’s not clear how to make a traditional continuum mathematical description of this, though presumably if one tried to describe it in terms of a manifold, it would show up as a region with infinite curvature. But given a finite—but perhaps very large—hypergraph, it’s much more straightforward to see how regions of different (approximate) dimension might exist together. Inevitably, though, there will be lots of intricate issues to unravel in seeing exactly how all the appropriate mathematical limits fit together.

Nevertheless, one can at least get some sense of what “dimension anomalies” might be like. A region of higher effective dimension will—by definition—have a denser pattern of connections than the surrounding parts of the spatial hypergraph. It’s not clear what the boundary of the structure will be like. But conceivably, to be stable, it will have to correspond to an event horizon of some sort.
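One way to make “denser pattern of connections” quantitative is to estimate effective dimension from the growth of geodesic balls, with ball volume growing roughly like r^d. Here is a minimal sketch of that measurement (an illustration only: it uses an ordinary 2D grid graph as a stand-in for a patch of the spatial hypergraph, and the center vertex and radii are arbitrary choices); for the grid, the estimates should drift toward 2:

(* estimate effective dimension d from ball-volume growth V(r) ~ r^d *)
ballVolume[g_, v_, r_] := Length[VertexOutComponent[g, v, r]];
With[{g = GridGraph[{30, 30}], v = 435}, 
 Table[{r, N[Log[ballVolume[g, v, r]/ballVolume[g, v, r - 1]]/Log[r/(r - 1)]]}, {r, 2, 6}]]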

One thing that’s fairly clear is that a region of higher dimension will tend to involve more causal edges than the surrounding hypergraph, and will therefore behave like a region of higher energy, so that it’ll show up as a tiny region of space in which energy (or mass) is concentrated.

But of course we’re already very familiar with one example of tiny regions of space in which energy (or mass) is concentrated: elementary particles, like electrons. I’ve always assumed that particles in our models are some kind of stable localized structure in the spatial hypergraph. Perhaps their stability has an essentially topological or graph theoretical origin. But perhaps instead it’s more connected to some kind of event-horizon-like phenomenon—that would be visible in the causal graph.

But what’s “inside” such a particle? It could be a collection of elements of the spatial hypergraph that are connected to form some higher (or effectively infinite) dimensional space. It could also simply be that at scales involving fairly few elements in the hypergraph, there is only a discrete set of possible forms for the local region of the hypergraph that support something like an event horizon.

As an example of the kinds of localization one can see in a causal graph, here are the forms of forward light cones one gets starting from different events in a particular causal graph:

With
&#10005
With[{gg = 
     Graph[ResourceFunction[
        "WolframModel"][{{1, 2}, {1, 3}} -> {{2, 4}, {4, 3}, {3, 1}}, 
       Automatic, 25, "LayeredCausalGraph"]]}, 
   HighlightGraph[
    gg, {Style[Subgraph[gg, VertexOutComponent[gg, #, 60]], Red, 
      Thick]}, ImageSize -> 150, AspectRatio -> 1/2]] & /@ 
 Range[50, 60]

It’s not clear what the best signature to use in searching for particles in our models will be. But it seems like an intriguing—and appealing—possibility that in some sense particles like electrons are “generalized black holes”, or in effect that the apparent “perfection” of black holes is also manifest in the “perfection” of the smallest distinct constituents of matter.

(This reminds me of a personal anecdote. Back around 1975, when I was about 15 years old, I attended an evening physics talk in Oxford about black holes and their quantum features. After the talk, I went up to the speaker and asked if perhaps electrons could be black holes. “Absolutely not”, he said, rather dismissively. Well, we’ll see…)

What about other kinds of “anomalies in spacetime”? In our models, the structure of space is maintained by continual activity in the spatial hypergraph. Presumably in most situations such activity will on a large scale reach a certain statistically uniform “equilibrium” state. But it is conceivable that different regions of space could show different overall levels of activity—which would lead to different “vacuum energy densities”, effectively corresponding to different values of the cosmological constant.

It is also conceivable that there could be different domains in space, all with the same overall activity level (and thus the same energy density) but with different configurations of some large-scale (perhaps effectively topological) feature. And in such a case—as in many cosmological models—one can expect things like domain walls.

In ordinary general relativity, the basic topology of space remains fixed over the course of time. In our models, it can dynamically change, with the structure of space potentially being “re-knitted” in different topological configurations. I haven’t explicitly found examples of rules that generate something like a wormhole, but I’m sure they exist.

I mentioned above the possibility of small regions of space having different effective dimensions. It’s also possible that there could be extended structures, such as tubes, with different effective dimensions. And this raises the intriguing possibility of what one might call “space tunnels” in which there is a tube of “higher-dimensional space” connecting two points in the spatial hypergraph.

No doubt there are many possibilities, and many kinds of exotic phenomena that can occur. Some may be visible directly in the large-scale structure of the spatial hypergraph, while others may be more obvious in the causal graph, or the causal connection graph. It is interesting to imagine classifying different possibilities, although it seems almost inevitable that there will ultimately be undecidable aspects to essentially any classification.

Some exotic phenomena may rely on the discrete structure of our models, and not have meaningful continuum limits. But my guess is that there will be plenty of exotic phenomena that can occur even in continuum models of spacetime, but which just haven’t been looked for before.

Cauchy Surfaces, Closed Timelike Curves, etc.

The questions of causal structure that we’re considering here are already quite complicated. But there are even more issues to consider. One of them is the question of whether it is possible to define a definite notion of “progression of time” in spacetime. The causal graph in effect specifies the partial ordering of events in a system. But now we can ask whether there is a valid foliation that respects this partial ordering, in the sense that all events in a given slice are strictly “before” those in subsequent slices. In the continuum limit, this is effectively asking whether the dynamics of our system exhibit strong hyperbolicity.

One case where this will definitely not be seen is when there are closed timelike curves, represented by loops in the causal graph. But there are also plenty of other cases where there is no strict way to make a “thickness-1” foliation in which no event occurring within a given slice has a successor in the same slice, as in:

ResourceFunction
&#10005
ResourceFunction[
  "WolframModel"][{{1, 2}, {2, 3}} -> {{1, 2}, {2, 3}, {3, 4}}, {{1, 
   2}, {2, 3}}, 10, "LayeredCausalGraph"]

When it comes to taking a continuum limit, one should presumably “thicken” the foliations first. But there will inevitably still be cases where no limited amount of thickening will suffice to allow foliations to be created. Such cases will presumably correspond to failures of Cauchy development in the continuum limit. It seems likely that this phenomenon can be fairly directly connected to the singularities we have discussed above, though exactly how is not yet clear.
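As a small sketch of how these ordering questions can be phrased graph-theoretically (an illustration, not part of the original analysis): closed timelike curves would show up as directed cycles in the causal graph, and in their absence one can always extract at least one total order of events consistent with the causal partial order:

(* check for directed cycles (closed timelike curves) and extract a compatible total order *)
g = ResourceFunction["WolframModel"][{{1, 2}, {2, 3}} -> {{1, 2}, {2, 3}, {3, 4}}, 
   {{1, 2}, {2, 3}}, 10, "LayeredCausalGraph"];
{AcyclicGraphQ[g], Short[TopologicalSort[SimpleGraph[g]]]}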

The Quantum Case

Everything I’ve said so far has involved “purely classical” evolution of the spatial hypergraph, and the causal graph that represents causal relationships within this evolution. But we can also consider multiway evolution and the multiway causal graph, which in our models represent quantum behavior. So what happens to various forms of causal disconnection in this case?

The basic answer is that things rapidly become very complicated. Consider the very simple “cosmological horizon” causal graph:

ResourceFunction
&#10005
ResourceFunction[
  "WolframModel"][{{1, 2}, {1, 3}} -> {{2, 2}, {3, 1}, {1, 
    4}}, Automatic, 10, "LayeredCausalGraph"]
RulePlot
&#10005
RulePlot[ResourceFunction[
   "WolframModel"][{{1, 2}, {1, 3}} -> {{2, 2}, {3, 1}, {1, 4}}]]

Here is the corresponding multiway system:

ResourceFunction
&#10005
ResourceFunction["MultiwaySystem"][
 "WolframModel" -> {{{1, 2}, {1, 3}} -> {{2, 2}, {3, 1}, {1, 
      4}}}, {{{0, 0}, {0, 0}}}, 4, "StatesGraphStructure"]

And here is the multiway causal graph:

Graph
&#10005
Graph[ResourceFunction["MultiwaySystem"][
  "WolframModel" -> {{{1, 2}, {1, 3}} -> {{2, 2}, {3, 1}, {1, 
       4}}}, {{{0, 0}, {0, 0}}}, 4, "CausalGraphStructure"], 
 AspectRatio -> 1/2]

And here is its transitive reduction:

TransitiveReductionGraph
&#10005
TransitiveReductionGraph[
 Graph[ResourceFunction["MultiwaySystem"][
   "WolframModel" -> {{{1, 2}, {1, 3}} -> {{2, 2}, {3, 1}, {1, 
        4}}}, {{{0, 0}, {0, 0}}}, 4, "CausalGraphStructure"], 
  AspectRatio -> 1/2]]

This multiway causal graph includes both spacelike causal edges, of the kind shown in the ordinary causal graph, and branchlike causal edges, which represent causal relationships between different branches in the multiway graph. If one looked only at spacelike edges, one would see the spacetime event horizon. But branchlike edges can in effect connect across the event horizon.

Another way to say this is that quantum entanglements can span the event horizon. And if we look at the branchial graph associated with the multiway system above, we see that all states are entangled, even across the event horizon:

ResourceFunction
&#10005
ResourceFunction["MultiwaySystem"][
 "WolframModel" -> {{{1, 2}, {1, 3}} -> {{2, 2}, {3, 1}, {1, 
      4}}}, {{{0, 0}, {0, 0}}}, 4, "BranchialGraphStructure"]

There’s lots to understand here. And it’s all quite complicated. But let’s look at a simpler case:

RulePlot
&#10005
RulePlot[ResourceFunction[
   "WolframModel"][{{1, 1}} -> {{1, 1}, {1, 2}}]]

The ordinary spacetime causal graph in this case is:

Graph
&#10005
Graph[ResourceFunction["WolframModel"][{{1, 1}} -> {{1, 1}, {1, 2}}, 
  Automatic, 6, "LayeredCausalGraph"], ImageSize -> {Automatic, 400}]

The underlying spatial hypergraphs are

Show
&#10005
Show[#, ImageSize -> 50] & /@ 
 ResourceFunction["WolframModel"][{{1, 1}} -> {{1, 1}, {1, 2}}, 
  Automatic, 8, "StatesPlotsList"]

and the multiway graph is just

ResourceFunction
&#10005
ResourceFunction["MultiwaySystem"][
 "WolframModel" -> {{{1, 1}} -> {{1, 1}, {1, 2}}}, {{{0, 0}, {0, 
    0}}}, 5, "StatesGraphStructure"]

but the multiway causal graph is

ResourceFunction
&#10005
ResourceFunction["MultiwaySystem"][
 "WolframModel" -> {{{1, 1}} -> {{1, 1}, {1, 2}}}, {{{0, 0}, {0, 
    0}}}, 7, "CausalGraphStructure"]

although its transitive reduction is just:

TransitiveReductionGraph
&#10005
TransitiveReductionGraph[
 ResourceFunction["MultiwaySystem"][
  "WolframModel" -> {{{1, 1}} -> {{1, 1}, {1, 2}}}, {{{0, 0}, {0, 
     0}}}, 7, "CausalGraphStructure"]]

As another example, consider

RulePlot
&#10005
RulePlot[ResourceFunction[
   "WolframModel"][{{1, 1}} -> {{1, 1}, {1, 1}}]]

whose ordinary causal graph is:

Graph
&#10005
Graph[ResourceFunction["WolframModel"][{{1, 1}} -> {{1, 1}, {1, 1}}, 
  Automatic, 6, "LayeredCausalGraph"], ImageSize -> {Automatic, 400}]

The multiway causal graph is:

ResourceFunction
&#10005
ResourceFunction["MultiwaySystem"][
 "WolframModel" -> {{{1, 1}} -> {{1, 1}, {1, 1}}}, {{{0, 0}, {0, 
    0}}}, 8, "CausalGraphStructure"]

But once again the transitive reduction of this graph is just:

TransitiveReductionGraph
&#10005
TransitiveReductionGraph[
 ResourceFunction["MultiwaySystem"][
  "WolframModel" -> {{{1, 1}} -> {{1, 1}, {1, 1}}}, {{{0, 0}, {0, 
     0}}}, 8, "CausalGraphStructure"]]

As a marginally more realistic example, consider the minimal “black-hole-like” case from above:

RulePlot
&#10005
RulePlot[ResourceFunction[
   "WolframModel"][{{1, 2}, {2, 3}} -> {{4, 1}, {4, 3}, {1, 2}}]]

The ordinary causal graph in this case is:

ResourceFunction
&#10005
ResourceFunction[
  "WolframModel"][{{1, 2}, {2, 3}} -> {{4, 1}, {4, 3}, {1, 
    2}}, Automatic, 10, "LayeredCausalGraph"]

The multiway causal graph in this case is:

Graph
&#10005
Graph[ResourceFunction["MultiwaySystem"][
  "WolframModel" -> {{{1, 2}, {2, 3}} -> {{4, 1}, {4, 3}, {1, 
       2}}}, {{{0, 0}, {0, 0}}}, 6, "CausalGraphStructure"], 
 AspectRatio -> 1/2]

The transitive reduction of this is:

TransitiveReductionGraph
&#10005
TransitiveReductionGraph[%]

Meanwhile, the branchial graph has the structure:

ResourceFunction
&#10005
ResourceFunction["MultiwaySystem"][
 "WolframModel" -> {{{1, 2}, {2, 3}} -> {{4, 1}, {4, 3}, {1, 
      2}}}, {{{0, 0}, {0, 0}}}, 6, "BranchialGraphStructure"]

And once again, even though in the spacetime causal graph there’s a (rather minimal) black-hole-like event horizon, things are considerably more complicated in the multiway (i.e. quantum) case, notably with quantum entanglements in the branchial graph apparently spanning the event horizon.

There is much more to study here. But it’s already clear that there are some complicated relationships between ordinary causal connectivity and multiway (i.e. quantum) causal connectivity. Just as in ordinary causal graphs there can be event horizons that limit the spatial effects of events, so in multiway causal graphs there can be event horizons that limit the branchial effects of events. Or, said another way, there should be “entanglement event horizons” that limit the causal relationships between quantum states.

One way to view causal effects in spacetime is to say that a given event can affect events in its future light cone. In the space of quantum states the causal effect of an event is similarly limited, but now to an entanglement cone rather than a light cone. And just as we looked at causal connection graphs on the basis of light cones, we can do the same thing for entanglement cones.
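To give at least a crude picture of such a cone, here is a hedged sketch (an illustrative assumption, not a construction from the original): highlight the out-component of a single event in the multiway causal graph for the black-hole-like rule above, i.e. everything that event can causally affect across both spacelike and branchlike directions (the seed event is just taken to be the first vertex in the vertex list):

(* highlight the "cone" (out-component) of one event in the multiway causal graph *)
With[{mcg = 
   Graph[ResourceFunction["MultiwaySystem"][
     "WolframModel" -> {{{1, 2}, {2, 3}} -> {{4, 1}, {4, 3}, {1, 2}}}, 
     {{{0, 0}, {0, 0}}}, 5, "CausalGraphStructure"]]}, 
 With[{v = First[VertexList[mcg]]}, 
  HighlightGraph[mcg, {Style[Subgraph[mcg, VertexOutComponent[mcg, v]], Red, Thick]}]]]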

The result will hopefully be an elucidation of the quantum character of event horizons. But that will have to await another bulletin…

Exploring Rulial Space: The Case of Turing Machines


Wolfram Physics Bulletin

Informal updates and commentary on progress in the Wolfram Physics Project

Generalized Physics and the Theory of Computation

Let’s say we find a rule that reproduces physics. A big question would then be: “Why this rule, and not another?” I think there’s a very elegant potential answer to this question, that uses what we’re calling rule space relativity—and that essentially says that there isn’t just one rule: actually all possible rules are being used, but we’re basically picking a reference frame that makes us attribute what we see to some particular rule. In other words, our description of the universe is in a sense of our own making, and there can be many other—potentially utterly incoherent—descriptions, etc.

But so how does this work at a more formal level? This bulletin is going to explore one very simple case. And in doing so we’ll discover that what we’re exploring is potentially relevant not only for questions of “generalized physics”, but also for fundamental questions in the theory of computation. In essence, what we’ll be doing is to study the structure of spaces created by applying all possible rules, potentially, for example, allowing us to “geometrize” spaces of possible algorithms and their applications.

In our models of physics, we begin by considering spatial hypergraphs that describe relations between “atoms of space”. Then, looking at all possible ways a given rule can update these hypergraphs, we form what we call a multiway graph. The transversals of this graph define what we call branchial space, in which we can see the pattern of entanglements between quantum states.

But there’s also a third level we can consider. Instead of just forming a multiway graph in which we do all possible updates with a given rule, we form a rulial multiway (or “ultramultiway”) graph in which we follow not only all possible updates, but also all possible rules. The transversals to this rulial multiway graph define what we call rulial space. Causal invariance in the rulial multiway graph then implies “rule space relativity” which is what allows us to use different possible reference frames to describe the universe.

In the end, the Principle of Computational Equivalence implies a certain invariance to the limiting structure of the rulial multiway graph, independent of the particular parametrization of the set of possible rules. But for the sake of understanding rulial space and the rulial multiway graph—and getting intuition about these—this bulletin is going to look at a specific set of possible rules, defined by simple Turing machines.

The rules we’ll use aren’t a good fit for describing our universe. But the comparative simplicity of their structure will help us in trying to elucidate some of the complexities of rulial space. In addition, using Turing machines will make it easier for us to make contact with the theory of computation, where Turing machines are standard models.

Turing Machines

Here’s a representation of the rule for a particular 2-state (s = 2), 2-color (k = 2) Turing machine:

RulePlot
&#10005
RulePlot[TuringMachine[2506]]

The pointer represents the “head” of the Turing machine and its orientation represents the state of the head. Here’s what this particular Turing machine does over the course of a few steps starting from a “blank tape” (i.e. all squares white):

RulePlot
&#10005
RulePlot[TuringMachine[2506], {{1, 10}, Table[0, 21]}, 20, 
 Mesh -> All, Frame -> None]

There are (2 s k)^(s k) possible s-state, k-color Turing machines, which means 4096 machines for s = 2, k = 2. Here are all the distinct behaviors that occur (up to left-right reflection) in these 4096 machines, starting from a blank tape:

allrules = With
&#10005
allrules = 
  With[{s = 2, k = 2}, 
   Table[Flatten[
     MapIndexed[{1, -1} #2 + {0, 
          k} -> {1, 1, 2} Mod[
           Quotient[#1, {2 k, 2, 1}], {s, k, 2}] + {1, 0, -1} &, 
      Partition[IntegerDigits[n, 2 s k, s k], k], {2}]], {n, 0, 
     4095}]];
o0[{s1_, k1_} -> {s2_, k2_, o_}] := {s1, k1} -> {s2, k2, -o};
minrule[r_] := First[Sort[Sort /@ {r, o0 /@ r}]];
Keys[ReverseSort[
  Counts[RulePlot[TuringMachine[#], {1, {{}, 0}}, 9, 
      ImageSize -> {Automatic, 45}, FrameStyle -> LightGray] & /@ 
    Union[minrule /@ allrules]]]]
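
(As a trivial check of the (2 s k)^(s k) count stated above, not present in the original, one can evaluate the formula directly:)

(* number of s-state, k-color Turing machine rules: (2 s k)^(s k) *)
With[{s = 2, k = 2}, (2 s k)^(s k)]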

Note that all these are deterministic Turing machines, in the sense that, for a given machine, the same rule is applied at every step. But what about non-deterministic Turing machines? The idea is to allow any of several different rules to be applied at each step.

As a simple example, consider the pair of rules:

RulePlot
&#10005
RulePlot[TuringMachine[#]] & /@ {2506, 3506}

To show the possible evolution histories, we can construct a multiway system:

With
&#10005
With[{t = 3}, 
 Show[ResourceFunction["MultiwayTuringMachine"][{2506, 
     3506}, {{1, t + 1, 0}, Table[0, 2 t + 1]}, t, "StatesGraph", 
    "IncludeEventInstances" -> True, 
    VertexSize -> 1.1 {1, 1/(2 t + 1)}]] /. 
  Arrowheads[Medium] -> Arrowheads[0.025]]

Each arrow represents an application of one of the two possible rules. A non-deterministic Turing machine is usually considered to follow a particular path, corresponding to a particular evolution history. Then a typical question is whether there exists some path that leads to a particular final state (which might be viewed as the solution to some particular problem). (In a quantum Turing machine, one considers collections of multiple states, viewed as being in a quantum superposition.)
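Here is a small hedged sketch of that reachability question (illustrative only: the source and target are just taken to be the first and last vertices of the graph, not a meaningful computational problem). Whether some evolution path of the non-deterministic machine connects one state to another is simply a path-existence question in the multiway states graph:

(* does some path of the non-deterministic machine lead from one state to another? *)
With[{g = ResourceFunction["MultiwayTuringMachine"][{2506, 3506}, 
    {{1, 4, 0}, Table[0, 7]}, 3, "StatesGraph"]}, 
 With[{source = First[VertexList[g]], target = Last[VertexList[g]]}, 
  FindShortestPath[g, source, target] =!= {}]]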

In what we’re going to be doing here, we want to study the “extreme case” of non-determinism—where at every step, every possible Turing machine rule (at least with a given s, k) is applied. Then we’ll be interested in the full rulial multiway graph that’s created.

The Rulial Multiway Graph

Start with a blank tape. Then apply all possible 2,2 Turing machine rules. This is the rulial multiway graph that’s formed after 1 step:

With[{t = 1}
&#10005
CloudGet[CloudObject[
   "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]];
With[{t = 1}, 
 ResourceFunction["MultiwayTuringMachine"][
  AllDeltaTMRules[{2, 2}], {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]},
   t, "StatesGraph", "IncludeEventInstances" -> False, 
  VertexSize -> .25 {1, 1/(2 t + 1)}]]

Each individual arrow here represents many possible rules from all the 4096 discussed above. Here’s what happens after 2 steps:

Graph
&#10005
CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; Graph[
 With[{t = 2}, 
  ResourceFunction["MultiwayTuringMachine"][
   AllDeltaTMRules[{2, 2}], {{1, t + 1, 0}, Table[0, 2 t + 1]}, t, 
   "StatesGraph", "IncludeEventInstances" -> False, 
   PerformanceGoal -> "Quality", VertexSize -> .4 {1.1, .3}]], 
 EdgeStyle -> 
  Directive[Arrowheads[0.015], 
   ResourceFunction["WolframPhysicsProjectStyleData"]["StatesGraph"][
    "EdgeStyle"], Dashing[None], AbsoluteThickness[1]]]

In layered form, this becomes:

LayeredGraphPlot
&#10005
CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; LayeredGraphPlot[
 With[{t = 2}, 
  ResourceFunction["MultiwayTuringMachine"][
   AllDeltaTMRules[{2, 2}], {{1, t + 1, 0}, Table[0, 2 t + 1]}, t, 
   "StatesGraph", "IncludeEventInstances" -> False, 
   PerformanceGoal -> "Quality", VertexSize -> {1, .3}]], 
 EdgeStyle -> 
  Directive[Arrowheads[0.01], Hue[0.75, 0, 0.35], Dashing[None], 
   AbsoluteThickness[1]], AspectRatio -> 1/4]

In creating these rulial multiway graphs, there’s an important simplification we’re able to make. We don’t need to separately apply all 4096 2,2 Turing machine rules to each state at each step—because all that ever matters is the one case of the rule that’s relevant to the particular state one has. There are 4 possible individual cases—and for each of these there are 8 possible outcomes, leading to 32 “micro-rules”:

GraphicsGrid
&#10005
CloudGet[CloudObject[
   "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.wl"]];
tmg0[{s_, a_} -> {sp_, ap_, dir_}, stot_, k_] := 
  With[{cf = 
     Apply[Function, {Piecewise[
        Table[{Blend[{RGBColor[0.977, 0.952, 0.], 
             RGBColor[0.965, 0.401, 0.18]}, 
            If[k <= 2, 1/2, i/(k - 2)]], # > i}, {i, k - 2, 0, -1}], 
        White]}]}, 
   Graphics[{{White, Rectangle[Scaled[{0, 0}], Scaled[{1, 1}]]}, 
     {Directive[EdgeForm[Directive[GrayLevel[.6], AbsoluteThickness[0.7]]], 
       cf[a]], Rectangle[{1, 0}, {2, 1}]}, 
     {Directive[EdgeForm[Directive[GrayLevel[.6], AbsoluteThickness[0.7]]], 
       cf[ap]], Rectangle[{1, -5/4}, {2, -1/4}]}, 
     {NKSSpecialFunctions`RulePlot`Dump`TuringMarker[{1.5, 0.5}, {s, stot}], 
      NKSSpecialFunctions`RulePlot`Dump`TuringMarker[{1.5 + dir, -3/4}, {sp, stot}]}}, 
    PlotRange -> {{-1/2, 3 + 1/2}, {-7/4, 3/2}}]];
ElementaryTuringGraphic[rules_List, s_Integer : 2, k_Integer : 2, opts___] := 
  Table[NKSSpecialFunctions`RulePlot`Dump`AtlasGraphicsGrid[{{tmg0[r, s, k]}}, 
    opts, Frame -> True], {r, rules}];
GraphicsGrid[
 With[{s = 2, k = 2}, 
  Partition[
   ElementaryTuringGraphic[
    Flatten[Table[DeltaTMRule[{si, ki}, {s, k}], {si, s}, {ki, 0, k - 1}]], 
    ImageSize -> 50], 8]]]

After 3 steps, the rulial multiway graph has the form:

With[{t = 3}
&#10005
CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; With[{t = 3}, 
 ResourceFunction["MultiwayTuringMachine"][
  AllDeltaTMRules[{2, 2}], {{1, t + 1, 0}, Table[0, 2 t + 1]}, t, 
  "StatesGraphStructure", VertexSize -> 1]]

In 3D this becomes:

Graph3D
&#10005
CloudGet[CloudObject[
   "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]];
Graph3D[With[{t = 3}, 
  ResourceFunction["MultiwayTuringMachine"][
   AllDeltaTMRules[{2, 2}], {{1, t + 1, 0}, Table[0, 2 t + 1]}, t, 
   "StatesGraphStructure", VertexSize -> 1, 
   EdgeStyle -> Directive[Hue[0.62, 0.05, 0.55], Opacity[.6]]]]]

In layered form it is:

LayeredGraphPlot
&#10005
CloudGet[CloudObject[
   "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]];
LayeredGraphPlot[
 With[{t = 3}, 
  ResourceFunction["MultiwayTuringMachine"][
   AllDeltaTMRules[{2, 2}], {{1, t + 1, 0}, Table[0, 2 t + 1]}, t, 
   "StatesGraphStructure"]], AspectRatio -> 1/4]

After 5 steps, the rulial multiway graph is:

With
&#10005
CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; With[{t = 5}, 
 ResourceFunction["MultiwayTuringMachine"][
  AllDeltaTMRules[{2, 2}], {{1, t + 1, 0}, Table[0, 2 t + 1]}, t, 
  "StatesGraphStructure", VertexSize -> 1]]

What about other numbers of states and colors? Here are the rulial multiway graphs after 3 steps for Turing machines with various values of s and k:

ParallelMap
&#10005
CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; ParallelMap[
 Labeled[With[{t = 3}, 
    ResourceFunction["MultiwayTuringMachine"][
     AllDeltaTMRules[#], {{1, t + 1, 0}, Table[0, 2 t + 1]}, t, 
     "StatesGraphStructure"]], Text[#]] &, {{1, 2}, {1, 3}, {1, 
   4}, {2, 2}, {2, 3}, {2, 4}, {3, 2}, {3, 3}, {3, 4}, {4, 2}, {4, 
   3}, {4, 4}}]

Notice the presence of s = 1 examples. Even with a single possible state, it is already possible to form a nontrivial rulial multiway system. Here are the results for k = 2 after 1 step:

With
&#10005
CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; With[{t = 1}, 
 ResourceFunction["MultiwayTuringMachine"][
  AllDeltaTMRules[{1, 2}], {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]},
   t, "StatesGraph", "IncludeEventInstances" -> False, 
  VertexSize -> .4 {1, 1/(2 t + 1)}]]

2 steps:

Graph
&#10005
CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; Graph[
 With[{t = 2}, 
  ResourceFunction["MultiwayTuringMachine"][
   AllDeltaTMRules[{1, 2}], {{1, t + 1, 0}, 
    ConstantArray[0, 2 t + 1]}, t, "StatesGraph", 
   "IncludeEventInstances" -> False, PerformanceGoal -> "Quality", 
   VertexSize -> .5 {1.1, .3}]], 
 EdgeStyle -> 
  Directive[Arrowheads[0.02], 
   ResourceFunction["WolframPhysicsProjectStyleData"]["StatesGraph"][
    "EdgeStyle"], Dashing[None], AbsoluteThickness[1]]]

6 steps:

With
&#10005
CloudGet[CloudObject[
   "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]];
With[{t = 6}, 
 ResourceFunction["MultiwayTuringMachine"][
  AllDeltaTMRules[{1, 2}], {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]},
   t, "StatesGraphStructure"]]

In principle one can also consider the even simpler case of s = k = 1. After 1 step one gets:

With
&#10005
CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; With[{t = 1}, 
 ResourceFunction["MultiwayTuringMachine"][
  AllDeltaTMRules[{1, 1}], {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]},
   t, "StatesGraph", "IncludeEventInstances" -> False, 
  PerformanceGoal -> "Quality", VertexSize -> .7 {1, 1/(2 t + 1)}]]

After 2 steps this becomes:

Graph
&#10005
CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; Graph[
 With[{t = 2}, 
  ResourceFunction["MultiwayTuringMachine"][
   AllDeltaTMRules[{1, 1}], {{1, t + 1, 0}, 
    ConstantArray[0, 2 t + 1]}, t, "StatesGraph", 
   "IncludeEventInstances" -> False, PerformanceGoal -> "Quality", 
   VertexSize -> .5 {1.1, .3}]], 
 EdgeStyle -> 
  Directive[Arrowheads[0.02], 
   ResourceFunction["WolframPhysicsProjectStyleData"]["StatesGraph"][
    "EdgeStyle"], Dashing[None], AbsoluteThickness[1]]]

And after 3 steps:

Graph
&#10005
CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; Graph[
 With[{t = 3}, 
  ResourceFunction["MultiwayTuringMachine"][
   AllDeltaTMRules[{1, 1}], {{1, t + 1, 0}, 
    ConstantArray[0, 2 t + 1]}, t, "StatesGraph", 
   "IncludeEventInstances" -> False, PerformanceGoal -> "Quality", 
   VertexSize -> .5 {1, .3}]], 
 EdgeStyle -> 
  Directive[Arrowheads[0.015], 
   ResourceFunction["WolframPhysicsProjectStyleData"]["StatesGraph"][
    "EdgeStyle"], Dashing[None], AbsoluteThickness[1]]]

As a layered graph rendering, this becomes:

LayeredGraphPlot
&#10005
CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; LayeredGraphPlot[
 With[{t = 4}, 
  ResourceFunction["MultiwayTuringMachine"][
   AllDeltaTMRules[{1, 1}], {{1, t + 1, 0}, Table[0, 2 t + 1]}, t, 
   "StatesGraphStructure"]], AspectRatio -> 1/2]

With s = 2, k = 1 one gets after 1 step:

With
&#10005
CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; With[{t = 1}, 
 ResourceFunction["MultiwayTuringMachine"][
  AllDeltaTMRules[{2, 1}], {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]},
   t, "StatesGraph", "IncludeEventInstances" -> False, 
  PerformanceGoal -> "Quality", VertexSize -> .6 {1, 1/(2 t + 1)}]]

After 2 steps this becomes:

Graph
&#10005
CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; Graph[
 With[{t = 2}, 
  ResourceFunction["MultiwayTuringMachine"][
   AllDeltaTMRules[{2, 1}], {{1, t + 1, 0}, 
    ConstantArray[0, 2 t + 1]}, t, "StatesGraph", 
   "IncludeEventInstances" -> False, PerformanceGoal -> "Quality", 
   VertexSize -> .5 {1, .27}]], 
 EdgeStyle -> 
  Directive[Arrowheads[0.02], 
   ResourceFunction["WolframPhysicsProjectStyleData"]["StatesGraph"][
    "EdgeStyle"], Dashing[None], AbsoluteThickness[1]]]

And after 3 steps:

Graph
&#10005
CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; Graph[
 With[{t = 3}, 
  ResourceFunction["MultiwayTuringMachine"][
   AllDeltaTMRules[{2, 1}], {{1, t + 1, 0}, 
    ConstantArray[0, 2 t + 1]}, t, "StatesGraph", 
   "IncludeEventInstances" -> False, PerformanceGoal -> "Quality", 
   VertexSize -> .5 {1.15, .3}]], 
 EdgeStyle -> 
  Directive[Arrowheads[0.015], 
   ResourceFunction["WolframPhysicsProjectStyleData"]["StatesGraph"][
    "EdgeStyle"], Dashing[None], AbsoluteThickness[1]]]

After 6 steps this becomes:

Graph
&#10005
CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; Graph[
 With[{t = 6}, 
  ResourceFunction["MultiwayTuringMachine"][
   AllDeltaTMRules[{2, 1}], {{1, t + 1, 0}, 
    ConstantArray[0, 2 t + 1]}, t, "StatesGraphStructure", 
   "IncludeEventInstances" -> False, PerformanceGoal -> "Quality"]], 
 EdgeStyle -> 
  Directive[Arrowheads[0.013], 
   ResourceFunction["WolframPhysicsProjectStyleData"]["StatesGraph"][
    "EdgeStyle"], Dashing[None], AbsoluteThickness[1]]]

At least in the central part of this graph, there are edges going in both directions. Combining these, and treating the graph as undirected, one gets:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; SimpleGraph[
 UndirectedGraph[
  With[{t = 6}, 
   ResourceFunction["MultiwayTuringMachine"][
    AllDeltaTMRules[{2, 1}], {{1, t + 1, 0}, 
     ConstantArray[0, 2 t + 1]}, t, "StatesGraphStructure", 
    "IncludeEventInstances" -> False, PerformanceGoal -> "Quality"]]]]

With 3 states (s = 3) there are 3 tracks:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; SimpleGraph[
 UndirectedGraph[
  With[{t = 6}, 
   ResourceFunction["MultiwayTuringMachine"][
    AllDeltaTMRules[{3, 1}], {{1, t + 1, 0}, 
     ConstantArray[0, 2 t + 1]}, t, "StatesGraphStructure", 
    "IncludeEventInstances" -> False, PerformanceGoal -> "Quality"]]]]

One can also consider Turing machines in which the head can not only move left or right, but can also stay still. In this case, with s = 2, k = 1 one gets after 1 step:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; With[{t = 1}, 
 ResourceFunction["MultiwayTuringMachine"][
  AllDeltaTMRules[{2, 1}, {-1, 0, 1}], {{1, t + 1, 0}, 
   ConstantArray[0, 2 t + 1]}, t, "StatesGraph", 
  "IncludeEventInstances" -> False, PerformanceGoal -> "Quality", 
  VertexSize -> .6 {1, 1/(2 t + 1)}]]

After 2 steps this becomes:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; Graph[
 With[{t = 2}, 
  ResourceFunction["MultiwayTuringMachine"][
   AllDeltaTMRules[{2, 1}, {-1, 0, 1}], {{1, t + 1, 0}, 
    ConstantArray[0, 2 t + 1]}, t, "StatesGraph", 
   "IncludeEventInstances" -> False, PerformanceGoal -> "Quality", 
   VertexSize -> .5 {1.15, .3}]], 
 EdgeStyle -> 
  Directive[Arrowheads[0.015], 
   ResourceFunction["WolframPhysicsProjectStyleData"]["StatesGraph"][
    "EdgeStyle"], Dashing[None], AbsoluteThickness[1]]]

After 4 steps, removing repeated edges, etc., this gives:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; SimpleGraph[
 UndirectedGraph[
  With[{t = 4}, 
   ResourceFunction["MultiwayTuringMachine"][
    AllDeltaTMRules[{2, 1}, {-1, 0, 1}], {{1, t + 1, 0}, 
     ConstantArray[0, 2 t + 1]}, t, "StatesGraphStructure", 
    "IncludeEventInstances" -> False, PerformanceGoal -> "Quality"]]]]

As another generalization, one can consider Turing machines whose head can move not just one square, but up to two squares in either direction (or stay still):

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; Graph[
 With[{t = 1}, 
  ResourceFunction["MultiwayTuringMachine"][
   AllDeltaTMRules[{2, 1}, {-2, -1, 0, 1, 2}], {{1, 2 t + 1, 0}, 
    ConstantArray[0, 4 t + 1]}, t, "StatesGraph", 
   "IncludeEventInstances" -> False, PerformanceGoal -> "Quality", 
   VertexSize -> .3 {1.15, .3}]], 
 EdgeStyle -> 
  Directive[Arrowheads[0.015], 
   ResourceFunction["WolframPhysicsProjectStyleData"]["StatesGraph"][
    "EdgeStyle"], Dashing[None], AbsoluteThickness[1]]]

After 2 steps one gets:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; With[{t = 2}, 
 ResourceFunction["MultiwayTuringMachine"][
  AllDeltaTMRules[{2, 1}, {-2, -1, 0, 1, 2}], {{1, 2 t + 1, 0}, 
   ConstantArray[0, 4 t + 1]}, t, "StatesGraph", 
  "IncludeEventInstances" -> False, PerformanceGoal -> "Quality", 
  VertexSize -> .6 {1, 1/(2 t + 1)}]]

After 3 steps, removing repeated edges, etc., this gives:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; SimpleGraph[
 UndirectedGraph[
  With[{t = 3}, 
   ResourceFunction["MultiwayTuringMachine"][
    AllDeltaTMRules[{2, 1}, {-2, -1, 0, 1, 2}], {{1, 2 t + 1, 0}, 
     ConstantArray[0, 4 t + 1]}, t, "StatesGraphStructure", 
    "IncludeEventInstances" -> False, PerformanceGoal -> "Quality"]]]]

The Limit of the Rulial Multiway Graph

What is the limiting structure of the rulial multiway graph after an infinite number of steps? The first crucial observation is that it’s in a sense homogeneous: the structure of the graph around any given node is always the same (i.e. it’s a vertex-transitive graph). To see why this is true, recall that each node in the graph corresponds to a particular distinct configuration of the Turing machine. This node will lead to all nodes that can be obtained from it by one step of Turing machine evolution. But (assuming the head always moves ±1 square) there are always exactly 2 s k of these. And because we are following all possible “micro-rules”, we can think of ourselves as just “blindly overwriting” whatever configuration we had, so that the structure of the graph is the same independent of what configuration we’re at.

As a simple example, here’s what happens in the s = 2, k = 1 case, after 3 steps:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; With[{t = 3}, 
 ResourceFunction["MultiwayTuringMachine"][
  AllDeltaTMRules[{2, 1}], {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]},
   t, "StatesGraph", "IncludeEventInstances" -> False, 
  PerformanceGoal -> "Quality", VertexSize -> .9 {1, 1/(2 t + 1)}]]

There’s some trickiness at the ends, but in the central region we see, as expected, that each node has exactly 4 successors. If we pick out the subgraph around the blank-tape starting node, it has the form:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; With[{g = 
   With[{t = 3}, 
    ResourceFunction["MultiwayTuringMachine"][
     AllDeltaTMRules[{2, 1}], {{1, t + 1, 0}, 
      ConstantArray[0, 2 t + 1]}, t, "StatesGraph", 
     "IncludeEventInstances" -> False, PerformanceGoal -> "Quality", 
     VertexSize -> .8 {1, 1/(2 t + 1)}]]}, 
 NeighborhoodGraph[g, First[VertexList[g]], 1, 
  VertexCoordinates -> None]]
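
One way to verify this degree count across the whole graph is to tally out-degrees directly. Here is a quick check for the same s = 2, k = 1 setup after 4 steps, reusing the DeltaTM.wl definitions loaded above (nodes first reached at the final step simply have no successors generated yet, so they show up with out-degree 0):

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.wl"]]; 
(* tally of out-degrees; interior nodes should all have out-degree 2 s k = 4 *)
With[{t = 4}, 
 Counts[VertexOutDegree[
   ResourceFunction["MultiwayTuringMachine"][
    AllDeltaTMRules[{2, 1}], {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]}, 
    t, "StatesGraphStructure", "IncludeEventInstances" -> False]]]]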

In general, the limiting neighborhood of every node is just:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; Graph[
 CanonicalGraph[
  With[{g = 
     With[{t = 3}, 
      ResourceFunction["MultiwayTuringMachine"][
       AllDeltaTMRules[{2, 1}], {{1, t + 1, 0}, 
        ConstantArray[0, 2 t + 1]}, t, "StatesGraph", 
       "IncludeEventInstances" -> False, PerformanceGoal -> "Quality",
        VertexSize -> .8 {1, 1/(2 t + 1)}]]}, 
   NeighborhoodGraph[g, First[VertexList[g]], 1]]], 
 EdgeStyle -> 
  ResourceFunction["WolframPhysicsProjectStyleData"]["StatesGraph"][
   "EdgeStyle"], 
 VertexStyle -> 
  ResourceFunction["WolframPhysicsProjectStyleData"]["StatesGraph"][
   "VertexStyle"]]

In the full graph, these neighborhoods are knitted together, giving pieces like:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; Graph[
 CanonicalGraph[
  With[{g = 
     With[{t = 8}, 
      ResourceFunction["MultiwayTuringMachine"][
       AllDeltaTMRules[{2, 1}], {{1, t + 1, 0}, 
        ConstantArray[0, 2 t + 1]}, t, "StatesGraphStructure", 
       PerformanceGoal -> "Quality"]]}, 
   NeighborhoodGraph[g, First[VertexList[g]], 3]]], 
 EdgeStyle -> 
  ResourceFunction["WolframPhysicsProjectStyleData"]["StatesGraph"][
   "EdgeStyle"], 
 VertexStyle -> 
  ResourceFunction["WolframPhysicsProjectStyleData"]["StatesGraph"][
   "VertexStyle"]]

Let’s now consider the case s = 1, k = 2. After 4 steps the full graph is:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; With[{t = 4}, 
 ResourceFunction["MultiwayTuringMachine"][
  AllDeltaTMRules[{1, 2}], {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]},
   t, "StatesGraph", "IncludeEventInstances" -> False, 
  PerformanceGoal -> "Quality", VertexSize -> .45 {1.15, .3}]]

Here the neighborhood of the start node is:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; With[{g = 
   With[{t = 4}, 
    ResourceFunction["MultiwayTuringMachine"][
     AllDeltaTMRules[{1, 2}], {{1, t + 1, 0}, 
      ConstantArray[0, 2 t + 1]}, t, "StatesGraph", 
     "IncludeEventInstances" -> False, PerformanceGoal -> "Quality", 
     VertexSize -> 1 {1, 1/(2 t + 1)}]]}, 
 NeighborhoodGraph[g, First[VertexList[g]], 1, 
  VertexCoordinates -> None]]

And in the limiting graph, every node will have a local neighborhood with this same structure. If we look at successively larger neighborhoods, the limiting form of these for every node will be:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; With[{g = 
   With[{t = 8}, 
    ResourceFunction["MultiwayTuringMachine"][
     AllDeltaTMRules[{1, 2}], {{1, t + 1, 0}, 
      ConstantArray[0, 2 t + 1]}, t, "StatesGraphStructure"]]}, 
 Table[NeighborhoodGraph[g, First[VertexList[g]], i, 
   VertexCoordinates -> None], {i, 4}]]

Here’s a table of the number of distinct nodes in these successively larger neighborhoods for Turing machines with various values of s and k:

data = {{1, 
   1, {1, 3, 5, 7, 9, 11, 13, 15, 17, 19, 21, 23, 25, 27, 29, 
    31}}, {1, 
   2, {1, 5, 18, 50, 124, 288, 640, 1384, 2928, 6112, 12608, 25824, 
    52544, 106496, 215040, 433280}}, {1, 
   3, {1, 7, 39, 153, 543, 1809, 5787, 18117, 55755, 170181, 515727, 
    1557873}}, {2, 
   1, {1, 5, 10, 14, 18, 22, 26, 30, 34, 38, 42, 46, 50, 54, 58, 
    62}}, {2, 
   2, {1, 9, 36, 100, 248, 576, 1280, 2768, 5856, 12224, 25216, 51648,
     105088, 212992, 430080}}, {2, 
   3, {1, 13, 78, 306, 1086, 3618, 11574, 36234, 111510, 340362}}, {3,
    1, {1, 7, 15, 21, 27, 33, 39, 45, 51, 57, 63, 69, 75, 81, 87, 
    93}}, {3, 
   2, {1, 13, 54, 150, 372, 864, 1920, 4152, 8784, 18336, 37824, 
    77472, 157632, 319488}}, {3, 
   3, {1, 19, 117, 459, 1629, 5427, 17361, 54351, 167265}}, {4, 
   1, {1, 9, 20, 28, 36, 44, 52, 60, 68, 76, 84, 92, 100, 108, 116, 
    124}}, {4, 
   2, {1, 17, 72, 200, 496, 1152, 2560, 5536, 11712, 24448, 50432, 
    103296}}}; Grid[
 Prepend[Map[
   Style[#] &, {#1, #2, 
      Row[Append[#3, "\[Ellipsis]"], ", "]} & @@@ 
    SortBy[data, #[[2]] &], {2}], {ToString[HoldForm[s], 
    TraditionalForm], ToString[HoldForm[k], TraditionalForm], ""}], 
 Background -> {None, {1 -> GrayLevel[0.94]}}, 
 Dividers -> {Thread[{1, 4} -> Directive[Thick, GrayLevel[0.7]]], 
   Thread[{1, 2, 6, 10, 13} -> Directive[Thick, GrayLevel[0.7]]]}, 
 Frame -> All, FrameStyle -> GrayLevel[0.7], Alignment -> Left]

In the limit t → ∞, the number of nodes reached grows without bound in all cases: linearly in t when k = 1, and exponentially (with successive ratios approaching k) when k ≥ 2.

For k = 1, one finds (for t ≥ 2) exactly s (2 t + 1) nodes out to distance t.

For t = 1 one has 2 s k + 1 nodes in every case.
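
As a quick consistency check, these can be tested against the data list defined for the table above (this sketch assumes that table cell has been evaluated, so that data is available):

(* for k = 1 the counts should match s (2 t + 1) from t = 2 onward *)
And @@ Table[
  With[{s = entry[[1]], counts = entry[[3]]}, 
   counts[[3 ;;]] == Table[s (2 t + 1), {t, 2, Length[counts] - 1}]], 
  {entry, Select[data, #[[2]] == 1 &]}]

(* for k >= 2 the ratio of successive counts approaches k *)
Map[{#[[1 ;; 2]], N[#[[3, -1]]/#[[3, -2]]]} &, Select[data, #[[2]] > 1 &]]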

If one ignores directedness in the graph, and just counts the total number of neighbors out to distance t, the results for k = 1 are the same as before, but in other cases they are different:

datau = {{1, 
    1, {1, 3, 5, 7, 9, 11, 13, 15, 17, 19, 21, 23, 25, 27, 29, 
     31}}, {1, 
    2, {1, 7, 26, 74, 180, 412, 900, 1924, 4028, 8348, 17116, 34908, 
     70780, 118654, 215040, 433280}}, {1, 
    3, {1, 11, 63, 273, 975, 3273, 10443, 32685, 100443, 203825, 
     515727, 1557873}}, {2, 
    1, {1, 5, 10, 14, 18, 22, 26, 30, 34, 38, 42, 46, 50, 54, 58, 
     62}}, {2, 
    2, {1, 13, 52, 148, 360, 824, 1800, 3848, 8056, 16696, 34232, 
     69816, 117244, 212992, 430080}}, {2, 
    3, {1, 21, 126, 546, 1950, 6546, 20886, 43282, 111510, 
     340362}}, {3, 
    1, {1, 7, 15, 21, 27, 33, 39, 45, 51, 57, 63, 69, 75, 81, 87, 
     93}}, {3, 
    2, {1, 19, 78, 222, 540, 1236, 2700, 5772, 12084, 25044, 51348, 
     86490, 157632, 319488}}, {3, 
    3, {1, 31, 189, 819, 2925, 9819, 20757, 54351, 167265}}, {4, 
    1, {1, 9, 20, 28, 36, 44, 52, 60, 68, 76, 84, 92, 100, 108, 116, 
     124}}, {4, 
    2, {1, 25, 104, 296, 720, 1648, 3600, 7696, 16112, 27384, 50432, 
     103296}}};
Grid[Prepend[
  Map[Style[#] &, {#1, #2, 
      Row[Append[Drop[#3, -3], "\[Ellipsis]"], ", "]} & @@@ 
    SortBy[Select[datau, #[[2]] != 1 &], #[[2]] &], {2}], {ToString[
    HoldForm[s], TraditionalForm], 
   ToString[HoldForm[k], TraditionalForm], ""}], 
 Background -> {None, {1 -> GrayLevel[0.94]}}, 
 Dividers -> {Thread[{1, 4} -> Directive[Thick, GrayLevel[0.7]]], 
   Thread[{1, 2, 6, 9} -> Directive[Thick, GrayLevel[0.7]]]}, 
 Frame -> All, FrameStyle -> GrayLevel[0.7], Alignment -> Left]

(These results asymptotically seem larger by a k-dependent constant factor: roughly 1.35 for k = 2 and roughly 1.8 for k = 3 at the largest t shown.)
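
One can compute this ratio directly from the two lists used for the tables (a sketch that assumes both table cells above have been evaluated, so that data and datau are defined); here for s = 1, k = 2:

(* elementwise ratio of undirected to directed neighborhood counts, s = 1, k = 2 *)
With[{d = First[Select[data, #[[1 ;; 2]] == {1, 2} &]][[3]], 
  du = First[Select[datau, #[[1 ;; 2]] == {1, 2} &]][[3]]}, 
 N[Take[du, 10]/Take[d, 10]]]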

So for k = 1, the limiting rulial multiway graph behaves like a 1-dimensional space, but for all k ≥ 2, it behaves like an infinite-dimensional space, in which the volumes of geodesic balls grow exponentially with radius.

Finite Tapes

So far we’ve always assumed that our Turing machine tape is unbounded. But in some ways it’s easier to see what’s going on if we instead limit the tape, so that there is only a finite total number of possible states (n s k^n for a tape of length n). One thing we can do is to consider a cyclic tape.
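
For reference, this count can be spelled out as a tiny helper (tmConfigurationCount is just a name introduced here for illustration, not part of the package used above):

(* n head positions x s head states x k^n tape contents *)
tmConfigurationCount[n_Integer, {s_Integer, k_Integer}] := n s k^n

(* e.g. cyclic tapes of length 1 through 6 with s = 1, k = 2 *)
Table[tmConfigurationCount[n, {1, 2}], {n, 6}]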

For s = k = 1 with successively longer tapes we then get:

Cell[CellGroupData[{
	Cell[BoxData[
 RowBox[{
  RowBox[{"NeighboringConfigurations", "[", 
   RowBox[{
    RowBox[{"{", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{"s0_Integer", ",", "pos0_Integer"}], "}"}], ",", 
      "tape0_List"}], "}"}], ",", 
    RowBox[{"{", 
     RowBox[{"s_Integer", ",", "k_Integer"}], "}"}], ",", 
    RowBox[{"offs_", ":", 
     RowBox[{"{", 
      RowBox[{
       RowBox[{"-", "1"}], ",", "1"}], "}"}]}]}], "]"}], ":=", 
  RowBox[{"Flatten", "[", 
   RowBox[{
    RowBox[{"Table", "[", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
        RowBox[{"{", 
         RowBox[{"s1", ",", 
          RowBox[{"Mod", "[", 
           RowBox[{
            RowBox[{"pos0", "+", "pos1"}], ",", 
            RowBox[{"Length", "[", "tape0", "]"}], ",", "1"}], 
           "]"}]}], "}"}], ",", "tape1"}], "}"}], ",", 
      RowBox[{"{", 
       RowBox[{"s1", ",", "s"}], "}"}], ",", 
      RowBox[{"{", 
       RowBox[{"pos1", ",", "offs"}], "}"}], ",", 
      RowBox[{"{", 
       RowBox[{"tape1", ",", 
        RowBox[{"Table", "[", 
         RowBox[{
          RowBox[{"ReplacePart", "[", 
           RowBox[{"tape0", ",", 
            RowBox[{"pos0", "\[Rule]", "i"}]}], "]"}], ",", 
          RowBox[{"{", 
           RowBox[{"i", ",", "0", ",", 
            RowBox[{"k", "-", "1"}]}], "}"}]}], "]"}]}], "}"}]}], 
     "]"}], ",", "2"}], "]"}]}]], "Input"],

Cell[BoxData[
 RowBox[{
  RowBox[{"TMNCGraphX", "[", 
   RowBox[{"tlen_Integer", ",", 
    RowBox[{"{", 
     RowBox[{"s_Integer", ",", "k_Integer"}], "}"}], ",", 
    RowBox[{"offs_", ":", 
     RowBox[{"{", 
      RowBox[{
       RowBox[{"-", "1"}], ",", "1"}], "}"}]}]}], "]"}], ":=", 
  RowBox[{"Graph", "[", 
   RowBox[{
    RowBox[{"Flatten", "[", 
     RowBox[{"Table", "[", 
      RowBox[{
       RowBox[{
        RowBox[{
         RowBox[{
          RowBox[{"{", 
           RowBox[{
            RowBox[{"{", 
             RowBox[{"s0", ",", "pos0"}], "}"}], ",", "tape0"}], 
           "}"}], "\[DirectedEdge]", "#"}], "&"}], "/@", 
        RowBox[{"NeighboringConfigurations", "[", 
         RowBox[{
          RowBox[{"{", 
           RowBox[{
            RowBox[{"{", 
             RowBox[{"s0", ",", "pos0"}], "}"}], ",", "tape0"}], 
           "}"}], ",", 
          RowBox[{"{", 
           RowBox[{"s", ",", "k"}], "}"}], ",", "offs"}], "]"}]}], 
       ",", 
       RowBox[{"{", 
        RowBox[{"s0", ",", "s"}], "}"}], ",", 
       RowBox[{"{", 
        RowBox[{"pos0", ",", "tlen"}], "}"}], ",", 
       RowBox[{"{", 
        RowBox[{"tape0", ",", 
         RowBox[{"Tuples", "[", 
          RowBox[{
           RowBox[{"Range", "[", 
            RowBox[{"0", ",", 
             RowBox[{"k", "-", "1"}]}], "]"}], ",", "tlen"}], "]"}]}],
         "}"}]}], "]"}], "]"}], ",", 
    RowBox[{"VertexShapeFunction", "\[Rule]", 
     RowBox[{"(", 
      RowBox[{
       RowBox[{"Inset", "[", 
        RowBox[{
         RowBox[{"Framed", "[", 
          RowBox[{
           RowBox[{"RulePlot", "[", 
            RowBox[{
             RowBox[{"TuringMachine", "[", 
              RowBox[{"{", 
               RowBox[{"0", ",", 
                RowBox[{"Max", "[", 
                 RowBox[{"s", ",", "2"}], "]"}], ",", 
                RowBox[{"Max", "[", 
                 RowBox[{"k", ",", "2"}], "]"}]}], "}"}], "]"}], ",", 
             
             RowBox[{"{", 
              RowBox[{
               RowBox[{"{", 
                RowBox[{"1", ",", 
                 RowBox[{"#2", "[", 
                  RowBox[{"[", 
                   RowBox[{"1", ",", "2"}], "]"}], "]"}]}], "}"}], 
               ",", 
               RowBox[{"#2", "[", 
                RowBox[{"[", "2", "]"}], "]"}]}], "}"}], ",", "0", 
             ",", 
             RowBox[{"Mesh", "\[Rule]", "All"}], ",", 
             RowBox[{"Frame", "\[Rule]", "False"}], ",", 
             RowBox[{"ImageSize", "\[Rule]", "40"}]}], "]"}], ",", 
           RowBox[{"Background", "\[Rule]", 
            RowBox[{"Directive", "[", 
             RowBox[{
              RowBox[{"Opacity", "[", "0.2", "]"}], ",", 
              RowBox[{"Hue", "[", 
               RowBox[{"0.62", ",", "0.45", ",", "0.87"}], "]"}]}], 
             "]"}]}], ",", 
           RowBox[{"FrameMargins", "\[Rule]", 
            RowBox[{"{", 
             RowBox[{
              RowBox[{"{", 
               RowBox[{"2", ",", "2"}], "}"}], ",", 
              RowBox[{"{", 
               RowBox[{"0", ",", "0"}], "}"}]}], "}"}]}], ",", 
           RowBox[{"RoundingRadius", "\[Rule]", "0"}], ",", 
           RowBox[{"FrameStyle", "\[Rule]", 
            RowBox[{"Directive", "[", 
             RowBox[{
              RowBox[{"Opacity", "[", "0.5", "]"}], ",", 
              RowBox[{"Hue", "[", 
               RowBox[{"0.62", ",", "0.52", ",", "0.82"}], "]"}]}], 
             "]"}]}]}], "]"}], ",", "#1"}], "]"}], "&"}], ")"}]}], 
    ",", 
    RowBox[{"EdgeStyle", "\[Rule]", 
     RowBox[{
      RowBox[{
       RowBox[{
       "ResourceFunction", "[", 
        "\"\<WolframPhysicsProjectStyleData\>\"", "]"}], "[", 
       "\"\<StatesGraph\>\"", "]"}], "[", "\"\<EdgeStyle\>\"", 
      "]"}]}]}], "]"}]}]], "Input"],

Cell[BoxData[
 RowBox[{"Table", "[", 
  RowBox[{
   RowBox[{"Show", "[", 
    RowBox[{
     RowBox[{"TMNCGraphX", "[", 
      RowBox[{"len", ",", 
       RowBox[{"{", 
        RowBox[{"1", ",", "1"}], "}"}]}], "]"}], ",", 
     RowBox[{"ImageSize", "\[Rule]", "200"}]}], "]"}], ",", 
   RowBox[{"{", 
    RowBox[{"len", ",", "3", ",", "5"}], "}"}]}], "]"}]], "Input"]
}, Open  ]]
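
Since with s = k = 1 there is only one head state and one tape symbol, the only thing that can change is the head position, so each of these graphs is just a cycle of length n. One can confirm this directly (assuming the NeighboringConfigurations and TMNCGraphX definitions from the cell above have been evaluated):

(* the s = k = 1 cyclic-tape graphs are cycles of length n *)
Table[IsomorphicGraphQ[
  SimpleGraph[UndirectedGraph[TMNCGraphX[n, {1, 1}]]], CycleGraph[n]], {n, 3, 5}]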

For s = 2, k = 1 we get:

Cell[CellGroupData[{
Cell[BoxData[
 RowBox[{
  RowBox[{"NeighboringConfigurations", "[", 
   RowBox[{
    RowBox[{"{", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{"s0_Integer", ",", "pos0_Integer"}], "}"}], ",", 
      "tape0_List"}], "}"}], ",", 
    RowBox[{"{", 
     RowBox[{"s_Integer", ",", "k_Integer"}], "}"}], ",", 
    RowBox[{"offs_", ":", 
     RowBox[{"{", 
      RowBox[{
       RowBox[{"-", "1"}], ",", "1"}], "}"}]}]}], "]"}], ":=", 
  RowBox[{"Flatten", "[", 
   RowBox[{
    RowBox[{"Table", "[", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
        RowBox[{"{", 
         RowBox[{"s1", ",", 
          RowBox[{"Mod", "[", 
           RowBox[{
            RowBox[{"pos0", "+", "pos1"}], ",", 
            RowBox[{"Length", "[", "tape0", "]"}], ",", "1"}], 
           "]"}]}], "}"}], ",", "tape1"}], "}"}], ",", 
      RowBox[{"{", 
       RowBox[{"s1", ",", "s"}], "}"}], ",", 
      RowBox[{"{", 
       RowBox[{"pos1", ",", "offs"}], "}"}], ",", 
      RowBox[{"{", 
       RowBox[{"tape1", ",", 
        RowBox[{"Table", "[", 
         RowBox[{
          RowBox[{"ReplacePart", "[", 
           RowBox[{"tape0", ",", 
            RowBox[{"pos0", "\[Rule]", "i"}]}], "]"}], ",", 
          RowBox[{"{", 
           RowBox[{"i", ",", "0", ",", 
            RowBox[{"k", "-", "1"}]}], "}"}]}], "]"}]}], "}"}]}], 
     "]"}], ",", "2"}], "]"}]}]], "Input"],

 Cell[BoxData[
 RowBox[{
  RowBox[{"TMNCGraphX", "[", 
   RowBox[{"tlen_Integer", ",", 
    RowBox[{"{", 
     RowBox[{"s_Integer", ",", "k_Integer"}], "}"}], ",", 
    RowBox[{"offs_", ":", 
     RowBox[{"{", 
      RowBox[{
       RowBox[{"-", "1"}], ",", "1"}], "}"}]}]}], "]"}], ":=", 
  RowBox[{"Graph", "[", 
   RowBox[{
    RowBox[{"Flatten", "[", 
     RowBox[{"Table", "[", 
      RowBox[{
       RowBox[{
        RowBox[{
         RowBox[{
          RowBox[{"{", 
           RowBox[{
            RowBox[{"{", 
             RowBox[{"s0", ",", "pos0"}], "}"}], ",", "tape0"}], 
           "}"}], "\[DirectedEdge]", "#"}], "&"}], "/@", 
        RowBox[{"NeighboringConfigurations", "[", 
         RowBox[{
          RowBox[{"{", 
           RowBox[{
            RowBox[{"{", 
             RowBox[{"s0", ",", "pos0"}], "}"}], ",", "tape0"}], 
           "}"}], ",", 
          RowBox[{"{", 
           RowBox[{"s", ",", "k"}], "}"}], ",", "offs"}], "]"}]}], 
       ",", 
       RowBox[{"{", 
        RowBox[{"s0", ",", "s"}], "}"}], ",", 
       RowBox[{"{", 
        RowBox[{"pos0", ",", "tlen"}], "}"}], ",", 
       RowBox[{"{", 
        RowBox[{"tape0", ",", 
         RowBox[{"Tuples", "[", 
          RowBox[{
           RowBox[{"Range", "[", 
            RowBox[{"0", ",", 
             RowBox[{"k", "-", "1"}]}], "]"}], ",", "tlen"}], "]"}]}],
         "}"}]}], "]"}], "]"}], ",", 
    RowBox[{"VertexShapeFunction", "\[Rule]", 
     RowBox[{"(", 
      RowBox[{
       RowBox[{"Inset", "[", 
        RowBox[{
         RowBox[{"Framed", "[", 
          RowBox[{
           RowBox[{"RulePlot", "[", 
            RowBox[{
             RowBox[{"TuringMachine", "[", 
              RowBox[{"{", 
               RowBox[{"0", ",", 
                RowBox[{"Max", "[", 
                 RowBox[{"s", ",", "2"}], "]"}], ",", 
                RowBox[{"Max", "[", 
                 RowBox[{"k", ",", "2"}], "]"}]}], "}"}], "]"}], ",", 
             
             RowBox[{"{", 
              RowBox[{
               RowBox[{"{", 
                RowBox[{"1", ",", 
                 RowBox[{"#2", "[", 
                  RowBox[{"[", 
                   RowBox[{"1", ",", "2"}], "]"}], "]"}]}], "}"}], 
               ",", 
               RowBox[{"#2", "[", 
                RowBox[{"[", "2", "]"}], "]"}]}], "}"}], ",", "0", 
             ",", 
             RowBox[{"Mesh", "\[Rule]", "All"}], ",", 
             RowBox[{"Frame", "\[Rule]", "False"}], ",", 
             RowBox[{"ImageSize", "\[Rule]", "40"}]}], "]"}], ",", 
           RowBox[{"Background", "\[Rule]", 
            RowBox[{"Directive", "[", 
             RowBox[{
              RowBox[{"Opacity", "[", "0.2", "]"}], ",", 
              RowBox[{"Hue", "[", 
               RowBox[{"0.62", ",", "0.45", ",", "0.87"}], "]"}]}], 
             "]"}]}], ",", 
           RowBox[{"FrameMargins", "->", 
            RowBox[{"{", 
             RowBox[{
              RowBox[{"{", 
               RowBox[{"2", ",", "2"}], "}"}], ",", 
              RowBox[{"{", 
               RowBox[{"0", ",", "0"}], "}"}]}], "}"}]}], ",", 
           RowBox[{"RoundingRadius", "\[Rule]", "0"}], ",", 
           RowBox[{"FrameStyle", "\[Rule]", 
            RowBox[{"Directive", "[", 
             RowBox[{
              RowBox[{"Opacity", "[", "0.5", "]"}], ",", 
              RowBox[{"Hue", "[", 
               RowBox[{"0.62", ",", "0.52", ",", "0.82"}], "]"}]}], 
             "]"}]}]}], "]"}], ",", "#1"}], "]"}], "&"}], ")"}]}], 
    ",", 
    RowBox[{"EdgeStyle", "\[Rule]", 
     RowBox[{"Directive", "[", 
      RowBox[{
       RowBox[{"Arrowheads", "[", "0.06", "]"}], ",", 
       RowBox[{"Hue", "[", 
        RowBox[{"0.75", ",", "0", ",", "0.35"}], "]"}], ",", 
       RowBox[{"Dashing", "[", "None", "]"}], ",", 
       RowBox[{"AbsoluteThickness", "[", "1", "]"}]}], "]"}]}]}], 
   "]"}]}]], "Input"],

Cell[BoxData[
 RowBox[{"Table", "[", 
  RowBox[{
   RowBox[{"Show", "[", 
    RowBox[{
     RowBox[{"TMNCGraphX", "[", 
      RowBox[{"len", ",", 
       RowBox[{"{", 
        RowBox[{"2", ",", "1"}], "}"}]}], "]"}], ",", 
     RowBox[{"ImageSize", "\[Rule]", "200"}]}], "]"}], ",", 
   RowBox[{"{", 
    RowBox[{"len", ",", "3", ",", "5"}], "}"}]}], "]"}]], "Input"]
 }, Open  ]]

For tape size 10, the results for successive values of s are:

Cell[CellGroupData[{
Cell[BoxData[
 RowBox[{
  RowBox[{"NeighboringConfigurations", "[", 
   RowBox[{
    RowBox[{"{", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{"s0_Integer", ",", "pos0_Integer"}], "}"}], ",", 
      "tape0_List"}], "}"}], ",", 
    RowBox[{"{", 
     RowBox[{"s_Integer", ",", "k_Integer"}], "}"}], ",", 
    RowBox[{"offs_", ":", 
     RowBox[{"{", 
      RowBox[{
       RowBox[{"-", "1"}], ",", "1"}], "}"}]}]}], "]"}], ":=", 
  RowBox[{"Flatten", "[", 
   RowBox[{
    RowBox[{"Table", "[", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
        RowBox[{"{", 
         RowBox[{"s1", ",", 
          RowBox[{"Mod", "[", 
           RowBox[{
            RowBox[{"pos0", "+", "pos1"}], ",", 
            RowBox[{"Length", "[", "tape0", "]"}], ",", "1"}], 
           "]"}]}], "}"}], ",", "tape1"}], "}"}], ",", 
      RowBox[{"{", 
       RowBox[{"s1", ",", "s"}], "}"}], ",", 
      RowBox[{"{", 
       RowBox[{"pos1", ",", "offs"}], "}"}], ",", 
      RowBox[{"{", 
       RowBox[{"tape1", ",", 
        RowBox[{"Table", "[", 
         RowBox[{
          RowBox[{"ReplacePart", "[", 
           RowBox[{"tape0", ",", 
            RowBox[{"pos0", "\[Rule]", "i"}]}], "]"}], ",", 
          RowBox[{"{", 
           RowBox[{"i", ",", "0", ",", 
            RowBox[{"k", "-", "1"}]}], "}"}]}], "]"}]}], "}"}]}], 
     "]"}], ",", "2"}], "]"}]}]], "Input"],

Cell[BoxData[
 RowBox[{
  RowBox[{"TMNCGraph", "[", 
   RowBox[{"tlen_Integer", ",", 
    RowBox[{"{", 
     RowBox[{"s_Integer", ",", "k_Integer"}], "}"}], ",", 
    RowBox[{"offs_", ":", 
     RowBox[{"{", 
      RowBox[{
       RowBox[{"-", "1"}], ",", "1"}], "}"}]}]}], "]"}], ":=", 
  RowBox[{"Graph", "[", 
   RowBox[{
    RowBox[{"Flatten", "[", 
     RowBox[{"Table", "[", 
      RowBox[{
       RowBox[{
        RowBox[{
         RowBox[{
          RowBox[{"{", 
           RowBox[{
            RowBox[{"{", 
             RowBox[{"s0", ",", "pos0"}], "}"}], ",", "tape0"}], 
           "}"}], "\[DirectedEdge]", "#"}], "&"}], "/@", 
        RowBox[{"NeighboringConfigurations", "[", 
         RowBox[{
          RowBox[{"{", 
           RowBox[{
            RowBox[{"{", 
             RowBox[{"s0", ",", "pos0"}], "}"}], ",", "tape0"}], 
           "}"}], ",", 
          RowBox[{"{", 
           RowBox[{"s", ",", "k"}], "}"}], ",", "offs"}], "]"}]}], 
       ",", 
       RowBox[{"{", 
        RowBox[{"s0", ",", "s"}], "}"}], ",", 
       RowBox[{"{", 
        RowBox[{"pos0", ",", "tlen"}], "}"}], ",", 
       RowBox[{"{", 
        RowBox[{"tape0", ",", 
         RowBox[{"Tuples", "[", 
          RowBox[{
           RowBox[{"Range", "[", 
            RowBox[{"0", ",", 
             RowBox[{"k", "-", "1"}]}], "]"}], ",", "tlen"}], "]"}]}],
         "}"}]}], "]"}], "]"}], ",", 
    RowBox[{"EdgeStyle", "\[Rule]", 
     RowBox[{"Directive", "[", 
      RowBox[{
       RowBox[{"Arrowheads", "[", "0.04", "]"}], ",", 
       RowBox[{"Hue", "[", 
        RowBox[{"0.75", ",", "0", ",", "0.35"}], "]"}], ",", 
       RowBox[{"Dashing", "[", "None", "]"}], ",", 
       RowBox[{"AbsoluteThickness", "[", "1", "]"}]}], "]"}]}], ",", 
    RowBox[{"VertexStyle", "\[Rule]", 
     RowBox[{
      RowBox[{
       RowBox[{
       "ResourceFunction", "[", 
        "\"\<WolframPhysicsProjectStyleData\>\"", "]"}], "[", 
       "\"\<StatesGraph\>\"", "]"}], "[", "\"\<VertexStyle\>\"", 
      "]"}]}]}], "]"}]}]], "Input"],

Cell[BoxData[
 RowBox[{"Table", "[", 
  RowBox[{
   RowBox[{"Labeled", "[", 
    RowBox[{
     RowBox[{"Show", "[", 
      RowBox[{
       RowBox[{"TMNCGraph", "[", 
        RowBox[{"10", ",", 
         RowBox[{"{", 
          RowBox[{"s", ",", "1"}], "}"}]}], "]"}], ",", 
       RowBox[{"ImageSize", "\[Rule]", "200"}]}], "]"}], ",", 
     RowBox[{"Text", "[", "s", "]"}]}], "]"}], ",", 
   RowBox[{"{", 
    RowBox[{"s", ",", "3"}], "}"}]}], "]"}]], "Input"]
}, Open  ]]
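
The total number of nodes in each of these graphs is just 10 s (the n s k^n count with k = 1 and n = 10), which can be confirmed with the TMNCGraph helper defined in the cell above:

(* vertex counts for cyclic tape length 10 with k = 1 and s = 1, 2, 3 *)
Table[VertexCount[TMNCGraph[10, {s, 1}]], {s, 3}]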

If instead of having a cyclic tape, we just have a finite tape, and insist that the head never goes off either end, then for s = k = 1 we get:

Cell[CellGroupData[{
Cell[BoxData[
 RowBox[{
  RowBox[{"NeighboringConfigurationsC", "[", 
   RowBox[{
    RowBox[{"{", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{"s0_Integer", ",", "pos0_Integer"}], "}"}], ",", 
      "tape0_List"}], "}"}], ",", 
    RowBox[{"{", 
     RowBox[{"s_Integer", ",", "k_Integer"}], "}"}], ",", 
    RowBox[{"offs_", ":", 
     RowBox[{"{", 
      RowBox[{
       RowBox[{"-", "1"}], ",", "1"}], "}"}]}]}], "]"}], ":=", 
  RowBox[{"Flatten", "[", 
   RowBox[{
    RowBox[{"Table", "[", 
     RowBox[{
      RowBox[{"If", "[", 
       RowBox[{
        RowBox[{"1", "\[LessEqual]", 
         RowBox[{"pos0", "+", "pos1"}], "\[LessEqual]", 
         RowBox[{"Length", "[", "tape0", "]"}]}], ",", 
        RowBox[{"{", 
         RowBox[{
          RowBox[{"{", 
           RowBox[{"s1", ",", 
            RowBox[{"pos0", "+", "pos1"}]}], "}"}], ",", "tape1"}], 
         "}"}], ",", "Nothing"}], "]"}], ",", 
      RowBox[{"{", 
       RowBox[{"s1", ",", "s"}], "}"}], ",", 
      RowBox[{"{", 
       RowBox[{"pos1", ",", "offs"}], "}"}], ",", 
      RowBox[{"{", 
       RowBox[{"tape1", ",", 
        RowBox[{"Table", "[", 
         RowBox[{
          RowBox[{"ReplacePart", "[", 
           RowBox[{"tape0", ",", 
            RowBox[{"pos0", "\[Rule]", "i"}]}], "]"}], ",", 
          RowBox[{"{", 
           RowBox[{"i", ",", "0", ",", 
            RowBox[{"k", "-", "1"}]}], "}"}]}], "]"}]}], "}"}]}], 
     "]"}], ",", "2"}], "]"}]}]], "Input"],

Cell[BoxData[
 RowBox[{
  RowBox[{"TMNCGraphCX", "[", 
   RowBox[{"tlen_Integer", ",", 
    RowBox[{"{", 
     RowBox[{"s_Integer", ",", "k_Integer"}], "}"}], ",", 
    RowBox[{"offs_", ":", 
     RowBox[{"{", 
      RowBox[{
       RowBox[{"-", "1"}], ",", "1"}], "}"}]}]}], "]"}], ":=", 
  RowBox[{"Graph", "[", 
   RowBox[{
    RowBox[{"Flatten", "[", 
     RowBox[{"Table", "[", 
      RowBox[{
       RowBox[{
        RowBox[{
         RowBox[{
          RowBox[{"{", 
           RowBox[{
            RowBox[{"{", 
             RowBox[{"s0", ",", "pos0"}], "}"}], ",", "tape0"}], 
           "}"}], "\[DirectedEdge]", "#"}], "&"}], "/@", 
        RowBox[{"NeighboringConfigurationsC", "[", 
         RowBox[{
          RowBox[{"{", 
           RowBox[{
            RowBox[{"{", 
             RowBox[{"s0", ",", "pos0"}], "}"}], ",", "tape0"}], 
           "}"}], ",", 
          RowBox[{"{", 
           RowBox[{"s", ",", "k"}], "}"}], ",", "offs"}], "]"}]}], 
       ",", 
       RowBox[{"{", 
        RowBox[{"s0", ",", "s"}], "}"}], ",", 
       RowBox[{"{", 
        RowBox[{"pos0", ",", "tlen"}], "}"}], ",", 
       RowBox[{"{", 
        RowBox[{"tape0", ",", 
         RowBox[{"Tuples", "[", 
          RowBox[{
           RowBox[{"Range", "[", 
            RowBox[{"0", ",", 
             RowBox[{"k", "-", "1"}]}], "]"}], ",", "tlen"}], "]"}]}],
         "}"}]}], "]"}], "]"}], ",", 
    RowBox[{"VertexShapeFunction", "\[Rule]", 
     RowBox[{"(", 
      RowBox[{
       RowBox[{"Inset", "[", 
        RowBox[{
         RowBox[{"Framed", "[", 
          RowBox[{
           RowBox[{"RulePlot", "[", 
            RowBox[{
             RowBox[{"TuringMachine", "[", 
              RowBox[{"{", 
               RowBox[{"0", ",", 
                RowBox[{"Max", "[", 
                 RowBox[{"s", ",", "2"}], "]"}], ",", 
                RowBox[{"Max", "[", 
                 RowBox[{"k", ",", "2"}], "]"}]}], "}"}], "]"}], ",", 
             
             RowBox[{"{", 
              RowBox[{
               RowBox[{"{", 
                RowBox[{"1", ",", 
                 RowBox[{"#2", "[", 
                  RowBox[{"[", 
                   RowBox[{"1", ",", "2"}], "]"}], "]"}]}], "}"}], 
               ",", 
               RowBox[{"#2", "[", 
                RowBox[{"[", "2", "]"}], "]"}]}], "}"}], ",", "0", 
             ",", 
             RowBox[{"Mesh", "\[Rule]", "All"}], ",", 
             RowBox[{"Frame", "\[Rule]", "False"}], ",", 
             RowBox[{"ImageSize", "\[Rule]", "40"}]}], "]"}], ",", 
           RowBox[{"Background", "\[Rule]", 
            RowBox[{"Directive", "[", 
             RowBox[{
              RowBox[{"Opacity", "[", "0.2", "]"}], ",", 
              RowBox[{"Hue", "[", 
               RowBox[{"0.62", ",", "0.45", ",", "0.87"}], "]"}]}], 
             "]"}]}], ",", 
           RowBox[{"FrameMargins", "\[Rule]", 
            RowBox[{"{", 
             RowBox[{
              RowBox[{"{", 
               RowBox[{"2", ",", "2"}], "}"}], ",", 
              RowBox[{"{", 
               RowBox[{"0", ",", "0"}], "}"}]}], "}"}]}], ",", 
           RowBox[{"RoundingRadius", "\[Rule]", "0"}], ",", 
           RowBox[{"FrameStyle", "\[Rule]", 
            RowBox[{"Directive", "[", 
             RowBox[{
              RowBox[{"Opacity", "[", "0.5", "]"}], ",", 
              RowBox[{"Hue", "[", 
               RowBox[{"0.62", ",", "0.52", ",", "0.82"}], "]"}]}], 
             "]"}]}]}], "]"}], ",", "#1"}], "]"}], "&"}], ")"}]}], 
    ",", 
    RowBox[{"EdgeStyle", "\[Rule]", 
     RowBox[{
      RowBox[{
       RowBox[{
       "ResourceFunction", "[", 
        "\"\<WolframPhysicsProjectStyleData\>\"", "]"}], "[", 
       "\"\<StatesGraph\>\"", "]"}], "[", "\"\<EdgeStyle\>\"", 
      "]"}]}]}], "]"}]}]], "Input"],

Cell[BoxData[
 RowBox[{"Table", "[", 
  RowBox[{
   RowBox[{"Show", "[", 
    RowBox[{
     RowBox[{"TMNCGraphCX", "[", 
      RowBox[{"len", ",", 
       RowBox[{"{", 
        RowBox[{"1", ",", "1"}], "}"}]}], "]"}], ",", 
     RowBox[{"ImageSize", "\[Rule]", "200"}]}], "]"}], ",", 
   RowBox[{"{", 
    RowBox[{"len", ",", "3", ",", "5"}], "}"}]}], "]"}]], "Input"]
}, Open  ]]

For s = 2, k = 1 we get

Cell[CellGroupData[{
Cell[BoxData[
 RowBox[{
  RowBox[{"NeighboringConfigurationsC", "[", 
   RowBox[{
    RowBox[{"{", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{"s0_Integer", ",", "pos0_Integer"}], "}"}], ",", 
      "tape0_List"}], "}"}], ",", 
    RowBox[{"{", 
     RowBox[{"s_Integer", ",", "k_Integer"}], "}"}], ",", 
    RowBox[{"offs_", ":", 
     RowBox[{"{", 
      RowBox[{
       RowBox[{"-", "1"}], ",", "1"}], "}"}]}]}], "]"}], ":=", 
  RowBox[{"Flatten", "[", 
   RowBox[{
    RowBox[{"Table", "[", 
     RowBox[{
      RowBox[{"If", "[", 
       RowBox[{
        RowBox[{"1", "\[LessEqual]", 
         RowBox[{"pos0", "+", "pos1"}], "\[LessEqual]", 
         RowBox[{"Length", "[", "tape0", "]"}]}], ",", 
        RowBox[{"{", 
         RowBox[{
          RowBox[{"{", 
           RowBox[{"s1", ",", 
            RowBox[{"pos0", "+", "pos1"}]}], "}"}], ",", "tape1"}], 
         "}"}], ",", "Nothing"}], "]"}], ",", 
      RowBox[{"{", 
       RowBox[{"s1", ",", "s"}], "}"}], ",", 
      RowBox[{"{", 
       RowBox[{"pos1", ",", "offs"}], "}"}], ",", 
      RowBox[{"{", 
       RowBox[{"tape1", ",", 
        RowBox[{"Table", "[", 
         RowBox[{
          RowBox[{"ReplacePart", "[", 
           RowBox[{"tape0", ",", 
            RowBox[{"pos0", "\[Rule]", "i"}]}], "]"}], ",", 
          RowBox[{"{", 
           RowBox[{"i", ",", "0", ",", 
            RowBox[{"k", "-", "1"}]}], "}"}]}], "]"}]}], "}"}]}], 
     "]"}], ",", "2"}], "]"}]}]], "Input"],

Cell[BoxData[
 RowBox[{
  RowBox[{"TMNCGraphCX", "[", 
   RowBox[{"tlen_Integer", ",", 
    RowBox[{"{", 
     RowBox[{"s_Integer", ",", "k_Integer"}], "}"}], ",", 
    RowBox[{"offs_", ":", 
     RowBox[{"{", 
      RowBox[{
       RowBox[{"-", "1"}], ",", "1"}], "}"}]}]}], "]"}], ":=", 
  RowBox[{"Graph", "[", 
   RowBox[{
    RowBox[{"Flatten", "[", 
     RowBox[{"Table", "[", 
      RowBox[{
       RowBox[{
        RowBox[{
         RowBox[{
          RowBox[{"{", 
           RowBox[{
            RowBox[{"{", 
             RowBox[{"s0", ",", "pos0"}], "}"}], ",", "tape0"}], 
           "}"}], "\[DirectedEdge]", "#"}], "&"}], "/@", 
        RowBox[{"NeighboringConfigurationsC", "[", 
         RowBox[{
          RowBox[{"{", 
           RowBox[{
            RowBox[{"{", 
             RowBox[{"s0", ",", "pos0"}], "}"}], ",", "tape0"}], 
           "}"}], ",", 
          RowBox[{"{", 
           RowBox[{"s", ",", "k"}], "}"}], ",", "offs"}], "]"}]}], 
       ",", 
       RowBox[{"{", 
        RowBox[{"s0", ",", "s"}], "}"}], ",", 
       RowBox[{"{", 
        RowBox[{"pos0", ",", "tlen"}], "}"}], ",", 
       RowBox[{"{", 
        RowBox[{"tape0", ",", 
         RowBox[{"Tuples", "[", 
          RowBox[{
           RowBox[{"Range", "[", 
            RowBox[{"0", ",", 
             RowBox[{"k", "-", "1"}]}], "]"}], ",", "tlen"}], "]"}]}],
         "}"}]}], "]"}], "]"}], ",", 
    RowBox[{"VertexShapeFunction", "\[Rule]", 
     RowBox[{"(", 
      RowBox[{
       RowBox[{"Inset", "[", 
        RowBox[{
         RowBox[{"Framed", "[", 
          RowBox[{
           RowBox[{"RulePlot", "[", 
            RowBox[{
             RowBox[{"TuringMachine", "[", 
              RowBox[{"{", 
               RowBox[{"0", ",", 
                RowBox[{"Max", "[", 
                 RowBox[{"s", ",", "2"}], "]"}], ",", 
                RowBox[{"Max", "[", 
                 RowBox[{"k", ",", "2"}], "]"}]}], "}"}], "]"}], ",", 
             
             RowBox[{"{", 
              RowBox[{
               RowBox[{"{", 
                RowBox[{"1", ",", 
                 RowBox[{"#2", "[", 
                  RowBox[{"[", 
                   RowBox[{"1", ",", "2"}], "]"}], "]"}]}], "}"}], 
               ",", 
               RowBox[{"#2", "[", 
                RowBox[{"[", "2", "]"}], "]"}]}], "}"}], ",", "0", 
             ",", 
             RowBox[{"Mesh", "\[Rule]", "All"}], ",", 
             RowBox[{"Frame", "\[Rule]", "False"}], ",", 
             RowBox[{"ImageSize", "\[Rule]", "40"}]}], "]"}], ",", 
           RowBox[{"Background", "\[Rule]", 
            RowBox[{"Directive", "[", 
             RowBox[{
              RowBox[{"Opacity", "[", "0.2", "]"}], ",", 
              RowBox[{"Hue", "[", 
               RowBox[{"0.62", ",", "0.45", ",", "0.87"}], "]"}]}], 
             "]"}]}], ",", 
           RowBox[{"FrameMargins", "\[Rule]", 
            RowBox[{"{", 
             RowBox[{
              RowBox[{"{", 
               RowBox[{"2", ",", "2"}], "}"}], ",", 
              RowBox[{"{", 
               RowBox[{"0", ",", "0"}], "}"}]}], "}"}]}], ",", 
           RowBox[{"RoundingRadius", "\[Rule]", "0"}], ",", 
           RowBox[{"FrameStyle", "\[Rule]", 
            RowBox[{"Directive", "[", 
             RowBox[{
              RowBox[{"Opacity", "[", "0.5", "]"}], ",", 
              RowBox[{"Hue", "[", 
               RowBox[{"0.62", ",", "0.52", ",", "0.82"}], "]"}]}], 
             "]"}]}]}], "]"}], ",", "#1"}], "]"}], "&"}], ")"}]}], 
    ",", 
    RowBox[{"EdgeStyle", "\[Rule]", 
     RowBox[{"Directive", "[", 
      RowBox[{
       RowBox[{"Arrowheads", "[", "0.06", "]"}], ",", 
       RowBox[{"Hue", "[", 
        RowBox[{"0.75", ",", "0", ",", "0.35"}], "]"}], ",", 
       RowBox[{"Dashing", "[", "None", "]"}], ",", 
       RowBox[{"AbsoluteThickness", "[", "1", "]"}]}], "]"}]}]}], 
   "]"}]}]], "Input"],

Cell[BoxData[
 RowBox[{"Table", "[", 
  RowBox[{
   RowBox[{"Show", "[", 
    RowBox[{
     RowBox[{"TMNCGraphCX", "[", 
      RowBox[{"len", ",", 
       RowBox[{"{", 
        RowBox[{"2", ",", "1"}], "}"}]}], "]"}], ",", 
     RowBox[{"ImageSize", "\[Rule]", "200"}]}], "]"}], ",", 
   RowBox[{"{", 
    RowBox[{"len", ",", "3", ",", "5"}], "}"}]}], "]"}]], "Input"]
}, Open  ]]

and the overall behavior for successive s is exactly as we saw above for unbounded tapes:

Cell[CellGroupData[{
Cell[BoxData[
 RowBox[{
  RowBox[{"NeighboringConfigurationsC", "[", 
   RowBox[{
    RowBox[{"{", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{"s0_Integer", ",", "pos0_Integer"}], "}"}], ",", 
      "tape0_List"}], "}"}], ",", 
    RowBox[{"{", 
     RowBox[{"s_Integer", ",", "k_Integer"}], "}"}], ",", 
    RowBox[{"offs_", ":", 
     RowBox[{"{", 
      RowBox[{
       RowBox[{"-", "1"}], ",", "1"}], "}"}]}]}], "]"}], ":=", 
  RowBox[{"Flatten", "[", 
   RowBox[{
    RowBox[{"Table", "[", 
     RowBox[{
      RowBox[{"If", "[", 
       RowBox[{
        RowBox[{"1", "\[LessEqual]", 
         RowBox[{"pos0", "+", "pos1"}], "\[LessEqual]", 
         RowBox[{"Length", "[", "tape0", "]"}]}], ",", 
        RowBox[{"{", 
         RowBox[{
          RowBox[{"{", 
           RowBox[{"s1", ",", 
            RowBox[{"pos0", "+", "pos1"}]}], "}"}], ",", "tape1"}], 
         "}"}], ",", "Nothing"}], "]"}], ",", 
      RowBox[{"{", 
       RowBox[{"s1", ",", "s"}], "}"}], ",", 
      RowBox[{"{", 
       RowBox[{"pos1", ",", "offs"}], "}"}], ",", 
      RowBox[{"{", 
       RowBox[{"tape1", ",", 
        RowBox[{"Table", "[", 
         RowBox[{
          RowBox[{"ReplacePart", "[", 
           RowBox[{"tape0", ",", 
            RowBox[{"pos0", "\[Rule]", "i"}]}], "]"}], ",", 
          RowBox[{"{", 
           RowBox[{"i", ",", "0", ",", 
            RowBox[{"k", "-", "1"}]}], "}"}]}], "]"}]}], "}"}]}], 
     "]"}], ",", "2"}], "]"}]}]], "Input"],

Cell[BoxData[
 RowBox[{
  RowBox[{"TMNCGraphC", "[", 
   RowBox[{"tlen_Integer", ",", 
    RowBox[{"{", 
     RowBox[{"s_Integer", ",", "k_Integer"}], "}"}], ",", 
    RowBox[{"offs_", ":", 
     RowBox[{"{", 
      RowBox[{
       RowBox[{"-", "1"}], ",", "1"}], "}"}]}]}], "]"}], ":=", 
  RowBox[{"Graph", "[", 
   RowBox[{
    RowBox[{"Flatten", "[", 
     RowBox[{"Table", "[", 
      RowBox[{
       RowBox[{
        RowBox[{
         RowBox[{
          RowBox[{"{", 
           RowBox[{
            RowBox[{"{", 
             RowBox[{"s0", ",", "pos0"}], "}"}], ",", "tape0"}], 
           "}"}], "\[DirectedEdge]", "#"}], "&"}], "/@", 
        RowBox[{"NeighboringConfigurationsC", "[", 
         RowBox[{
          RowBox[{"{", 
           RowBox[{
            RowBox[{"{", 
             RowBox[{"s0", ",", "pos0"}], "}"}], ",", "tape0"}], 
           "}"}], ",", 
          RowBox[{"{", 
           RowBox[{"s", ",", "k"}], "}"}], ",", "offs"}], "]"}]}], 
       ",", 
       RowBox[{"{", 
        RowBox[{"s0", ",", "s"}], "}"}], ",", 
       RowBox[{"{", 
        RowBox[{"pos0", ",", "tlen"}], "}"}], ",", 
       RowBox[{"{", 
        RowBox[{"tape0", ",", 
         RowBox[{"Tuples", "[", 
          RowBox[{
           RowBox[{"Range", "[", 
            RowBox[{"0", ",", 
             RowBox[{"k", "-", "1"}]}], "]"}], ",", "tlen"}], "]"}]}],
         "}"}]}], "]"}], "]"}], ",", 
    RowBox[{"EdgeStyle", "\[Rule]", 
     RowBox[{"Directive", "[", 
      RowBox[{
       RowBox[{"Arrowheads", "[", "0.03", "]"}], ",", 
       RowBox[{"Hue", "[", 
        RowBox[{"0.75", ",", "0", ",", "0.35"}], "]"}], ",", 
       RowBox[{"Dashing", "[", "None", "]"}], ",", 
       RowBox[{"AbsoluteThickness", "[", "1", "]"}]}], "]"}]}], ",", 
    RowBox[{"VertexStyle", "\[Rule]", 
     RowBox[{
      RowBox[{
       RowBox[{
       "ResourceFunction", "[", 
        "\"\<WolframPhysicsProjectStyleData\>\"", "]"}], "[", 
       "\"\<StatesGraph\>\"", "]"}], "[", "\"\<VertexStyle\>\"", 
      "]"}]}]}], "]"}]}]], "Input"],

Cell[BoxData[
 RowBox[{"Table", "[", 
  RowBox[{
   RowBox[{"Labeled", "[", 
    RowBox[{
     RowBox[{"Show", "[", 
      RowBox[{
       RowBox[{"TMNCGraphC", "[", 
        RowBox[{"10", ",", 
         RowBox[{"{", 
          RowBox[{"s", ",", "1"}], "}"}]}], "]"}], ",", 
       RowBox[{"ImageSize", "\[Rule]", "200"}]}], "]"}], ",", 
     RowBox[{"Style", "[", 
      RowBox[{
       RowBox[{"Text", "[", "s", "]"}], ",", "10"}], "]"}]}], "]"}], 
   ",", 
   RowBox[{"{", 
    RowBox[{"s", ",", "3"}], "}"}]}], "]"}]], "Input"]
}, Open  ]]

For s = 1, k = 2, this is what happens in the cyclic case for length 3:

Cell[CellGroupData[{
Cell[BoxData[
 RowBox[{
  RowBox[{"NeighboringConfigurations", "[", 
   RowBox[{
    RowBox[{"{", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{"s0_Integer", ",", "pos0_Integer"}], "}"}], ",", 
      "tape0_List"}], "}"}], ",", 
    RowBox[{"{", 
     RowBox[{"s_Integer", ",", "k_Integer"}], "}"}], ",", 
    RowBox[{"offs_", ":", 
     RowBox[{"{", 
      RowBox[{
       RowBox[{"-", "1"}], ",", "1"}], "}"}]}]}], "]"}], ":=", 
  RowBox[{"Flatten", "[", 
   RowBox[{
    RowBox[{"Table", "[", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
        RowBox[{"{", 
         RowBox[{"s1", ",", 
          RowBox[{"Mod", "[", 
           RowBox[{
            RowBox[{"pos0", "+", "pos1"}], ",", 
            RowBox[{"Length", "[", "tape0", "]"}], ",", "1"}], 
           "]"}]}], "}"}], ",", "tape1"}], "}"}], ",", 
      RowBox[{"{", 
       RowBox[{"s1", ",", "s"}], "}"}], ",", 
      RowBox[{"{", 
       RowBox[{"pos1", ",", "offs"}], "}"}], ",", 
      RowBox[{"{", 
       RowBox[{"tape1", ",", 
        RowBox[{"Table", "[", 
         RowBox[{
          RowBox[{"ReplacePart", "[", 
           RowBox[{"tape0", ",", 
            RowBox[{"pos0", "\[Rule]", "i"}]}], "]"}], ",", 
          RowBox[{"{", 
           RowBox[{"i", ",", "0", ",", 
            RowBox[{"k", "-", "1"}]}], "}"}]}], "]"}]}], "}"}]}], 
     "]"}], ",", "2"}], "]"}]}]], "Input"],

Cell[BoxData[
 RowBox[{
  RowBox[{"TMNCGraphX", "[", 
   RowBox[{"tlen_Integer", ",", 
    RowBox[{"{", 
     RowBox[{"s_Integer", ",", "k_Integer"}], "}"}], ",", 
    RowBox[{"offs_", ":", 
     RowBox[{"{", 
      RowBox[{
       RowBox[{"-", "1"}], ",", "1"}], "}"}]}]}], "]"}], ":=", 
  RowBox[{"Graph", "[", 
   RowBox[{
    RowBox[{"Flatten", "[", 
     RowBox[{"Table", "[", 
      RowBox[{
       RowBox[{
        RowBox[{
         RowBox[{
          RowBox[{"{", 
           RowBox[{
            RowBox[{"{", 
             RowBox[{"s0", ",", "pos0"}], "}"}], ",", "tape0"}], 
           "}"}], "\[DirectedEdge]", "#"}], "&"}], "/@", 
        RowBox[{"NeighboringConfigurations", "[", 
         RowBox[{
          RowBox[{"{", 
           RowBox[{
            RowBox[{"{", 
             RowBox[{"s0", ",", "pos0"}], "}"}], ",", "tape0"}], 
           "}"}], ",", 
          RowBox[{"{", 
           RowBox[{"s", ",", "k"}], "}"}], ",", "offs"}], "]"}]}], 
       ",", 
       RowBox[{"{", 
        RowBox[{"s0", ",", "s"}], "}"}], ",", 
       RowBox[{"{", 
        RowBox[{"pos0", ",", "tlen"}], "}"}], ",", 
       RowBox[{"{", 
        RowBox[{"tape0", ",", 
         RowBox[{"Tuples", "[", 
          RowBox[{
           RowBox[{"Range", "[", 
            RowBox[{"0", ",", 
             RowBox[{"k", "-", "1"}]}], "]"}], ",", "tlen"}], "]"}]}],
         "}"}]}], "]"}], "]"}], ",", 
    RowBox[{"VertexShapeFunction", "\[Rule]", 
     RowBox[{"(", 
      RowBox[{
       RowBox[{"Inset", "[", 
        RowBox[{
         RowBox[{"Framed", "[", 
          RowBox[{
           RowBox[{"RulePlot", "[", 
            RowBox[{
             RowBox[{"TuringMachine", "[", 
              RowBox[{"{", 
               RowBox[{"0", ",", 
                RowBox[{"Max", "[", 
                 RowBox[{"s", ",", "2"}], "]"}], ",", 
                RowBox[{"Max", "[", 
                 RowBox[{"k", ",", "2"}], "]"}]}], "}"}], "]"}], ",", 
             
             RowBox[{"{", 
              RowBox[{
               RowBox[{"{", 
                RowBox[{"1", ",", 
                 RowBox[{"#2", "[", 
                  RowBox[{"[", 
                   RowBox[{"1", ",", "2"}], "]"}], "]"}]}], "}"}], 
               ",", 
               RowBox[{"#2", "[", 
                RowBox[{"[", "2", "]"}], "]"}]}], "}"}], ",", "0", 
             ",", 
             RowBox[{"Mesh", "\[Rule]", "All"}], ",", 
             RowBox[{"Frame", "\[Rule]", "False"}], ",", 
             RowBox[{"ImageSize", "\[Rule]", "40"}]}], "]"}], ",", 
           RowBox[{"Background", "\[Rule]", 
            RowBox[{"Directive", "[", 
             RowBox[{
              RowBox[{"Opacity", "[", "0.2", "]"}], ",", 
              RowBox[{"Hue", "[", 
               RowBox[{"0.62", ",", "0.45", ",", "0.87"}], "]"}]}], 
             "]"}]}], ",", 
           RowBox[{"FrameMargins", "->", 
            RowBox[{"{", 
             RowBox[{
              RowBox[{"{", 
               RowBox[{"2", ",", "2"}], "}"}], ",", 
              RowBox[{"{", 
               RowBox[{"0", ",", "0"}], "}"}]}], "}"}]}], ",", 
           RowBox[{"RoundingRadius", "\[Rule]", "0"}], ",", 
           RowBox[{"FrameStyle", "\[Rule]", 
            RowBox[{"Directive", "[", 
             RowBox[{
              RowBox[{"Opacity", "[", "0.5", "]"}], ",", 
              RowBox[{"Hue", "[", 
               RowBox[{"0.62", ",", "0.52", ",", "0.82"}], "]"}]}], 
             "]"}]}]}], "]"}], ",", "#1"}], "]"}], "&"}], ")"}]}], 
    ",", 
    RowBox[{"EdgeStyle", "\[Rule]", 
     RowBox[{
      RowBox[{
       RowBox[{
       "ResourceFunction", "[", 
        "\"\<WolframPhysicsProjectStyleData\>\"", "]"}], "[", 
       "\"\<StatesGraph\>\"", "]"}], "[", "\"\<EdgeStyle\>\"", 
      "]"}]}]}], "]"}]}]], "Input"],

Cell[BoxData[
 RowBox[{"TMNCGraphX", "[", 
  RowBox[{"3", ",", 
   RowBox[{"{", 
    RowBox[{"1", ",", "2"}], "}"}]}], "]"}]], "Input"]

}, Open  ]]

The results for lengths 1 through 6 are:

Cell[CellGroupData[{
Cell[BoxData[
 RowBox[{
  RowBox[{"NeighboringConfigurations", "[", 
   RowBox[{
    RowBox[{"{", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{"s0_Integer", ",", "pos0_Integer"}], "}"}], ",", 
      "tape0_List"}], "}"}], ",", 
    RowBox[{"{", 
     RowBox[{"s_Integer", ",", "k_Integer"}], "}"}], ",", 
    RowBox[{"offs_", ":", 
     RowBox[{"{", 
      RowBox[{
       RowBox[{"-", "1"}], ",", "1"}], "}"}]}]}], "]"}], ":=", 
  RowBox[{"Flatten", "[", 
   RowBox[{
    RowBox[{"Table", "[", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
        RowBox[{"{", 
         RowBox[{"s1", ",", 
          RowBox[{"Mod", "[", 
           RowBox[{
            RowBox[{"pos0", "+", "pos1"}], ",", 
            RowBox[{"Length", "[", "tape0", "]"}], ",", "1"}], 
           "]"}]}], "}"}], ",", "tape1"}], "}"}], ",", 
      RowBox[{"{", 
       RowBox[{"s1", ",", "s"}], "}"}], ",", 
      RowBox[{"{", 
       RowBox[{"pos1", ",", "offs"}], "}"}], ",", 
      RowBox[{"{", 
       RowBox[{"tape1", ",", 
        RowBox[{"Table", "[", 
         RowBox[{
          RowBox[{"ReplacePart", "[", 
           RowBox[{"tape0", ",", 
            RowBox[{"pos0", "\[Rule]", "i"}]}], "]"}], ",", 
          RowBox[{"{", 
           RowBox[{"i", ",", "0", ",", 
            RowBox[{"k", "-", "1"}]}], "}"}]}], "]"}]}], "}"}]}], 
     "]"}], ",", "2"}], "]"}]}]], "Input"],

Cell[BoxData[
 RowBox[{
  RowBox[{"TMNCGraph", "[", 
   RowBox[{"tlen_Integer", ",", 
    RowBox[{"{", 
     RowBox[{"s_Integer", ",", "k_Integer"}], "}"}], ",", 
    RowBox[{"offs_", ":", 
     RowBox[{"{", 
      RowBox[{
       RowBox[{"-", "1"}], ",", "1"}], "}"}]}]}], "]"}], ":=", 
  RowBox[{"Graph", "[", 
   RowBox[{
    RowBox[{"Flatten", "[", 
     RowBox[{"Table", "[", 
      RowBox[{
       RowBox[{
        RowBox[{
         RowBox[{
          RowBox[{"{", 
           RowBox[{
            RowBox[{"{", 
             RowBox[{"s0", ",", "pos0"}], "}"}], ",", "tape0"}], 
           "}"}], "\[DirectedEdge]", "#"}], "&"}], "/@", 
        RowBox[{"NeighboringConfigurations", "[", 
         RowBox[{
          RowBox[{"{", 
           RowBox[{
            RowBox[{"{", 
             RowBox[{"s0", ",", "pos0"}], "}"}], ",", "tape0"}], 
           "}"}], ",", 
          RowBox[{"{", 
           RowBox[{"s", ",", "k"}], "}"}], ",", "offs"}], "]"}]}], 
       ",", 
       RowBox[{"{", 
        RowBox[{"s0", ",", "s"}], "}"}], ",", 
       RowBox[{"{", 
        RowBox[{"pos0", ",", "tlen"}], "}"}], ",", 
       RowBox[{"{", 
        RowBox[{"tape0", ",", 
         RowBox[{"Tuples", "[", 
          RowBox[{
           RowBox[{"Range", "[", 
            RowBox[{"0", ",", 
             RowBox[{"k", "-", "1"}]}], "]"}], ",", "tlen"}], "]"}]}],
         "}"}]}], "]"}], "]"}], ",", 
    RowBox[{"EdgeStyle", "\[Rule]", 
     RowBox[{
      RowBox[{
       RowBox[{
       "ResourceFunction", "[", 
        "\"\<WolframPhysicsProjectStyleData\>\"", "]"}], "[", 
       "\"\<StatesGraph\>\"", "]"}], "[", "\"\<EdgeStyle\>\"", 
      "]"}]}], ",", 
    RowBox[{"VertexStyle", "\[Rule]", 
     RowBox[{
      RowBox[{
       RowBox[{
       "ResourceFunction", "[", 
        "\"\<WolframPhysicsProjectStyleData\>\"", "]"}], "[", 
       "\"\<StatesGraph\>\"", "]"}], "[", "\"\<VertexStyle\>\"", 
      "]"}]}]}], "]"}]}]], "Input"],

Cell[BoxData[
 RowBox[{"Table", "[", 
  RowBox[{
   RowBox[{"Graph", "[", 
    RowBox[{
     RowBox[{"TMNCGraph", "[", 
      RowBox[{"n", ",", 
       RowBox[{"{", 
        RowBox[{"1", ",", "2"}], "}"}]}], "]"}], ",", 
     RowBox[{"ImageSize", "\[Rule]", "200"}]}], "]"}], ",", 
   RowBox[{"{", 
    RowBox[{"n", ",", "6"}], "}"}]}], "]"}]], "Input"]
}, Open  ]]
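
Each of these graphs has n 2^n nodes (n head positions times 2^n possible tape contents), which can be checked with the TMNCGraph helper from the cell above:

(* vertex counts for cyclic tapes of length 1 through 6 with s = 1, k = 2 *)
Table[VertexCount[TMNCGraph[n, {1, 2}]], {n, 6}]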

Rendering these in 3D makes the connection with progressively higher-dimensional hypercubes slightly clearer:

Cell[CellGroupData[{
Cell[BoxData[
 RowBox[{
  RowBox[{"NeighboringConfigurations", "[", 
   RowBox[{
    RowBox[{"{", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{"s0_Integer", ",", "pos0_Integer"}], "}"}], ",", 
      "tape0_List"}], "}"}], ",", 
    RowBox[{"{", 
     RowBox[{"s_Integer", ",", "k_Integer"}], "}"}], ",", 
    RowBox[{"offs_", ":", 
     RowBox[{"{", 
      RowBox[{
       RowBox[{"-", "1"}], ",", "1"}], "}"}]}]}], "]"}], ":=", 
  RowBox[{"Flatten", "[", 
   RowBox[{
    RowBox[{"Table", "[", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
        RowBox[{"{", 
         RowBox[{"s1", ",", 
          RowBox[{"Mod", "[", 
           RowBox[{
            RowBox[{"pos0", "+", "pos1"}], ",", 
            RowBox[{"Length", "[", "tape0", "]"}], ",", "1"}], 
           "]"}]}], "}"}], ",", "tape1"}], "}"}], ",", 
      RowBox[{"{", 
       RowBox[{"s1", ",", "s"}], "}"}], ",", 
      RowBox[{"{", 
       RowBox[{"pos1", ",", "offs"}], "}"}], ",", 
      RowBox[{"{", 
       RowBox[{"tape1", ",", 
        RowBox[{"Table", "[", 
         RowBox[{
          RowBox[{"ReplacePart", "[", 
           RowBox[{"tape0", ",", 
            RowBox[{"pos0", "\[Rule]", "i"}]}], "]"}], ",", 
          RowBox[{"{", 
           RowBox[{"i", ",", "0", ",", 
            RowBox[{"k", "-", "1"}]}], "}"}]}], "]"}]}], "}"}]}], 
     "]"}], ",", "2"}], "]"}]}]], "Input"],

Cell[BoxData[
 RowBox[{
  RowBox[{"TMNCGraph", "[", 
   RowBox[{"tlen_Integer", ",", 
    RowBox[{"{", 
     RowBox[{"s_Integer", ",", "k_Integer"}], "}"}], ",", 
    RowBox[{"offs_", ":", 
     RowBox[{"{", 
      RowBox[{
       RowBox[{"-", "1"}], ",", "1"}], "}"}]}]}], "]"}], ":=", 
  RowBox[{"Graph", "[", 
   RowBox[{
    RowBox[{"Flatten", "[", 
     RowBox[{"Table", "[", 
      RowBox[{
       RowBox[{
        RowBox[{
         RowBox[{
          RowBox[{"{", 
           RowBox[{
            RowBox[{"{", 
             RowBox[{"s0", ",", "pos0"}], "}"}], ",", "tape0"}], 
           "}"}], "\[DirectedEdge]", "#"}], "&"}], "/@", 
        RowBox[{"NeighboringConfigurations", "[", 
         RowBox[{
          RowBox[{"{", 
           RowBox[{
            RowBox[{"{", 
             RowBox[{"s0", ",", "pos0"}], "}"}], ",", "tape0"}], 
           "}"}], ",", 
          RowBox[{"{", 
           RowBox[{"s", ",", "k"}], "}"}], ",", "offs"}], "]"}]}], 
       ",", 
       RowBox[{"{", 
        RowBox[{"s0", ",", "s"}], "}"}], ",", 
       RowBox[{"{", 
        RowBox[{"pos0", ",", "tlen"}], "}"}], ",", 
       RowBox[{"{", 
        RowBox[{"tape0", ",", 
         RowBox[{"Tuples", "[", 
          RowBox[{
           RowBox[{"Range", "[", 
            RowBox[{"0", ",", 
             RowBox[{"k", "-", "1"}]}], "]"}], ",", "tlen"}], "]"}]}],
         "}"}]}], "]"}], "]"}], ",", 
    RowBox[{"EdgeStyle", "\[Rule]", 
     RowBox[{
      RowBox[{
       RowBox[{
       "ResourceFunction", "[", 
        "\"\<WolframPhysicsProjectStyleData\>\"", "]"}], "[", 
       "\"\^lt;StatesGraph\>\"", "]"}], "[", "\"\<EdgeStyle\>\"", 
      "]"}]}], ",", 
    RowBox[{"VertexStyle", "\[Rule]", 
     RowBox[{
      RowBox[{
       RowBox[{
       "ResourceFunction", "[", 
        "\"\<WolframPhysicsProjectStyleData\>\"", "]"}], "[", 
       "\"\<StatesGraph\>\"", "]"}], "[", "\"\<VertexStyle\>\"", 
      "]"}]}]}], "]"}]}]], "Input"],

Cell[BoxData[
 RowBox[{"Table", "[", 
  RowBox[{
   RowBox[{"Graph3D", "[", 
    RowBox[{
     RowBox[{"TMNCGraph", "[", 
      RowBox[{"n", ",", 
       RowBox[{"{", 
        RowBox[{"1", ",", "2"}], "}"}]}], "]"}], ",", 
     RowBox[{"ImageSize", "\[Rule]", "200"}], ",", 
     RowBox[{"EdgeStyle", "\[Rule]", 
      RowBox[{"Directive", "[", 
       RowBox[{
        RowBox[{"Hue", "[", 
         RowBox[{"0.62", ",", "0.05", ",", "0.55"}], "]"}], ",", 
        RowBox[{"Opacity", "[", ".6", "]"}]}], "]"}]}]}], "]"}], ",", 
   
   RowBox[{"{", 
    RowBox[{"n", ",", "6"}], "}"}]}], "]"}]], "Input"]
}, Open  ]]

The non-cyclic case for lengths 2 through 6 (length 1 is trivial):

NeighboringConfigurationsC
NeighboringConfigurationsC[{{s0_Integer, pos0_Integer}, tape0_List}, {s_Integer, 
   k_Integer}, offs_ : {-1, 1}] := 
 Flatten[Table[
   If[1 <= pos0 + pos1 <= Length[tape0], {{s1, pos0 + pos1}, tape1}, 
    Nothing], {s1, s}, {pos1, offs}, {tape1, 
    Table[ReplacePart[tape0, pos0 -> i], {i, 0, k - 1}]}], 2]

TMNCGraphC[tlen_Integer, {s_Integer, k_Integer}, offs_ : {-1, 1}] := 
 Graph[Flatten[
   Table[({{s0, pos0}, tape0} \[DirectedEdge] # &) /@ 
     NeighboringConfigurationsC[{{s0, pos0}, tape0}, {s, k}, offs], {s0, s}, {pos0, 
    tlen}, {tape0, Tuples[Range[0, k - 1], tlen]}]], 
  EdgeStyle -> 
   ResourceFunction["WolframPhysicsProjectStyleData"]["StatesGraph"]["EdgeStyle"], 
  VertexStyle -> 
   ResourceFunction["WolframPhysicsProjectStyleData"]["StatesGraph"]["VertexStyle"]]

Table[Graph[TMNCGraphC[n, {1, 2}], ImageSize -> 200], {n, 2, 6}]

Rendered in 3D these become:

NeighboringConfigurationsC
NeighboringConfigurationsC[{{s0_Integer, pos0_Integer}, tape0_List}, {s_Integer, 
   k_Integer}, offs_ : {-1, 1}] := 
 Flatten[Table[
   If[1 <= pos0 + pos1 <= Length[tape0], {{s1, pos0 + pos1}, tape1}, 
    Nothing], {s1, s}, {pos1, offs}, {tape1, 
    Table[ReplacePart[tape0, pos0 -> i], {i, 0, k - 1}]}], 2]

TMNCGraphC[tlen_Integer, {s_Integer, k_Integer}, offs_ : {-1, 1}] := 
 Graph[Flatten[
   Table[({{s0, pos0}, tape0} \[DirectedEdge] # &) /@ 
     NeighboringConfigurationsC[{{s0, pos0}, tape0}, {s, k}, offs], {s0, s}, {pos0, 
    tlen}, {tape0, Tuples[Range[0, k - 1], tlen]}]], 
  EdgeStyle -> 
   ResourceFunction["WolframPhysicsProjectStyleData"]["StatesGraph"]["EdgeStyle"], 
  VertexStyle -> 
   ResourceFunction["WolframPhysicsProjectStyleData"]["StatesGraph"]["VertexStyle"]]

Table[Graph3D[TMNCGraphC[n, {1, 2}], ImageSize -> 200, 
  VertexStyle -> 
   ResourceFunction["WolframPhysicsProjectStyleData"]["StatesGraph3D", "VertexStyle"], 
  EdgeStyle -> 
   ResourceFunction["WolframPhysicsProjectStyleData"]["StatesGraph3D", "EdgeStyle"]], 
 {n, 2, 6}]

In the case s = 2, k = 2 for a length-3 cyclic tape we get:

NeighboringConfigurations
NeighboringConfigurations[{{s0_Integer, pos0_Integer}, tape0_List}, {s_Integer, 
   k_Integer}, offs_ : {-1, 1}] := 
 Flatten[Table[{{s1, Mod[pos0 + pos1, Length[tape0], 1]}, tape1}, {s1, s}, {pos1, 
    offs}, {tape1, Table[ReplacePart[tape0, pos0 -> i], {i, 0, k - 1}]}], 2]

TMNCGraphX[tlen_Integer, {s_Integer, k_Integer}, offs_ : {-1, 1}] := 
 Graph[Flatten[
   Table[({{s0, pos0}, tape0} \[DirectedEdge] # &) /@ 
     NeighboringConfigurations[{{s0, pos0}, tape0}, {s, k}, offs], {s0, s}, {pos0, 
    tlen}, {tape0, Tuples[Range[0, k - 1], tlen]}]], 
  VertexShapeFunction -> (Inset[
      Framed[RulePlot[TuringMachine[{0, Max[s, 2], Max[k, 2]}], {{1, #2[[1, 2]]}, 
         #2[[2]]}, 0, Mesh -> All, Frame -> False, ImageSize -> 40], 
       Background -> Directive[Opacity[0.2], Hue[0.62, 0.45, 0.87]], 
       FrameMargins -> {{2, 2}, {0, 0}}, RoundingRadius -> 0, 
       FrameStyle -> Directive[Opacity[0.5], Hue[0.62, 0.52, 0.82]]], #1] &), 
  EdgeStyle -> 
   Directive[Arrowheads[0.02], Hue[0.75, 0, 0.35], Dashing[None], 
    AbsoluteThickness[1]]]

TMNCGraphX[3, {2, 2}]

Rendering the results for lengths 1 through 6 in 3D gives:

NeighboringConfigurations
NeighboringConfigurations[{{s0_Integer, pos0_Integer}, tape0_List}, {s_Integer, 
   k_Integer}, offs_ : {-1, 1}] := 
 Flatten[Table[{{s1, Mod[pos0 + pos1, Length[tape0], 1]}, tape1}, {s1, s}, {pos1, 
    offs}, {tape1, Table[ReplacePart[tape0, pos0 -> i], {i, 0, k - 1}]}], 2]

TMNCGraph[tlen_Integer, {s_Integer, k_Integer}, offs_ : {-1, 1}] := 
 Graph[Flatten[
   Table[({{s0, pos0}, tape0} \[DirectedEdge] # &) /@ 
     NeighboringConfigurations[{{s0, pos0}, tape0}, {s, k}, offs], {s0, s}, {pos0, 
    tlen}, {tape0, Tuples[Range[0, k - 1], tlen]}]], 
  EdgeStyle -> 
   ResourceFunction["WolframPhysicsProjectStyleData"]["StatesGraph"]["EdgeStyle"], 
  VertexStyle -> 
   ResourceFunction["WolframPhysicsProjectStyleData"]["StatesGraph"]["VertexStyle"]]

Table[Graph3D[TMNCGraph[n, {2, 2}], ImageSize -> 260, 
  VertexStyle -> 
   ResourceFunction["WolframPhysicsProjectStyleData"]["StatesGraph3D", "VertexStyle"], 
  EdgeStyle -> 
   ResourceFunction["WolframPhysicsProjectStyleData"]["StatesGraph3D", "EdgeStyle"]], 
 {n, 6}]

In the non-cyclic case the results for lengths 2 through 6 in 3D are (the length-1 case is trivial):

NeighboringConfigurationsC
NeighboringConfigurationsC[{{s0_Integer, pos0_Integer}, tape0_List}, {s_Integer, 
   k_Integer}, offs_ : {-1, 1}] := 
 Flatten[Table[
   If[1 <= pos0 + pos1 <= Length[tape0], {{s1, pos0 + pos1}, tape1}, 
    Nothing], {s1, s}, {pos1, offs}, {tape1, 
    Table[ReplacePart[tape0, pos0 -> i], {i, 0, k - 1}]}], 2]

TMNCGraphC[tlen_Integer, {s_Integer, k_Integer}, offs_ : {-1, 1}] := 
 Graph[Flatten[
   Table[({{s0, pos0}, tape0} \[DirectedEdge] # &) /@ 
     NeighboringConfigurationsC[{{s0, pos0}, tape0}, {s, k}, offs], {s0, s}, {pos0, 
    tlen}, {tape0, Tuples[Range[0, k - 1], tlen]}]], 
  EdgeStyle -> 
   ResourceFunction["WolframPhysicsProjectStyleData"]["StatesGraph"]["EdgeStyle"], 
  VertexStyle -> 
   ResourceFunction["WolframPhysicsProjectStyleData"]["StatesGraph"]["VertexStyle"]]

Table[Graph3D[TMNCGraphC[n, {2, 2}], ImageSize -> 260, 
  VertexStyle -> 
   ResourceFunction["WolframPhysicsProjectStyleData"]["StatesGraph3D", "VertexStyle"], 
  EdgeStyle -> 
   ResourceFunction["WolframPhysicsProjectStyleData"]["StatesGraph3D", "EdgeStyle"]], 
 {n, 2, 6}]

The Turing Machine Group

It turns out that there’s a nice mathematical characterization of rulial multiway graphs for Turing machines: they’re just Cayley graphs of groups that we can call “Turing machine groups”. Why is this? Basically it’s because the possible configurations of a Turing machine have a direct correspondence with transformations that can act on these configurations. And in particular, one can pick out certain transformations that correspond to individual transitions in a non-deterministic Turing machine, and use these as generators in a presentation of the group.

Let’s start by considering the case of Turing machines with finite cyclic tapes. If the tape has length n, the total number of possible configurations of the machine is n s k^n. (Note that if the tape is finite but not cyclic then one doesn’t get a group.)
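
As a minimal check of this count, here's a small sketch using the TMNCGraph function defined above (for s = 1, k = 2):

(* {tape length n, vertices in the rulial multiway graph, n s k^n for s = 1, k = 2} *)
Table[{n, VertexCount[TMNCGraph[n, {1, 2}]], n 2^n}, {n, 1, 4}]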

Assume for now s = 1, k = 2. Then for n = 3, there are 24 possible configurations, and the rulial multiway graph is:

TMNCGraphX
NeighboringConfigurations[{{s0_Integer, pos0_Integer}, tape0_List}, {s_Integer, 
   k_Integer}, offs_ : {-1, 1}] := 
 Flatten[Table[{{s1, Mod[pos0 + pos1, Length[tape0], 1]}, tape1}, {s1, s}, {pos1, 
    offs}, {tape1, Table[ReplacePart[tape0, pos0 -> i], {i, 0, k - 1}]}], 2]

TMNCGraphX[tlen_Integer, {s_Integer, k_Integer}, offs_ : {-1, 1}] := 
 Graph[Flatten[
   Table[({{s0, pos0}, tape0} \[DirectedEdge] # &) /@ 
     NeighboringConfigurations[{{s0, pos0}, tape0}, {s, k}, offs], {s0, s}, {pos0, 
    tlen}, {tape0, Tuples[Range[0, k - 1], tlen]}]], 
  VertexShapeFunction -> (Inset[
      Framed[RulePlot[TuringMachine[{0, Max[s, 2], Max[k, 2]}], {{1, #2[[1, 2]]}, 
         #2[[2]]}, 0, Mesh -> All, Frame -> False, ImageSize -> 40], 
       Background -> Directive[Opacity[0.2], Hue[0.62, 0.45, 0.87]], 
       FrameMargins -> {{2, 2}, {0, 0}}, RoundingRadius -> 0, 
       FrameStyle -> Directive[Opacity[0.5], Hue[0.62, 0.52, 0.82]]], #1] &), 
  EdgeStyle -> 
   ResourceFunction["WolframPhysicsProjectStyleData"]["StatesGraph"]["EdgeStyle"]]

TMNCGraphX[3, {1, 2}]

But this graph turns out to be a Cayley graph for the finite group A4 × ℤ2:

{Graph
{Graph[#, ImageSize -> 300], Graph3D[#, ImageSize -> 300]} &@
 CayleyGraph[
  PermutationGroup[
   PermutationCycles /@ 
    FiniteGroupData[{"DirectProduct", {{"AlternatingGroup", 
         4}, {"CyclicGroup", 2}}}, "MultiplicationTable"][[{3, 5, 10, 
      22}]]]]
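
As a basic consistency check (a small sketch), the number of configurations, i.e. the number of vertices in the graph above, matches the order of A4 × ℤ2:

(* 24 configurations for n = 3, s = 1, k = 2, versus |A4| |Z2| = 12 * 2 = 24 *)
{VertexCount[TMNCGraphX[3, {1, 2}]], 
 GroupOrder[AlternatingGroup[4]] GroupOrder[CyclicGroup[2]]}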

To construct this Cayley graph let’s represent the configurations of the Turing machine by pairs of integers {i, u}, where 0 ≤ i ≤ n – 1 gives the position of the head, and the bits of u (with 0 ≤ u ≤ 2^n – 1) give the values on the tape. (Since s = 1, we don’t have to worry about the state of the head.) With this representation of the configurations, consider the “multiplication” operation:

f
f[{i_, u_}, {j_, v_}] := {Mod[i + j, n], 
  BitXor[BitShiftRight[u, j], v]} 
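
For example, here's a minimal illustrative evaluation (the tape length n has to be set explicitly, since f refers to it):

(* combine two configurations on a length-3 cyclic tape; gives {0, 2} *)
Block[{n = 3}, f[{1, 5}, {2, 3}]]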

When this operation acts on the configurations it defines a group. Here’s the multiplication table for n = 3:

GraphicsGrid
op[{i_, u_}, {j_, v_}] := {Mod[i + j, n], BitXor[BitShiftRight[u, j], v]}

tmdec[{i_, u_}, n_, sz_ : Automatic] := 
 RulePlot[TuringMachine[2506], {{1, i + 1}, IntegerDigits[u, 2, n]}, 0, Mesh -> All, 
  Frame -> None, ImageSize -> sz]

Block[{n = 3}, 
  With[{elems = Catenate[Table[{i, u}, {u, 0, 2^n - 1}, {i, 0, n - 1}]]}, 
   Outer[op, elems, elems, 1]]];

GraphicsGrid[Map[tmdec[#, 3, 20] &, %, {2}], Frame -> All]

Or equivalently:

With
op[{i_, u_}, {j_, v_}] := {Mod[i + j, n], BitXor[BitShiftRight[u, j], v]}

cf = Blend[System`PlotThemeDump`$ThemeDefaultMatrix, #1] &;

With[{data = 
   Map[#[[1]] + 2 #[[2]] &, 
    Block[{n = 3}, 
     With[{elems = Catenate[Table[{i, u}, {u, 0, 2^n - 1}, {i, 0, n - 1}]]}, 
      Outer[op, elems, elems, 1]]], {2}]}, 
 {ArrayPlot[data, Axes -> None, ColorFunction -> cf, Mesh -> All, Frame -> None], 
  Show[ListPointPlot3D[Reverse[data], Axes -> None, ColorFunction -> (cf[#3] &), 
     DataRange -> All, BoxRatios -> 1, Lighting -> "Neutral", 
     PlotStyle -> EdgeForm[]] /. Point -> Cuboid, PlotRange -> All]}]

But now consider the four elements {1, 0}, {–1, 0}, {1, 1}, {–1, 1} (position –1 wraps around on the cyclic tape):

tmdec
tmdec[{i_, u_}, n_, sz_ : Automatic] := 
 RulePlot[TuringMachine[2506], {{1, i + 1}, IntegerDigits[u, 2, n]}, 0, Mesh -> All, 
  Frame -> None, ImageSize -> sz]

tmdec[#, 3, 50] & /@ {{1, 0}, {2, 0}, {1, 1}, {2, 1}}

We can consider these as the result of applying the 4 possible Turing machine transitions

ElementaryTuringGraphic
CloudGet[CloudObject[
   "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.wl"]];
tmg0[{s_, a_} -> {sp_, ap_, dir_}, stot_, k_] := 
  With[{cf = 
     Apply[Function, {Piecewise[
        Table[{Blend[{RGBColor[0.977, 0.952, 0.], RGBColor[0.965, 0.401, 0.18]}, 
           If[k <= 2, 1/2, i/(k - 2)]], # > i}, {i, k - 2, 0, -1}], White]}]}, 
   Graphics[{{White, Rectangle[Scaled[{0, 0}], Scaled[{1, 1}]]}, 
     {Directive[EdgeForm[Directive[GrayLevel[.6], AbsoluteThickness[0.7]]], cf[a]], 
      Rectangle[{1, 0}, {2, 1}]}, 
     {Directive[EdgeForm[Directive[GrayLevel[.6], AbsoluteThickness[0.7]]], cf[ap]], 
      Rectangle[{1, -5/4}, {2, -1/4}]}, 
     {NKSSpecialFunctions`RulePlot`Dump`TuringMarker[{1.5, 0.5}, {s, stot}], 
      NKSSpecialFunctions`RulePlot`Dump`TuringMarker[{1.5 + dir, -3/4}, {sp, stot}]}}, 
    PlotRange -> {{-1/2, 3 + 1/2}, {-7/4, 3/2}}]];
ElementaryTuringGraphic[rules_List, s_Integer : 2, k_Integer : 2, opts___] := 
  Table[NKSSpecialFunctions`RulePlot`Dump`AtlasGraphicsGrid[{{tmg0[r, s, k]}}, opts, 
    Frame -> True], {r, rules}];

With[{s = 1, k = 2}, 
 ElementaryTuringGraphic[
  Flatten[Table[DeltaTMRule[{si, ki}, {s, k}], {si, s}, {ki, 0, 0}]], 
  ImageSize -> 50]]

to the configuration {0, 0} (corresponding to the identity element of the group):

tmdec
tmdec[{i_, u_}, n_, sz_ : Automatic] := 
 RulePlot[TuringMachine[2506], {{1, i + 1}, IntegerDigits[u, 2, n]}, 0, Mesh -> All, 
  Frame -> None, ImageSize -> sz]

tmdec[{0, 0}, 3, 50]

And now by treating these elements as generators (and effectively applying them not just to the initial configuration, but to any configuration), we get as the Cayley graph of the group exactly the rulial multiway graph above.

It’s worth noting that the 4 elements we’ve used don’t correspond to the minimal set of generators for the group. Two elements suffice. And for example, we can use {1, 0} and {0, 1}, which can be thought of respectively as moving the head right, and flipping the color of one cell (again relative to the “identity” configuration {0, 0}):

tmdec
tmdec[{i_, u_}, n_, sz_ : Automatic] := 
 RulePlot[TuringMachine[2506], {{1, i + 1}, IntegerDigits[u, 2, n]}, 0, Mesh -> All, 
  Frame -> None, ImageSize -> sz]

tmdec[#, 3, 50] & /@ {{1, 0}, {0, 1}}

With these generators, we get the Cayley graph:

With
With[{n = 3}, 
 With[{g = 
    CayleyGraph[
     PermutationGroup[{Cycles[{{1, 2}}], 
       Cycles[{Range[1, 2 n - 1, 2], Range[2, 2 n, 2]}]}]]}, 
  Show[#, ImageSize -> 200] & /@ {g, Graph3D[g]}]]

How else can we characterize the group? For any tape size n we can write it in terms of explicit permutations:

PermutationGroup
PermutationGroup[{Cycles[{{1, 2}}], 
  Cycles[{Range[1, 2 n - 1, 2], Range[2, 2 n, 2]}]}]

(For n = 3, the group can be generated by the permutations (1 2) and (1 3 5)(2 4 6).)

We can also represent it symbolically in terms of generators and relations. Calling our “move-right” generator R, and “bit-flip” generator F, the group then satisfies at least the relations:

R^n = 1,   F^2 = 1,   F (R^-j F R^j) = (R^-j F R^j) F   (for j = 1, …, n – 1)
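
These relations can be checked directly in the permutation representation used above; here's a minimal sketch for n = 4, with F = Cycles[{{1, 2}}] and R the pair of interleaved n-cycles:

(* verify that R has order n, F has order 2, and F commutes with each conjugate R^-j F R^j *)
With[{n = 4}, 
 Module[{F = Cycles[{{1, 2}}], R = Cycles[{Range[1, 2 n - 1, 2], Range[2, 2 n, 2]}], 
   conj}, 
  conj[j_] := 
   PermutationProduct[InversePermutation[PermutationPower[R, j]], F, 
    PermutationPower[R, j]];
  {PermutationOrder[R] == n, PermutationOrder[F] == 2, 
   And @@ Table[
     PermutationProduct[F, conj[j]] === PermutationProduct[conj[j], F], {j, 1, n - 1}]}]]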

OK, so does the group have a name? For n = 2, it’s the 8-element dihedral group D4 and for n = 3, it’s the 24-element group A4 × ℤ2. For larger n, there doesn’t seem to be a standard name. But given our derivation we can just call it the Turing machine group for tape size n. And we can express it as a semidirect product (or wreath product):

(ℤ2)^n ⋊ ℤn

The normal subgroup (ℤ2)^n here represents states of the tape, and corresponds to a Boolean n-cube. The cyclic group ℤn represents the position of the head, and acts on the Boolean n-cube by rotating its coordinates.
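
As a minimal numerical check of this decomposition, the order of the permutation group generated by the two generators above is indeed n 2^n:

(* {n, order of the group generated by F and R, n 2^n} *)
Table[{n, GroupOrder[PermutationGroup[{Cycles[{{1, 2}}], 
     Cycles[{Range[1, 2 n - 1, 2], Range[2, 2 n, 2]}]}]], n 2^n}, {n, 2, 6}]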

For any n, we can use just two generators, producing the sequence of Cayley graphs:

Table
Table[CayleyGraph[
  PermutationGroup[{Cycles[{{1, 2}}], 
    Cycles[{Range[1, 2 n - 1, 2], Range[2, 2 n, 2]}]}]], {n, 2, 5}]

Undirected versions of these are exactly the cube-connected cycle graphs that have arisen in studying communications networks:

Prepend
Prepend[Table[GraphData[{"CubeConnectedCycle", n}], {n, 3, 5}], 
 CloudGet["https://wolfr.am/N9FGlZe8"]]
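
And as a spot check of this identification, here's a minimal sketch verifying that for n = 3 the simple undirected form of the Cayley graph is isomorphic to the corresponding cube-connected cycle graph:

(* compare the simplified undirected Cayley graph with GraphData's cube-connected cycle graph *)
With[{n = 3}, 
 IsomorphicGraphQ[
  SimpleGraph[UndirectedGraph[CayleyGraph[PermutationGroup[{Cycles[{{1, 2}}], 
      Cycles[{Range[1, 2 n - 1, 2], Range[2, 2 n, 2]}]}]]]], 
  GraphData[{"CubeConnectedCycle", 3}]]]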

So now what about the limit n → ∞? Now, the group is no longer finite, but we’ve still got the relations F^2 = 1 and F (R^-j F R^j) = (R^-j F R^j) F for every j,

and in the end, we can see that the group can be described as a semidirect product:

(ℤ2)^∞ ⋊ ℤ

In a sense, the group—and its Cayley graph—is dominated by the infinite-dimensional Boolean hypercube. But there’s more going on. And perhaps there’s a useful characterization of the limit that can be derived by the methods of modern geometric group theory.

For arbitrary s and k, we can potentially generalize to get:

(Thanks to Tali Beynon, Todd Rowland, Ed Pegg, Jose Martin-Garcia and Christopher Wolfram for trying to help me unscramble the fairly elementary group theory used here.)

Causal Graphs for Deterministic Turing Machines

In a deterministic Turing machine, every step involves one updating event—and the causal graph can be drawn by just joining successive locations of the head, and successive points where the head returns to a square where it has been before:

TMCausalPlot
CloudGet[CloudObject[
   "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/TMCausal.\
wl"]];
TMCausalPlot[2506, {{1, 10}, ConstantArray[0, 21]}, 20]
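
The TMCausal.wl file retrieved above packages this construction. Purely as an illustrative sketch of the idea (not the packaged implementation; tmCausalSketch is just a name introduced here), one can approximate the causal graph by linking each update event to the immediately preceding event, and to the most recent earlier event at the same tape position:

(* sketch: event i updates the cell at the head position of configuration i; link it to
   event i-1 (which fixed the head), and to the last earlier event at the same position
   (which fixed the cell value now being read) *)
tmCausalSketch[rule_, t_] := 
 Module[{evol, positions, lastAtPos = <||>, edges = {}}, 
  evol = TuringMachine[rule, {{1, t}, ConstantArray[0, 2 t + 1]}, t];
  positions = evol[[All, 1, 2]];
  Do[If[i > 1, AppendTo[edges, (i - 1) -> i]];
   If[KeyExistsQ[lastAtPos, positions[[i]]] && lastAtPos[positions[[i]]] != i - 1, 
    AppendTo[edges, lastAtPos[positions[[i]]] -> i]];
   lastAtPos[positions[[i]]] = i, {i, t}];
  Graph[edges]]

tmCausalSketch[2506, 50]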

Continuing for a few more steps, the causal graph for this particular Turing machine becomes:

With[{t = 40}
CloudGet[CloudObject[
   "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/TMCausal.\
wl"]];
With[{t = 40}, 
 Graph[Rule @@@ 
   TMCausalData[
    TuringMachine[2506, {{1, t}, ConstantArray[0, 2 t + 1]}, t]], 
  EdgeStyle -> 
   ResourceFunction["WolframPhysicsProjectStyleData"]["CausalGraph", 
    "EdgeStyle"], 
  VertexStyle -> 
   ResourceFunction["WolframPhysicsProjectStyleData"]["CausalGraph", 
    "VertexStyle"]]]

Continuing for more steps, and redrawing the graph, we see that we get a simple grid:

With[{t = 400}
CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/TMCausal.\
wl"]]; With[{t = 400}, 
 Graph[Rule @@@ 
   TMCausalData[
    TuringMachine[2506, {{1, t}, ConstantArray[0, 2 t + 1]}, t]], 
  EdgeStyle -> 
   ResourceFunction["WolframPhysicsProjectStyleData"]["CausalGraph", 
    "EdgeStyle"], 
  VertexStyle -> 
   ResourceFunction["WolframPhysicsProjectStyleData"]["CausalGraph", 
    "VertexStyle"]]]

Of s = 2, k = 2 Turing machines, the one with the most exotic causal graph is the “binary counter” machine 1953:

TMCausalPlot
CloudGet[CloudObject[
   "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/TMCausal.\
wl"]];
TMCausalPlot[1953, {{1, 10}, ConstantArray[0, 21]}, 20]
With[{t = 1000}
CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/TMCausal.\
wl"]]; With[{t = 1000}, 
 Graph[Rule @@@ 
   TMCausalData[
    TuringMachine[1953, {{1, t}, ConstantArray[0, 2 t + 1]}, t]], 
  EdgeStyle -> 
   ResourceFunction["WolframPhysicsProjectStyleData"]["CausalGraph", 
    "EdgeStyle"], 
  VertexStyle -> 
   ResourceFunction["WolframPhysicsProjectStyleData"]["CausalGraph", 
    "VertexStyle"]]]

The causal graph gives a good “overall map” of the behavior of a Turing machine. Here are a few Turing machines (with s = 3, k = 2 and s = 4, k = 2) from A New Kind of Science (compare also the s = 2, k = 3 universal Turing machine):

RulePlot
RulePlot[TuringMachine[#], {1, {{}, 0}}, 100, 
   ImageSize -> {Automatic, 400}] & /@ {{{1, 0} -> {3, 1, -1}, {1, 
     1} -> {2, 0, 1}, {2, 0} -> {1, 1, 1}, {2, 1} -> {3, 1, 1}, {3, 
     0} -> {2, 1, 1}, {3, 1} -> {1, 0, -1}}, {{1, 0} -> {3, 
     1, -1}, {1, 1} -> {2, 1, -1}, {2, 0} -> {1, 1, -1}, {2, 1} -> {4,
      1, 1}, {3, 0} -> {2, 1, 1}, {3, 1} -> {1, 0, -1}, {4, 0} -> {2, 
     1, -1}, {4, 1} -> {4, 0, 1}}, {{1, 0} -> {4, 0, 1}, {1, 1} -> {3,
      1, -1}, {2, 0} -> {1, 1, -1}, {2, 1} -> {1, 1, 1}, {3, 0} -> {1,
      1, 1}, {3, 1} -> {3, 0, -1}, {4, 0} -> {2, 1, 1}, {4, 1} -> {2, 
     1, 1}}, {{1, 0} -> {2, 1, -1}, {1, 1} -> {1, 0, 1}, {2, 0} -> {4,
      0, -1}, {2, 1} -> {3, 1, 1}, {3, 0} -> {4, 1, -1}, {3, 1} -> {1,
      1, 1}, {4, 0} -> {1, 1, 1}, {4, 1} -> {2, 0, -1}}}

And here are their respective causal graphs:

With[{t = 1000}
CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/TMCausal.\
wl"]]; With[{t = 1000}, 
   Graph[Rule @@@ 
     TMCausalData[
      TuringMachine[#, {{1, t}, ConstantArray[0, 2 t + 1]}, t]], 
    EdgeStyle -> 
     ResourceFunction["WolframPhysicsProjectStyleData"]["CausalGraph",
       "EdgeStyle"], 
    VertexStyle -> 
     ResourceFunction["WolframPhysicsProjectStyleData"]["CausalGraph",
       "VertexStyle"]]] & /@ {{{1, 0} -> {3, 1, -1}, {1, 1} -> {2, 0, 
     1}, {2, 0} -> {1, 1, 1}, {2, 1} -> {3, 1, 1}, {3, 0} -> {2, 1, 
     1}, {3, 1} -> {1, 0, -1}}, {{1, 0} -> {3, 1, -1}, {1, 1} -> {2, 
     1, -1}, {2, 0} -> {1, 1, -1}, {2, 1} -> {4, 1, 1}, {3, 0} -> {2, 
     1, 1}, {3, 1} -> {1, 0, -1}, {4, 0} -> {2, 1, -1}, {4, 1} -> {4, 
     0, 1}}, {{1, 0} -> {4, 0, 1}, {1, 1} -> {3, 1, -1}, {2, 0} -> {1,
      1, -1}, {2, 1} -> {1, 1, 1}, {3, 0} -> {1, 1, 1}, {3, 1} -> {3, 
     0, -1}, {4, 0} -> {2, 1, 1}, {4, 1} -> {2, 1, 1}}, {{1, 0} -> {2,
      1, -1}, {1, 1} -> {1, 0, 1}, {2, 0} -> {4, 0, -1}, {2, 1} -> {3,
      1, 1}, {3, 0} -> {4, 1, -1}, {3, 1} -> {1, 1, 1}, {4, 0} -> {1, 
     1, 1}, {4, 1} -> {2, 0, -1}}}

Rulial Multiway Causal Graphs

What can we say about the causal graph associated with a rulial multiway system? The first important observation is that rulial multiway graphs always exhibit causal invariance, since by including transitions associated with all possible rules one is inevitably including both rules and their inverses, with the result that every branching of edges in the rulial multiway graph is always associated with a corresponding merging.

It’s fairly easy to see this for two steps of s = 2, k = 2 Turing machines. The rulial multiway states graph here is:

LayeredGraphPlot
CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; LayeredGraphPlot[
 With[{t = 2}, 
  ResourceFunction["MultiwayTuringMachine"][
   AllDeltaTMRules[{2, 2}], {{1, t + 1, 0}, 
    ConstantArray[0, 2 t + 1]}, t, "StatesGraphStructure"]], 
 AspectRatio -> 1/2]

This can also be rendered:

With[{t = 2}
CloudGet[CloudObject[
   "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]];
With[{t = 2}, 
 ResourceFunction["MultiwayTuringMachine"][
  AllDeltaTMRules[{2, 2}], {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]},
   t, "StatesGraphStructure"]]

Explicitly showing events we get:

With[{t = 2}
CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; With[{t = 2}, 
 ResourceFunction["MultiwayTuringMachine"][
  AllDeltaTMRules[{2, 2}], {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]},
   t, "EvolutionEventsGraphStructure"]]

Including causal connections we get

With[{t = 2}
CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; With[{t = 2}, 
 ResourceFunction["MultiwayTuringMachine"][
  AllDeltaTMRules[{2, 2}], {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]},
   t, "EvolutionCausalGraphStructure"]]

or after 3 steps:

With[{t = 3}
CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; With[{t = 3}, 
 ResourceFunction["MultiwayTuringMachine"][
  AllDeltaTMRules[{2, 2}], {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]},
   t, "EvolutionCausalGraphStructure"]]

The pure rulial multiway causal graph in this case is then:

With[{t = 3}
CloudGet[CloudObject[
   "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]];
With[{t = 3}, 
 ResourceFunction["MultiwayTuringMachine"][
  AllDeltaTMRules[{2, 2}], {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]},
   t, "CausalGraphStructure"]]

In layered form this becomes:

With[{t = 3}
CloudGet[CloudObject[
   "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]];
With[{t = 3}, 
 LayeredGraphPlot[
  ResourceFunction["MultiwayTuringMachine"][
   AllDeltaTMRules[{2, 2}], {{1, t + 1, 0}, 
    ConstantArray[0, 2 t + 1]}, t, "CausalGraphStructure"], 
  AspectRatio -> 1/2]]

After 5 steps, the number of nodes reached going out from the root grows like:

ListLogPlot
Cell[CellGroupData[{
Cell[BoxData[
 RowBox[{
  RowBox[{"data", "=", 
   RowBox[{"\[LeftAssociation]", 
    RowBox[{
     RowBox[{
      RowBox[{"{", "1", "}"}], "\[Rule]", "1184"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{"1", ",", "9"}], "}"}], "\[Rule]", "400"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "104", ",", "344", ",", "656", ",", "1208",
         ",", "1728", ",", "1984"}], "}"}], "\[Rule]", "8"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "104", ",", "344", ",", "592", ",", "1112",
         ",", "1688", ",", "1984"}], "}"}], "\[Rule]", "8"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "104", ",", "288", ",", "576", ",", "1208",
         ",", "1728", ",", "1984"}], "}"}], "\[Rule]", "8"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "104", ",", "288", ",", "512", ",", "1112",
         ",", "1688", ",", "1984"}], "}"}], "\[Rule]", "8"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "72", ",", "160", ",", "344", ",", "656", 
        ",", "1208", ",", "1728", ",", "1984"}], "}"}], "\[Rule]", 
      "4"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "72", ",", "160", ",", "344", ",", "592", 
        ",", "1112", ",", "1688", ",", "1984"}], "}"}], "\[Rule]", 
      "4"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "72", ",", "160", ",", "288", ",", "576", 
        ",", "1208", ",", "1728", ",", "1984"}], "}"}], "\[Rule]", 
      "4"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "72", ",", "160", ",", "288", ",", "512", 
        ",", "1112", ",", "1688", ",", "1984"}], "}"}], "\[Rule]", 
      "4"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "136", ",", "560", ",", "1208", ",", 
        "1728", ",", "1984"}], "}"}], "\[Rule]", "8"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "136", ",", "496", ",", "1112", ",", 
        "1688", ",", "1984"}], "}"}], "\[Rule]", "8"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "72", ",", "464", ",", "1208", ",", "1728",
         ",", "1984"}], "}"}], "\[Rule]", "8"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "72", ",", "400", ",", "1112", ",", "1688",
         ",", "1984"}], "}"}], "\[Rule]", "8"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{"1", ",", "33", ",", "73"}], "}"}], "\[Rule]", "32"}], 
     ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{"1", ",", "25", ",", "73"}], "}"}], "\[Rule]", "32"}], 
     ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "184", ",", "800", ",", "1552", ",", 
        "1984"}], "}"}], "\[Rule]", "4"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "104", ",", "352", ",", "816", ",", "1528",
         ",", "1848", ",", "1984"}], "}"}], "\[Rule]", "16"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "104", ",", "712", ",", "1568", ",", 
        "1984"}], "}"}], "\[Rule]", "8"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "104", ",", "392", ",", "960", ",", "1680",
         ",", "1968", ",", "1984"}], "}"}], "\[Rule]", "8"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "104", ",", "456", ",", "1032", ",", 
        "1672", ",", "1928", ",", "1984"}], "}"}], "\[Rule]", "8"}], 
     ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "104", ",", "656", ",", "1496", ",", 
        "1984"}], "}"}], "\[Rule]", "4"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "69", ",", "285", ",", "720", ",", "1336", ",", 
        "1816", ",", "1984"}], "}"}], "\[Rule]", "4"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "69", ",", "285", ",", "696", ",", "1296", ",", 
        "1816", ",", "1984"}], "}"}], "\[Rule]", "4"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "45", ",", "253", ",", "696", ",", "1304", ",", 
        "1816", ",", "1984"}], "}"}], "\[Rule]", "4"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "45", ",", "253", ",", "672", ",", "1264", ",", 
        "1816", ",", "1984"}], "}"}], "\[Rule]", "4"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "105", ",", "344", ",", "656", ",", "1208",
         ",", "1728", ",", "1984"}], "}"}], "\[Rule]", "4"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "105", ",", "344", ",", "592", ",", "1112",
         ",", "1688", ",", "1984"}], "}"}], "\[Rule]", "4"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "185", ",", "800", ",", "1552", ",", 
        "1984"}], "}"}], "\[Rule]", "4"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "105", ",", "712", ",", "1568", ",", 
        "1984"}], "}"}], "\[Rule]", "8"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "105", ",", "656", ",", "1496", ",", 
        "1984"}], "}"}], "\[Rule]", "4"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "72", ",", "160", ",", "352", ",", "816", 
        ",", "1528", ",", "1848", ",", "1984"}], "}"}], "\[Rule]", 
      "8"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "136", ",", "640", ",", "1352", ",", 
        "1784", ",", "1984"}], "}"}], "\[Rule]", "8"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "72", ",", "480", ",", "1272", ",", "1776",
         ",", "1984"}], "}"}], "\[Rule]", "8"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "136", ",", "560", ",", "1264", ",", 
        "1800", ",", "1984"}], "}"}], "\[Rule]", "8"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "72", ",", "400", ",", "1176", ",", "1752",
         ",", "1984"}], "}"}], "\[Rule]", "8"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "72", ",", "160", ",", "392", ",", "960", 
        ",", "1680", ",", "1968", ",", "1984"}], "}"}], "\[Rule]", 
      "4"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "72", ",", "160", ",", "456", ",", "1032", 
        ",", "1672", ",", "1928", ",", "1984"}], "}"}], "\[Rule]", 
      "4"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "117", ",", "588", ",", "1376", ",", "1864", ",", 
        "1984"}], "}"}], "\[Rule]", "2"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "33", ",", "136", ",", "312", ",", "728", ",", 
        "1360", ",", "1792", ",", "1984"}], "}"}], "\[Rule]", "4"}], 
     ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "69", ",", "540", ",", "1384", ",", "1864", ",", 
        "1984"}], "}"}], "\[Rule]", "2"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "25", ",", "200", ",", "616", ",", "1272", ",", 
        "1808", ",", "1984"}], "}"}], "\[Rule]", "4"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "33", ",", "376", ",", "1048", ",", "1656", ",", 
        "1928", ",", "1984"}], "}"}], "\[Rule]", "4"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "33", ",", "136", ",", "312", ",", "648", ",", 
        "1272", ",", "1808", ",", "1984"}], "}"}], "\[Rule]", "4"}], 
     ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "25", ",", "344", ",", "1080", ",", "1712", ",", 
        "1928", ",", "1984"}], "}"}], "\[Rule]", "4"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "25", ",", "200", ",", "696", ",", "1360", ",", 
        "1792", ",", "1984"}], "}"}], "\[Rule]", "4"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "33", ",", "248", ",", "896", ",", "1624", ",", 
        "1968", ",", "1984"}], "}"}], "\[Rule]", "4"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "25", ",", "136", ",", "432", ",", "1016", ",", 
        "1656", ",", "1928", ",", "1984"}], "}"}], "\[Rule]", "4"}], 
     ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "33", ",", "312", ",", "976", ",", "1656", ",", 
        "1928", ",", "1984"}], "}"}], "\[Rule]", "4"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "25", ",", "136", ",", "368", ",", "936", ",", 
        "1624", ",", "1968", ",", "1984"}], "}"}], "\[Rule]", "4"}], 
     ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "104", ",", "352", ",", "760", ",", "1352",
         ",", "1784", ",", "1984"}], "}"}], "\[Rule]", "8"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "104", ",", "288", ",", "600", ",", "1272",
         ",", "1776", ",", "1984"}], "}"}], "\[Rule]", "8"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "104", ",", "352", ",", "680", ",", "1264",
         ",", "1800", ",", "1984"}], "}"}], "\[Rule]", "8"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "104", ",", "288", ",", "520", ",", "1176",
         ",", "1752", ",", "1984"}], "}"}], "\[Rule]", "8"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "69", ",", "349", ",", "856", ",", "1448", ",", 
        "1848", ",", "1984"}], "}"}], "\[Rule]", "8"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "45", ",", "253", ",", "744", ",", "1360", ",", 
        "1792", ",", "1984"}], "}"}], "\[Rule]", "4"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "105", ",", "352", ",", "760", ",", "1352",
         ",", "1784", ",", "1984"}], "}"}], "\[Rule]", "4"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "45", ",", "253", ",", "664", ",", "1272", ",", 
        "1808", ",", "1984"}], "}"}], "\[Rule]", "4"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "105", ",", "352", ",", "680", ",", "1264",
         ",", "1800", ",", "1984"}], "}"}], "\[Rule]", "4"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "72", ",", "160", ",", "352", ",", "760", 
        ",", "1352", ",", "1784", ",", "1984"}], "}"}], "\[Rule]", 
      "4"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "72", ",", "160", ",", "288", ",", "600", 
        ",", "1272", ",", "1776", ",", "1984"}], "}"}], "\[Rule]", 
      "4"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "72", ",", "160", ",", "352", ",", "680", 
        ",", "1264", ",", "1800", ",", "1984"}], "}"}], "\[Rule]", 
      "4"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "9", ",", "72", ",", "160", ",", "288", ",", "520", 
        ",", "1176", ",", "1752", ",", "1984"}], "}"}], "\[Rule]", 
      "4"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "33", ",", "312", ",", "968", ",", "1624", ",", 
        "1968", ",", "1984"}], "}"}], "\[Rule]", "2"}], ",", 
     RowBox[{
      RowBox[{"{", 
       RowBox[{
       "1", ",", "25", ",", "264", ",", "992", ",", "1696", ",", 
        "1968", ",", "1984"}], "}"}], "\[Rule]", "2"}]}], 
    "\[RightAssociation]"}]}], ";"}]], "Input"],

Cell[BoxData[
 RowBox[{"ListLogPlot", "[", 
  RowBox[{
   RowBox[{"Select", "[", 
    RowBox[{
     RowBox[{"Keys", "[", "data", "]"}], ",", 
     RowBox[{
      RowBox[{
       RowBox[{"Length", "[", "#", "]"}], "\[Equal]", "9"}], "&"}]}], 
    "]"}], ",", 
   RowBox[{"Joined", "\[Rule]", "True"}], ",", 
   RowBox[{"Frame", "\[Rule]", "True"}], ",", 
   RowBox[{"PlotStyle", "\[Rule]", 
    RowBox[{
     RowBox[{
     "ResourceFunction", "[", 
      "\"\<WolframPhysicsProjectStyleData\>\"", "]"}], "[", 
     RowBox[{"\"\<GenericLinePlot\>\"", ",", "\"\<PlotStyles\>\""}], 
     "]"}]}]}], "]"}]], "Input"]
}, Open  ]]

Causal invariance implies that this multiway causal graph is ultimately composed of a large number of interwoven copies of a single causal graph. After any given number of steps of evolution, the various copies of this causal graph will have “reached different stages”. Here are the results after 3 steps:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; Counts[
 IndexGraph[#, ImageSize -> Tiny] & /@ 
  With[{t = 3}, 
   ResourceFunction["MultiwayTuringMachine"][
    AllDeltaTMRules[{2, 2}], {{1, t + 1, 0}, Table[0, 2 t + 1]}, t, 
    "CausalGraphStructureInstances"]]]

And after 4 steps:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; Counts[
 IndexGraph[#, ImageSize -> Tiny] & /@ 
  With[{t = 4}, 
   ResourceFunction["MultiwayTuringMachine"][
    AllDeltaTMRules[{2, 2}], {{1, t + 1, 0}, Table[0, 2 t + 1]}, t, 
    "CausalGraphStructureInstances"]]]

One can also look at rulial multiway causal graphs for other sets of Turing machines. For s = 1, k = 1 one has (here after 5 steps)

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; LayeredGraphPlot[
 SimpleGraph[
  With[{t = 5}, 
   ResourceFunction["MultiwayTuringMachine"][
    AllDeltaTMRules[{1, 1}], {{1, t + 1, 0}, 
     ConstantArray[0, 2 t + 1]}, t, "CausalGraphStructure"]]]]

which is equivalent to:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; With[{t = 5}, 
 SimpleGraph[
  ResourceFunction["MultiwayTuringMachine"][
   AllDeltaTMRules[{1, 1}], {{1, t + 1, 0}, Table[0, 2 t + 1]}, t, 
   "CausalGraphStructure"]]]

The individual causal graphs in this case are immediately all the same, and are just:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; Counts[
 IndexGraph[#, ImageSize -> Tiny] & /@ 
  With[{t = 5}, 
   ResourceFunction["MultiwayTuringMachine"][
    AllDeltaTMRules[{1, 1}], {{1, t + 1, 0}, Table[0, 2 t + 1]}, t, 
    "CausalGraphStructureInstances"]]]

For s = 2, k = 1 one already has after 3 steps

CloudGet[CloudObject[
   "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]];
With[{t = 3}, 
 ResourceFunction["MultiwayTuringMachine"][
  AllDeltaTMRules[{2, 1}], {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]},
   t, "CausalGraphStructure"]]

or in layered form:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; LayeredGraphPlot[
 With[{t = 3}, 
  ResourceFunction["MultiwayTuringMachine"][
   AllDeltaTMRules[{2, 1}], {{1, t + 1, 0}, 
    ConstantArray[0, 2 t + 1]}, t, "CausalGraphStructure"]], 
 AspectRatio -> 1/3]

After 5 steps this becomes

CloudGet[CloudObject[
   "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]];
With[{t = 5}, 
 ResourceFunction["MultiwayTuringMachine"][
  AllDeltaTMRules[{2, 1}], {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]},
   t, "CausalGraphStructure"]]

or in layered form:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; LayeredGraphPlot[
 With[{t = 5}, 
  ResourceFunction["MultiwayTuringMachine"][
   AllDeltaTMRules[{2, 1}], {{1, t + 1, 0}, 
    ConstantArray[0, 2 t + 1]}, t, "CausalGraphStructure"]], 
 AspectRatio -> 1/3]

In this case, the individual causal graphs after 5 steps are:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; Counts[
 IndexGraph[#, ImageSize -> Tiny] & /@ 
  With[{t = 5}, 
   ResourceFunction["MultiwayTuringMachine"][
    AllDeltaTMRules[{2, 1}], {{1, t + 1, 0}, Table[0, 2 t + 1]}, t, 
    "CausalGraphStructureInstances"]]]

For s = 1, k = 2 one has very similar results to the case s = 2, k = 2. After 3 steps the causal graph is:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; With[{t = 3}, 
 ResourceFunction["MultiwayTuringMachine"][
  AllDeltaTMRules[{1, 2}], {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]},
   t, "CausalGraphStructure"]]

Or after 4 steps:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; With[{t = 4}, 
 ResourceFunction["MultiwayTuringMachine"][
  AllDeltaTMRules[{1, 2}], {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]},
   t, "CausalGraphStructure"]]

Removing multiple edges this is:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; SimpleGraph[
 With[{t = 4}, 
  ResourceFunction["MultiwayTuringMachine"][
   AllDeltaTMRules[{1, 2}], {{1, t + 1, 0}, 
    ConstantArray[0, 2 t + 1]}, t, "CausalGraphStructure"]]]

The individual causal graphs in this case are:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; Counts[
 IndexGraph[#, ImageSize -> Tiny] & /@ 
  With[{t = 5}, 
   ResourceFunction["MultiwayTuringMachine"][
    AllDeltaTMRules[{1, 2}], {{1, t + 1, 0}, Table[0, 2 t + 1]}, t, 
    "CausalGraphStructureInstances"]]]

Rulial Graphs

Just as for ordinary multiway graphs one can study branchial graphs which represent their transversals, so similarly for rulial multiway graphs one can study rulial graphs which represent their transversals. The layered way we have drawn multiway graphs corresponds to a particular choice of foliation—and with this choice, we can immediately generate rulial graphs.

For s = 2, k = 1 one has after 2 steps

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; With[{t = 2}, 
 Graph[ResourceFunction["MultiwayTuringMachine"][
   AllDeltaTMRules[{2, 1}], {{1, t + 1, 0}, 
    ConstantArray[0, 2 t + 1]}, t, "BranchialGraph", 
   PerformanceGoal -> "Quality", VertexSize -> .45 {1.15, .3}], 
  EdgeStyle -> 
   ResourceFunction["WolframPhysicsProjectStyleData"]["RulialGraph", 
    "EdgeStyle"]]]

while after 3 steps one gets:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; Framed[
 With[{t = 3}, 
  Graph[ResourceFunction["MultiwayTuringMachine"][
    AllDeltaTMRules[{2, 1}], {{1, t + 1, 0}, Table[0, 2 t + 1]}, t, 
    "BranchialGraph"], PerformanceGoal -> "Quality", 
   VertexSize -> .8 {1, 1/(2 t + 1)}, 
   EdgeStyle -> 
    ResourceFunction["WolframPhysicsProjectStyleData"]["RulialGraph", 
     "EdgeStyle"]]], FrameStyle -> LightGray]

For s = 1, k = 2 one gets after 2 steps:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; Framed[
 With[{t = 2}, 
  Graph[ResourceFunction["MultiwayTuringMachine"][
    AllDeltaTMRules[{1, 2}], {{1, t + 1, 0}, Table[0, 2 t + 1]}, t, 
    "BranchialGraph"], PerformanceGoal -> "Quality", 
   VertexSize -> .45 {1.15, .3}, 
   EdgeStyle -> 
    ResourceFunction["WolframPhysicsProjectStyleData"]["RulialGraph", 
     "EdgeStyle"]]], FrameStyle -> LightGray]

while after 3 steps one has:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; Framed[
 With[{t = 3}, 
  Graph[ResourceFunction["MultiwayTuringMachine"][
    AllDeltaTMRules[{1, 2}], {{1, t + 1, 0}, Table[0, 2 t + 1]}, t, 
    "BranchialGraph"], PerformanceGoal -> "Quality", 
   VertexSize -> .5 {1.15, .3}, 
   EdgeStyle -> 
    ResourceFunction["WolframPhysicsProjectStyleData"]["RulialGraph", 
     "EdgeStyle"]]], FrameStyle -> LightGray]

The sequence of results for steps 1 through 4 is:

Table[Framed[
  Graph[ResourceFunction["MultiwayTuringMachine"][
    AllDeltaTMRules[{1, 2}], {{1, t + 1, 0}, Table[0, 2 t + 1]}, t, 
    "BranchialGraphStructure"], 
   EdgeStyle -> 
    ResourceFunction["WolframPhysicsProjectStyleData"]["RulialGraph", 
     "EdgeStyle"]], FrameStyle -> LightGray], {t, 1, 4}]

For s = 2, k = 2 the corresponding results are:

Table[Framed[
  Graph[ResourceFunction["MultiwayTuringMachine"][
    AllDeltaTMRules[{2, 2}], {{1, t + 1, 0}, Table[0, 2 t + 1]}, t, 
    "BranchialGraphStructure"], 
   EdgeStyle -> 
    ResourceFunction["WolframPhysicsProjectStyleData"]["RulialGraph", 
     "EdgeStyle"]], FrameStyle -> LightGray], {t, 1, 4}]

What do these pictures mean? Just as branchial graphs in ordinary multiway systems can be thought of as “entanglement maps” for states (interpreted in our models as quantum states) in ordinary “multiway space”, so here these rulial graphs can be thought of as “entanglement maps” for states in rulial space. In other words, they are a kind of map of which Turing machine configurations are “evolutionarily close” to which other ones, in the sense that they can be reached with only a few different choices of rules.
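
One can make this notion of “evolutionary closeness” a little more quantitative. Here is a small added sketch (mine, not from the original bulletin), again reusing the DeltaTM.wl setup, that measures the size of a rulial graph and the largest graph distance from one of its configurations, here for the 2-step s = 2, k = 1 case:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.wl"]];
With[{t = 2},
 With[{rg = ResourceFunction["MultiwayTuringMachine"][
     AllDeltaTMRules[{2, 1}], {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]}, t,
     "BranchialGraphStructure"]},
  (* configurations at small graph distance in the rulial graph can be reached
     with only slightly different choices of rules *)
  {VertexCount[rg], Max[GraphDistance[rg, First[VertexList[rg]]]]}]]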

Deterministic Turing Machine Paths in Rulial Space

The rulial multiway graph defines all paths that can be followed by all non-deterministic Turing machines. At each node in this graph there is therefore an outgoing edge corresponding to any possible Turing machine transition from the configuration corresponding to that node. But what if one considers just a single deterministic Turing machine? Its evolution is then a single path within the rulial multiway graph.

Consider for example the s = 2, k = 2 Turing machine:

RulePlot[TuringMachine[2506]]

This machine evolves from a blank tape according to:

RulePlot[TuringMachine[2506], {1, {{}, 0}}, 10, Mesh -> True, 
 Frame -> None]

This corresponds to a path in the rulial multiway graph:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; With[{t = 4}, 
 HighlightGraph[
  ResourceFunction["MultiwayTuringMachine"][
   AllDeltaTMRules[{2, 2}], {{1, t + 1, 0}, 
    ConstantArray[0, 2 t + 1]}, t, "StatesGraphStructure", 
   VertexSize -> 1], 
  Style[PathGraph[
    Rule @@@ 
     Partition[
      ToString /@ 
        TuringMachine[
         2506, {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]}, t], 2, 
      1]], Thickness[0.01], Red]]]

With a different initial condition

RulePlot[TuringMachine[2506], {1, {{1}, 0}}, 10, Mesh -> True, 
 Frame -> None]

the path can be very different:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; With[{t = 4}, 
 HighlightGraph[
  ResourceFunction["MultiwayTuringMachine"][
   AllDeltaTMRules[{2, 2}], {{1, t + 1, 0}, 
    ConstantArray[0, 2 t + 1]}, t, "StatesGraphStructure", 
   VertexSize -> 1], 
  Style[PathGraph[
    Rule @@@ 
     Partition[
      ToString /@ 
       TuringMachine[
        2506, {{1, t + 1, 0}, 
         ReplacePart[ConstantArray[0, 2 t + 1], t + 1 -> 1]}, t], 2, 
      1]], Thickness[0.01], Red]]]

These pictures might make it seem that the paths corresponding to the evolution of deterministic Turing machines are geodesics in the rulial multiway graph. But in general they are definitely not. For example, starting from a blank tape, the rule

RulePlot[TuringMachine[{{1, 0} -> {2, 0, -1}, {1, 1} -> {1, 
     0, -1}, {2, 0} -> {1, 1, 1}, {2, 1} -> {1, 0, -1}}]]

follows this path on the rulial multiway graph:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; With[{t = 4}, 
 HighlightGraph[
  ResourceFunction["MultiwayTuringMachine"][
   AllDeltaTMRules[{2, 2}], {{1, t + 1, 0}, 
    ConstantArray[0, 2 t + 1]}, t, "StatesGraphStructure", 
   VertexSize -> 1], 
  Style[PathGraph[
    Rule @@@ 
     Partition[
      ToString /@ 
       TuringMachine[{{1, 0} -> {2, 0, -1}, {1, 1} -> {1, 0, -1}, {2, 
           0} -> {1, 1, 1}, {2, 1} -> {1, 0, -1}}, {{1, t + 1, 0}, 
         ConstantArray[0, 2 t + 1]}, t], 2, 1]], Thickness[0.01], 
   Red]]]

If one allows any possible Turing machine transition at every step, then one can follow a geodesic path from a node corresponding to an initial condition to a node corresponding to any other configuration. But if one restricts oneself to the transitions of a particular (deterministic) Turing machine, then there will in general be many configurations one will never reach, and even the ones one can reach may be reachable only by a circuitous route in the rulial multiway graph.

Given a particular configuration (corresponding to a node in the rulial multiway graph), there may be many deterministic Turing machines that can reach it from a given initial state. One can consider each of these Turing machines to be implementing a certain algorithm. The “optimal algorithm” is then the one corresponding to the shortest of these deterministic Turing machine paths. As I just mentioned, this won’t typically be the shortest possible path: that will usually be achieved by a non-deterministic Turing machine with a particular sequence of transitions. But there is still an optimal case (i.e. an “optimal algorithm”) among deterministic Turing machines.
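
Here is a small added sketch (mine, not from the original bulletin) that makes this comparison concrete for the machine 2506 discussed above. It assumes, as the highlighting code above does, that the vertices of the states graph are the ToString forms of the configurations, and it compares the number of deterministic steps taken to reach the machine’s 4-step configuration with the geodesic distance to that configuration in the rulial multiway states graph (the geodesic distance can never be larger):

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.wl"]];
With[{t = 4},
 Module[{g, path},
  g = ResourceFunction["MultiwayTuringMachine"][
    AllDeltaTMRules[{2, 2}], {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]}, t,
    "StatesGraphStructure"];
  (* the deterministic blank-tape evolution of machine 2506, as vertex names in g *)
  path = ToString /@ 
    TuringMachine[2506, {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]}, t];
  {Length[path] - 1, GraphDistance[g, First[path], Last[path]]}]]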

But now we can ask where, across all possible deterministic Turing machines, the evolution can reach in the rulial multiway graph. Here is the result after 4 steps for the s = 2, k = 2 Turing machines we have been considering:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; With[{t = 4}, 
 With[{g = 
    ResourceFunction["MultiwayTuringMachine"][
     AllDeltaTMRules[{2, 2}], {{1, t + 1, 0}, 
      ConstantArray[0, 2 t + 1]}, t, "StatesGraphStructure", 
     VertexSize -> 1]}, 
  HighlightGraph[g, 
   Style[PathGraph[
       Rule @@@ 
        Partition[
         ToString /@ 
          TuringMachine[#, {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]},
            t], 2, 1]], Thick, Red] & /@ Range[0, 4095]]]]

These are the results for 1, 2 and 3 steps:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; ParallelTable[
 With[{t = 4}, 
  With[{g = 
     ResourceFunction["MultiwayTuringMachine"][
      AllDeltaTMRules[{2, 2}], {{1, t + 1, 0}, 
       ConstantArray[0, 2 t + 1]}, t, "StatesGraphStructure", 
      EdgeStyle -> 
       Directive[
        ResourceFunction["WolframPhysicsProjectStyleData"][
          "StatesGraph"]["EdgeStyle"], Opacity[0.4]], 
      VertexSize -> 1]}, 
   HighlightGraph[g, 
    Style[PathGraph[
        Rule @@@ 
         Partition[
          ToString /@ 
           TuringMachine[#, {{1, t + 1, 0}, 
             ConstantArray[0, 2 t + 1]}, tt], 2, 1]], Thick, 
       Append[Red, 1]] & /@ Range[0, 4095]]]], {tt, 3}]

What is the significance of this? Essentially what we’re seeing is a comparison of what can be achieved with deterministic computation versus non-deterministic. Taking the 4-step case as an example, the “background” gray rulial multiway graph shows what can be achieved with arbitrary non-deterministic computation in 4 steps. The red region is what deterministic computation can achieve in the same number of steps.

In a sense this is a very simple empirical analog of the P vs. NP problem. Unlike the real P vs. NP case, we’re not allowing arbitrary polynomial-time algorithms here; we’re just looking at possible s = 2, k = 2 Turing machine algorithms running specifically for 4 steps. But if we were to generalize this appropriately, P = NP would imply that the “red region” must in some limit in effect “reach anywhere in the graph”.

A little more precisely, the official definition of P and NP is for decision problems: you start from some initial condition which defines an instance of a problem (“Is this Boolean formula satisfiable?” or whatever), and the system must eventually evolve to a state representing either “yes” or “no”. We can imagine setting things up so that the outcomes correspond to particular configurations of the Turing machine. The inputs are then also encoded as initial configurations of the Turing machine, and we want to know what happens as we consider inputs of progressively larger sizes. We can imagine drawing the rulial multiway graph so that progressively larger inputs are shown “progressively further from the center”. If we consider non-deterministic Turing machines (associated with the class of NP computations), then the shortest computation will be a geodesic path in the rulial multiway graph from the input configuration to the outcome configuration. But for deterministic Turing machines (associated with the class P) it will in general be some much more circuitous path.

The standard P vs. NP problem asks about limiting behavior as one increases the size of the input computation—and one might imagine that the question could be “geometrized” in terms of some continuum limit of the rulial multiway graph. Of course, there is no guarantee that any reasonable limit exists. And it could perfectly well be that the question of whether P ≠ NP is actually undecidable, or in other words, that no finite proof of it can be given within a standard axiom system (such as Peano arithmetic or ZFC set theory).

One could imagine empirically testing more and more Turing machines, and seeing how they perform on an NP-complete problem. One might think of plotting their running times as a function of n for increasingly large n. For a while a particular Turing machine might be the winner. But then another one might take over. And there might be no end to how many “surprises” would occur as one increases n. (Somehow this is reminiscent of the story of the Skewes number and whether LogIntegral[n] > PrimePi[n].)

In computational complexity theory one usually thinks about explicitly constructing optimal algorithms by standard “engineering-like” methods. But I think there’s a lot to be learned from a more empirical approach—in which one doesn’t try to construct optimal algorithms, but just finds them “in the wild” by searching all possible programs. In the past, it might not have seemed that “just searching for programs” would ever produce anything terribly interesting. But one of the big consequences of what I discussed in A New Kind of Science is that even among tiny programs—small enough that one can, for example, enumerate all of them—there’s often very complex and potentially “useful” behavior. And that makes it seem much more reasonable to try to do “empirical computational complexity theory”—enumerating possible programs to find optimal ones, and so on.

Small programs can be thought of as ones with low algorithmic complexity. So searching for “fast programs” among these can be thought of as searching for programs with low time complexity, and low algorithmic complexity. We don’t know exactly how strong the constraint of low algorithmic complexity is, but from what I’ve seen in the computational universe (and what’s embodied in things like the Principle of Computational Equivalence), it seems as if it’s not such a big constraint.

I studied “empirical computational complexity theory” a bit in A New Kind of Science, notably for Turing machines. And one of the interesting observations was that the optimal algorithm for things was often not something that could readily be constructed in a step-by-step engineering way. Instead, it was something that one basically could only find pretty much by doing a search of possible programs—and where there usually didn’t seem to be a “general form” that would “apply for all n”, and let one readily deduce the properties of the n → ∞ limit. In other words, it didn’t seem like what we’d now think of as the sequence of paths in rulial space would show any sign of converging to a “continuum limit”.

A decent example of all this occurs in sorting networks. Imagine that you are given a collection of n numbers to sort. You can straightforwardly do this by making about n² pairwise comparisons, and it’s pretty easy to optimize this a bit.
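
As a small added illustration (mine, not from the bulletin; bubbleNetwork and applyNetwork are just illustrative helper names), here is the classic “bubble sort” comparison network, which uses n (n - 1)/2 comparators and sorts any input:

(* the classic bubble-sort comparison network: n (n - 1)/2 comparators *)
bubbleNetwork[n_] := Flatten[Table[{i, i + 1}, {pass, n - 1}, {i, 1, n - pass}], 1];

(* apply a comparison network: each comparator swaps its pair if out of order *)
applyNetwork[net_, list_] := Fold[
   Function[{l, pair}, If[l[[pair[[1]]]] > l[[pair[[2]]]],
     ReplacePart[l, {pair[[1]] -> l[[pair[[2]]]], pair[[2]] -> l[[pair[[1]]]]}], l]],
   list, net];

{Length[bubbleNetwork[8]], applyNetwork[bubbleNetwork[8], RandomSample[Range[8]]]}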

But explicit searches have revealed that the best networks found so far for successive n are:

CloudGet[CloudObject[
   "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/SortingNetworks.wl"]];
NetworkGraphics[net_, seq_, opts___] := 
  Module[{n = Max[net], net1 = Flatten[net, 1], len},
   len = Length[net1];
   Graphics[{GrayLevel[0.8], 
     Map[Reverse, 
      If[seq === None, 
       Table[Line[{{0, y}, {len + 1, y}}], {y, n}], 
       MapIndexed[{Thickness[0.1/Length[First[seq]]], 
          ColorData["Rainbow"][#[[1]]/n], 
          Line[{{#[[2, 1]], #2[[1]]}, {#[[2, 2]], #2[[1]]}}]} &, 
        Map[Function[x, 
          With[{lengths = Length /@ Split[x]}, 
           {#[[1]], {#[[3]] - #[[2]], #[[3]]}} & /@ 
            Thread[{Map[First, Split[x]], lengths, 
              Accumulate[lengths]}]]], Transpose[seq]], {-3}]]], 
     {Black, 
      If[seq =!= None, Thickness[0.1/Length[First[seq]]], {}], 
      MapIndexed[
       Line[{{#2[[1]], #1[[1]]}, {#2[[1]], #1[[2]]}}] &, net1, {-2}]}}, 
    opts, PlotRange -> All, ImageSize -> 450, AspectRatio -> 1/2]];

Table[Show[NetworkGraphics[OptimalSort[n], None], 
  ImageSize -> {Automatic, 70}], {n, 4, 16}]

What’s notable is how complicated and “random” they look; there doesn’t seem to be any obvious pattern to them (and my guess is that there fundamentally isn’t). Here’s a plot of the sizes of these networks (divided by n²):

ListLinePlot[{1, 3, 5, 9, 12, 16, 19, 25, 29, 35, 39, 45, 51, 56, 60}/
  Range[2, 16]^2, AspectRatio -> 1/3, DataRange -> {2, 16}, 
 Filling -> Axis, Frame -> True, Mesh -> All, MeshStyle -> Tiny, 
 PlotRange -> {{1.5, 16.5}, Automatic}, 
 PlotStyle -> 
  ResourceFunction["WolframPhysicsProjectStyleData"][
   "GenericLinePlot", "PlotStyles"]]

It’s worth noting that these are deterministic networks. One could also imagine non-deterministic networks (and indeed one could construct a rulial multiway graph by considering all possible successive placements of pairwise comparisons)—and in a non-deterministic network it’s always possible to sort n numbers in at most n – 1 steps.
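
Here is a quick added check of that count (mine, not from the bulletin; transpositionCount is just an illustrative helper): decomposing a random permutation into transpositions via its cycles always needs at most n - 1 of them:

(* number of transpositions needed for a permutation: sum of (cycle length - 1) over its cycles *)
transpositionCount[perm_] := 
  Total[(Length[#] - 1) & /@ First[PermutationCycles[perm]]];

With[{n = 10}, Table[transpositionCount[RandomSample[Range[n]]] <= n - 1, {5}]]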

The Space of Deterministic Turing Machine Computations

We’ve just seen how the results of deterministic Turing machine computations are laid out in the rulial multiway space of all possible non-deterministic Turing machine computations. But what happens if we just look at the graph of deterministic Turing machine computations on its own?

Here are the full rulial multiway graphs for 2 and 3 steps with the graphs of deterministic Turing machine computations superimposed, as before:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; Table[
 With[{g = 
    ResourceFunction["MultiwayTuringMachine"][
     AllDeltaTMRules[{2, 2}], {{1, t + 1, 0}, 
      ConstantArray[0, 2 t + 1]}, t, "StatesGraphStructure"]}, 
  HighlightGraph[g, 
   Style[PathGraph[
       Rule @@@ 
        Partition[
         ToString /@ 
          TuringMachine[#, {{1, t + 1, 0}, Table[0, 2 t + 1]}, t], 2, 
         1]], Thick, Red] & /@ Range[0, 4095]]], {t, 2, 3}]

But now let’s “pull out just the red subgraphs”—in other words, include as nodes only those configurations that at least one of the 4096 s = 2, k = 2 Turing machines can reach after 2 or 3 steps:

Table[SimpleGraph[
  Flatten@Table[
    DirectedEdge @@@ 
     Partition[
      TuringMachine[tm, {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]}, 
       t], 2, 1], {tm, 0, 4095}], ImageSize -> 300, 
  EdgeStyle -> RGBColor[0.92, 0.17, 0.24]], {t, 2, 3}]

Notice that after 2 steps, deterministic Turing machines can still reach all 36 configurations that non-deterministic ones can reach (though not through quite as many paths). But after 3 steps, the deterministic Turing machines can collectively reach only 68 possible configurations, while non-deterministic ones can reach 100.
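
These counts can be checked directly. Here is a small added sketch (mine, not from the bulletin), reusing the DeltaTM.wl setup, that compares, for t = 2 and t = 3, the number of configurations reached by the 4096 deterministic machines with the number of vertices of the corresponding rulial multiway states graph:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.wl"]];
Table[{t,
  (* configurations reached by some deterministic s = 2, k = 2 machine in t steps *)
  VertexCount[SimpleGraph[Flatten@Table[DirectedEdge @@@ Partition[
       TuringMachine[tm, {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]}, t], 2, 1],
      {tm, 0, 4095}]]],
  (* configurations reached non-deterministically in t steps *)
  VertexCount[ResourceFunction["MultiwayTuringMachine"][
    AllDeltaTMRules[{2, 2}], {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]}, t,
    "StatesGraphStructure"]]}, {t, 2, 3}]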

For all possible non-deterministic Turing machines with s = 2, k = 2 the total number of configurations that can be reached eventually roughly doubles on successive steps:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; ParallelTable[
 VertexCount[
  ResourceFunction["MultiwayTuringMachine"][
   AllDeltaTMRules[{2, 2}], {{1, t + 1, 0}, 
    ConstantArray[0, 2 t + 1]}, t, "StatesGraphStructure"]], {t, 0, 
  10}]

For deterministic Turing machines, however, the number of configurations that can be reached soon starts growing much more slowly:

ParallelTable[
 If[t == 0, 1, 
  VertexCount[
   Graph[Flatten@
     Table[DirectedEdge @@@ 
       Partition[
        TuringMachine[tm, {{1, t + 1, 0}, Table[0, 2 t + 1]}, t], 2, 
        1], {tm, 0, 4095}]]]], {t, 0, 15}]

And in fact there’s an obvious bound here: at any given step, the most that can happen is that each of the 4096 s = 2, k = 2 Turing machines reaches a new configuration, so the number of configurations reached can increase by at most 4096 per step.
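
For reference, 4096 is just the total number of s = 2, k = 2 machines: each of the s k rule cases picks among 2 s k possible combinations of new state, new color and head direction, giving (2 s k)^(s k) rules in all. A one-line check (my addition):

(* total number of s-state, k-color Turing machine rules: (2 s k)^(s k) *)
With[{s = 2, k = 2}, (2 s k)^(s k)]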

Looking at the differences on successive steps, we find:

data = Block[{t = 200, rr},
   rr = Table[
     Rule @@@ 
      Partition[
       TuringMachine[tm, {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]}, 
        t], 2, 1], {tm, 0, 4095}];
   Table[VertexCount[Flatten[Take[rr, All, u]]], {u, t}]];

data = {9, 36, 68, 94, 144, 248, 322, 382, 458, 559, 659, 737, 823, 
   921, 1007, 1091, 1189, 1303, 1399, 1483, 1575, 1689, 1793, 1870, 
   1950, 2050, 2146, 2234, 2340, 2456, 2550, 2638, 2730, 2840, 2948, 
   3036, 3128, 3238, 3332, 3417, 3515, 3629, 3727, 3807, 3901, 4017, 
   4123, 4205, 4289, 4395, 4493, 4583, 4689, 4799, 4889, 4969, 5055, 
   5165, 5275, 5368, 5460, 5568, 5658, 5738, 5838, 5954, 6052, 6134, 
   6228, 6346, 6454, 6546, 6638, 6746, 6846, 6934, 7040, 7156, 7250, 
   7330, 7416, 7526, 7638, 7725, 7817, 7927, 8017, 8103, 8205, 8321, 
   8419, 8507, 8605, 8717, 8823, 8913, 9001, 9105, 9199, 9285, 9387, 
   9509, 9605, 9683, 9773, 9889, 10003, 10093, 10187, 10297, 10389, 
   10476, 10576, 10682, 10774, 10856, 10946, 11052, 11152, 11236, 
   11322, 11430, 11530, 11618, 11726, 11842, 11936, 12018, 12108, 
   12224, 12336, 12428, 12524, 12634, 12730, 12818, 12920, 13034, 
   13132, 13216, 13308, 13426, 13532, 13615, 13701, 13807, 13903, 
   13991, 14097, 14213, 14305, 14391, 14483, 14593, 14703, 14793, 
   14887, 14999, 15095, 15185, 15289, 15407, 15509, 15591, 15685, 
   15799, 15903, 15987, 16073, 16177, 16275, 16369, 16479, 16593, 
   16687, 16771, 16863, 16977, 17089, 17180, 17272, 17382, 17474, 
   17554, 17654, 17770, 17868, 17948, 18042, 18160, 18268, 18360, 
   18450, 18552, 18646, 18734, 18838, 18954, 19048, 19134};
ListLinePlot[Differences[data], Frame -> True, AspectRatio -> 1/3, 
 Filling -> Axis, 
 PlotStyle -> 
  ResourceFunction["WolframPhysicsProjectStyleData"]["GenericLinePlot", 
   "PlotStyles"]]

In other words, among all 4096 Turing machines, about 100 “novel configurations” are reached at each successive step. (The actual sequence here looks surprisingly random; it’s not clear whether there’s any particular regularity.)
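
Using the cached data list above, one simple added summary number (mine) is the average count of new configurations per step over the 200 steps computed, which indeed comes out close to 100:

(* average per-step growth of the deterministically reachable set, from the data above *)
N[Mean[Differences[data]]]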

Now let’s look at the actual graphs formed. With 4 through 7 steps we get:

Table[SimpleGraph[
  Flatten@Table[
    DirectedEdge @@@ 
     Partition[
      TuringMachine[tm, {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]}, 
       t], 2, 1], {tm, 0, 4095}], 
  EdgeStyle -> RGBColor[0.92, 0.17, 0.24]], {t, 4, 7}]

After 10 and 20 steps the results are:

ParallelTable[
 SimpleGraph[
  Flatten@Table[
    DirectedEdge @@@ 
     Partition[
      TuringMachine[tm, {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]}, 
       t], 2, 1], {tm, 0, 4095}], 
  EdgeStyle -> RGBColor[0.92, 0.17, 0.24], 
  VertexStyle -> Darker[RGBColor[0.92, 0.17, 0.24], 0.4], 
  ImageSize -> 250], {t, {10, 20}}]

Here is the result after 50 steps:

big = SimpleGraph[
  With[{t = 50}, 
   Flatten@Table[
     DirectedEdge @@@ 
      Partition[
       TuringMachine[tm, {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]}, 
        t], 2, 1], {tm, 0, 4095}]], 
  EdgeStyle -> RGBColor[0.92, 0.17, 0.24], 
  VertexStyle -> Darker[RGBColor[0.92, 0.17, 0.24], 0.4], 
  VertexSize -> Tiny, 
  GraphLayout -> {"SpringElectricalEmbedding", 
    "InferentialDistance" -> 200, "StepControl" -> "NonMonotonic"}]

There’s a surprising amount of structure in these graphs. There’s a “central region” near the initial blank-tape configuration (shown highlighted below) in which many different Turing machines end up visiting the same configurations:

With[{big = 
   SimpleGraph[
    With[{t = 50}, 
     Flatten@Table[
       DirectedEdge @@@ 
        Partition[
         TuringMachine[tm, {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]},
           t], 2, 1], {tm, 0, 4095}]], 
    EdgeStyle -> RGBColor[0.92, 0.17, 0.24], 
    VertexStyle -> Darker[RGBColor[0.92, 0.17, 0.24], 0.4], 
    VertexSize -> Tiny, 
    GraphLayout -> {"SpringElectricalEmbedding", 
      "InferentialDistance" -> 200, 
      "StepControl" -> "NonMonotonic"}]}, 
 HighlightGraph[big, Style[First[VertexList[big]], PointSize[0.04]], 
  PlotRange -> {{55, 85}, {50, 80}}]]

Here’s a 3D rendering of this region:

With[{big = 
   SimpleGraph[
    With[{t = 50}, 
     Flatten@Table[
       DirectedEdge @@@ 
        Partition[
         TuringMachine[tm, {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]},
           t], 2, 1], {tm, 0, 4095}]], 
    EdgeStyle -> RGBColor[0.92, 0.17, 0.24], 
    VertexStyle -> Darker[RGBColor[0.92, 0.17, 0.24], 0.4], 
    VertexSize -> Tiny, 
    GraphLayout -> {"SpringElectricalEmbedding", 
      "InferentialDistance" -> 200, 
      "StepControl" -> "NonMonotonic"}]}, 
 Graph3D[big, 
  PlotRange -> {{50.32304, 70.32304}, {43.796, 63.796}, {30.87617, 
     50.87617}}, Boxed -> True]]

But away from this region there end up being “spokes” (about 100 of them) corresponding to Turing machines that “independently explore new territory” in the space of configurations.
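
One rough added way to count these spokes (a sketch of mine, not from the bulletin) is to count the “tips” of the 50-step graph, i.e. the configurations with no outgoing edges:

(* count configurations in the 50-step deterministic graph with no outgoing edges *)
With[{t = 50},
 With[{g = SimpleGraph[Flatten@Table[DirectedEdge @@@ Partition[
        TuringMachine[tm, {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]}, t], 2, 1],
       {tm, 0, 4095}]]},
  Length[Select[VertexList[g], VertexOutDegree[g, #] == 0 &]]]]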

What are those “configurations on the edge” like? Here are sorted collections of them for the first few steps:

Table[Labeled[
  With[{g = 
     SimpleGraph[
      Flatten@Table[
        DirectedEdge @@@ 
         Partition[
          TuringMachine[
           tm, {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]}, t], 2, 
          1], {tm, 0, 4095}]]}, 
   RulePlot[TuringMachine[2506], 
    Sort[Select[VertexList[g], VertexOutDegree[g, #] == 0 &]], 
    Mesh -> All, Frame -> None]], Style[Text[t], 10]], {t, 2, 5}]

For comparison, here is the result for all configurations that can be reached by non-deterministic Turing machines after just 2 steps:

CloudGet[CloudObject[
  "https://www.wolframcloud.com/obj/wolframphysics/Bulletin/DeltaTM.\
wl"]]; RulePlot[TuringMachine[2506], 
 With[{t = 2}, 
  ToExpression /@ 
   VertexList[
    ResourceFunction["MultiwayTuringMachine"][
     AllDeltaTMRules[{2, 2}], {{1, t + 1, 0}, 
      ConstantArray[0, 2 t + 1]}, t, "StatesGraphStructure"]]], 
 Mesh -> All, Frame -> None]

Here are the results for deterministic Turing machines after more steps:

ParallelTable[
 Labeled[With[{g = 
     SimpleGraph[
      Flatten@Table[
        DirectedEdge @@@ 
         Partition[
          TuringMachine[
           tm, {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]}, t], 2, 
          1], {tm, 0, 4095}]]}, 
   RulePlot[TuringMachine[2506], 
    Sort[Select[VertexList[g], VertexOutDegree[g, #] == 0 &]]]], 
  Style[Text[t], 10]], {t, 10, 50, 10}]

We can also ask which machines are the ones that typically “explore new territory”. Here’s the result for 30 steps:

With[{t = 30}, 
 Module[{g = 
    SimpleGraph[
     Flatten@Table[
       DirectedEdge @@@ 
        Partition[
         TuringMachine[tm, {{1, t + 1, 0}, Table[0, 2 t + 1]}, t], 2, 
         1], {tm, 0, 4095}]], 
   allstates = 
    Table[TuringMachine[tm, {{1, t + 1, 0}, Table[0, 2 t + 1]}, 
      t], {tm, 0, 4095}], tms}, 
  tms = Map[Position[allstates, #] &, 
    Select[VertexList[g], VertexOutDegree[g, #] == 0 &]]; 
  (#[[1, 1]] - 1) & /@ 
   tms[[Position[Length /@ tms, 1] // Flatten]]]]

Keys[ReverseSort[
  Counts[RulePlot[TuringMachine[#], {1, {{}, 0}}, 9, 
      ImageSize -> {Automatic, 45}, FrameStyle -> LightGray] & /@ 
    {378, 391, 407, 819, 974, 990, 1331, 1402, 1415, 1431, 1439, 1482, 
     1498, 1506, 1514, 1520, 1528, 1530, 1843, 1914, 1923, 1939, 1955, 
     1963, 1969, 1971, 1977, 1998, 2006, 2014, 2426, 2451, 2455, 2498, 
     2506, 2514, 2518, 2522, 2530, 2538, 2546, 2554, 2867, 2947, 2955, 
     2963, 2971, 2975, 2979, 2987, 2995, 3003, 3034, 3038, 3459, 3479, 
     3499, 3522, 3530, 3538, 3540, 3554, 3562, 3570, 3578, 3971, 3979, 
     3995, 3997, 4003, 4011, 4019, 4027, 4042, 4062, 4066}]]]

As we go to more steps, the graph of configurations that can be reached by deterministic Turing machines grows. But does at least the core of it reach some kind of limit after sufficiently many steps? We can get a sense of this by looking—as we have done so many times before—at the growth rate of the geodesic ball in the graph starting from the initial blank-tape configuration. The total number of new configurations that can be reached on each new layer of the geodesic ball is at most 4096—and in reality it’s much smaller. Here are the numbers of new configurations added on successive layers at steps 100, 200, …, 500 in the overall evolution:

data = ResourceFunction["ParallelMapMonitored"][
    With[{t = #}, 
      Differences[
       First[(Values[
            ResourceFunction["GraphNeighborhoodVolumes"][#, {{{1, t + 1, 0}, 
               ConstantArray[0, 2 t + 1]}}]] &)[
         Graph[Flatten@
           ParallelTable[
            DirectedEdge @@@ 
             Partition[
              TuringMachine[tm, {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]}, 
               t], 2, 1], {tm, 0, 4095}]]]]]] &, Range[100, 500, 100]];

ListLinePlot[data, Frame -> True, AspectRatio -> 1/3, 
 PlotStyle -> 
  ResourceFunction["WolframPhysicsProjectStyleData"]["GenericLinePlot", 
   "PlotStyles"]]

The cause of the “steps down” becomes clearer if one examines more closely the “spokes” in the graph above. Here’s one example:

With[{big = 
   SimpleGraph[
    With[{t = 50}, 
     Flatten@Table[
       DirectedEdge @@@ 
        Partition[
         TuringMachine[tm, {{1, t + 1, 0}, ConstantArray[0, 2 t + 1]},
           t], 2, 1], {tm, 0, 4095}]], 
    EdgeStyle -> RGBColor[0.92, 0.17, 0.24], 
    VertexStyle -> Darker[RGBColor[0.92, 0.17, 0.24], 0.4], 
    VertexSize -> Tiny, 
    GraphLayout -> {"SpringElectricalEmbedding", 
      "InferentialDistance" -> 200, 
      "StepControl" -> "NonMonotonic"}]}, 
 Show[big, PlotRange -> {{85, 95}, {64, 67}}]]

And basically what's happening is that multiple Turing machines are tracing out roughly the same sequences of configurations, but some do it "efficiently", while others "waste time", for example by having the head flip back and forth on alternate steps. The "inner parts" of the spokes—the parts closer to the initial node—involve both "inefficient" and more efficient Turing machines. But the "inefficient" Turing machines simply do not get as far, so they do not contribute to the outer layers of the geodesic ball. The final "step down" in the plot above—at basically half the total number of steps used for the Turing machines—involves the "petering out" of roughly half the Turing machines, the ones that effectively "waste half their steps".
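
As a quick way to see this duplication concretely, here is a minimal sketch (with t = 10 and the same initial condition as above; the choice of t is arbitrary) that counts how many of the 4096 machines end up in each configuration after t steps:

With[{t = 10}, 
 ReverseSort[
  Counts[Table[
    Last[TuringMachine[tm, {{1, t + 1, 0}, Table[0, 2 t + 1]}, t]], {tm, 0, 
     4095}]]]]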

For the “spoke” shown, here are the actual Turing machine histories involved (there are 8 machines that made identical copies of the first history):

Keys[Counts[
  RulePlot[TuringMachine[#], {1, {{}, 0}}, 20, 
     FrameStyle -> LightGray, ImageSize -> {Automatic, 150}] & /@ 
   Union[{3491, 439, 951, 1463, 1488, 1492, 1528, 1975, 2487, 2967, 
     2999, 3479, 3511, 3891, 4023, 1528, 3507, 3891, 1528}]]]

The Cellular Automaton Analog

The kind of graphs we’ve just made for deterministic Turing machines can be made for any family of deterministic computational systems. And in particular they can be made for the (at least for me, much more familiar) case of the 256 k = 2, r = 1 cellular automata. (And, yes, it’s somewhat amazing that in all these years I’ve never made such pictures before—though there’s a note on page 956 of A New Kind of Science that gets close.)

Here are the results for 5 and 10 steps, starting from an initial condition containing a single black cell:

Graph[With[{t = #}, 
    Flatten@Table[
      DirectedEdge @@@ 
       Partition[CellularAutomaton[ru, CenterArray[{1}, 2 t + 1], t], 
        2, 1], {ru, 0, 255}]], 
   EdgeStyle -> RGBColor[0.92, 0.17, 0.24], 
   VertexStyle -> Darker[RGBColor[0.92, 0.17, 0.24], 0.4], 
   GraphLayout -> {"SpringElectricalEmbedding", 
     "InferentialDistance" -> 200, "StepControl" -> "NonMonotonic"}, 
   ImageSize -> 280] & /@ {5, 10}

And here is the result for 50 steps:

Graph[With[{t = 50}, 
  Flatten@Table[
    DirectedEdge @@@ 
     Partition[CellularAutomaton[ru, CenterArray[{1}, 2 t + 1], t], 2,
       1], {ru, 0, 255}]], EdgeStyle -> RGBColor[0.92, 0.17, 0.24], 
 VertexStyle -> Darker[RGBColor[0.92, 0.17, 0.24], 0.4], 
 VertexSize -> Tiny, 
 GraphLayout -> {"SpringElectricalEmbedding", 
   "InferentialDistance" -> 200, "StepControl" -> "NonMonotonic"}]

Here is the result for 10 steps, annotated with the actual cellular automaton evolution for the rules that “reach furthest”:

With[{t = 10}, 
 With[{g = 
    Graph[Flatten@
      Table[DirectedEdge @@@ 
        Partition[CellularAutomaton[tm, CenterArray[{1}, 2 t + 1], t],
          2, 1], {tm, 0, 255}], 
     EdgeStyle -> RGBColor[0.92, 0.17, 0.24], 
     VertexStyle -> Darker[RGBColor[0.92, 0.17, 0.24], 0.4], 
     GraphLayout -> {"SpringElectricalEmbedding", 
       "InferentialDistance" -> 200, 
       "StepControl" -> "NonMonotonic"}]}, 
  With[{call = 
     Table[CellularAutomaton[ru, CenterArray[{1}, 2 t + 1], t], {ru, 
       0, 255}]}, 
   Graph[g, 
    VertexLabels -> (# -> 
         ArrayPlot[
          CellularAutomaton[
           First@FirstPosition[call, #] - 1, {{1}, 0}, {t, All}], 
          ImageSize -> 35] & /@ 
       Pick[VertexList[g], VertexOutDegree[g], 0])]]]]

It’s slightly easier to see what’s going on if we include only even-numbered rules (which leave a blank state blank):

With[{t = 10}, 
 With[{g = 
    Graph[Flatten@
      Table[DirectedEdge @@@ 
        Partition[CellularAutomaton[tm, CenterArray[{1}, 2 t + 1], t],
          2, 1], {tm, 0, 255, 2}], 
     EdgeStyle -> RGBColor[0.92, 0.17, 0.24], 
     VertexStyle -> Darker[RGBColor[0.92, 0.17, 0.24], 0.4], 
     GraphLayout -> {"SpringElectricalEmbedding", 
       "InferentialDistance" -> 200, 
       "StepControl" -> "NonMonotonic"}]}, 
  With[{call = 
     Table[CellularAutomaton[ru, CenterArray[{1}, 2 t + 1], t], {ru, 
       0, 255}]}, 
   Graph[g, 
    VertexLabels -> (# -> (ArrayPlot[
           CellularAutomaton[(First[
               Select[First /@ Position[call, #], OddQ]]) - 1, {{1}, 
             0}, {t, All}], ImageSize -> 35]) & /@ 
       Pick[VertexList[g], VertexOutDegree[g], 0])]]]]

It’s an interesting map of “cellular automaton space”. (Note the presence of rule 30 on the lower left, and rule 110 on the lower right.)

The total number of new configurations explored by all rules on successive steps has a more regular form than for Turing machines:

data = With[{t = 200}, 
   With[{rr = 
      Table[DirectedEdge @@@ 
        Partition[CellularAutomaton[ru, CenterArray[{1}, 2 t + 1], t], 2, 
         1], {ru, 0, 255}]}, 
    Table[VertexCount[Graph[Flatten[Take[rr, All, u]]]], {u, t}]]];

ListLinePlot[Differences[data], Frame -> True, AspectRatio -> 1/3, 
 Filling -> Axis, 
 PlotStyle -> 
  ResourceFunction["WolframPhysicsProjectStyleData"]["GenericLinePlot", 
   "PlotStyles"]]

The result mostly alternates with period 4 between 72 and 84, though with dips at steps of the form 2^m.

If we go for a certain number of steps (say 200), and then look at the geodesic ball centered on the initial condition, the number of configurations in successive layers is just:

data = With[{t = 200}, 
   Differences[
    First[(Values[
         ResourceFunction["GraphNeighborhoodVolumes"][#, {CenterArray[{1}, 
            2 t + 1]}]] &)[
      Graph[Flatten@
        ParallelTable[
         DirectedEdge @@@ 
          Partition[CellularAutomaton[ru, CenterArray[{1}, 2 t + 1], t], 2, 
           1], {ru, 0, 255}]]]]]];

ListLinePlot[data, Frame -> True, AspectRatio -> 1/3, Filling -> Axis, 
 PlotStyle -> 
  ResourceFunction["WolframPhysicsProjectStyleData"]["GenericLinePlot", 
   "PlotStyles"]]

These results are all for ordinary, deterministic cellular automata. So are there non-deterministic cellular automata? Typically, cellular automata are defined to consistently update every cell at every step. But one can also consider sequential cellular automata, where specific cells are updated at each step—and in this case it is straightforward to define a non-deterministic version, for which things like multiway systems can be constructed. (Note that for a full rulial multiway system, such non-deterministic cellular automata are basically the same as non-deterministic mobile automata, which are close to Turing machines.)
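
Here is a minimal sketch of what such a non-deterministic sequential cellular automaton might look like (the function name seqCAStep, rule 30, the width-7 cyclic array and the 3 steps are all just illustrative choices): at each step a single cell, chosen non-deterministically, is updated to its rule-30 value, and NestGraph builds the resulting multiway graph of configurations:

(* update any single cell of state to the value the rule would give it *)
seqCAStep[rule_, state_List] := 
  With[{new = CellularAutomaton[rule, state]}, 
   Union@Table[ReplacePart[state, i -> new[[i]]], {i, Length[state]}]];

(* multiway graph of configurations reachable in 3 sequential updates *)
NestGraph[seqCAStep[30, #] &, {CenterArray[{1}, 7]}, 3, VertexSize -> Tiny]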

Computation Capabilities and the Structure of Rulial Space

What do the computational capabilities of Turing machines mean for their rulial space? Let’s start with computation universality.

An important fact about deterministic Turing machines is that there are universal ones. Among the 4096 s = 2, k = 2 rules there aren't any. But as soon as one goes to s = 2, k = 3 rules (of which there are 2,985,984) it's known (thanks to my results in A New Kind of Science, and Alex Smith's 2007 proof) that there is a universal machine. The rule is:

RulePlot[TuringMachine[{596440, 2, 3}]]

Starting from a blank tape, this machine gives

RulePlot[TuringMachine[{596440, 2, 3}], {1, {{}, 0}}, 80]

and the corresponding causal graph is:

With[{t = 2000}, 
 Graph[Rule @@@ 
   TMCausalData[
    TuringMachine[{596440, 2, 3}, {{1, t}, Table[0, 2 t + 1]}, t]], 
  EdgeStyle -> 
   ResourceFunction["WolframPhysicsProjectStyleData"]["CausalGraph", 
    "EdgeStyle"], 
  VertexStyle -> 
   ResourceFunction["WolframPhysicsProjectStyleData"]["CausalGraph", 
    "VertexStyle"]]]

But what does the existence of a universal machine mean in the rulial multiway system? From any given initial condition, any deterministic Turing machine will trace out some trajectory in the rulial multiway graph. And if a machine is universal it means that by appropriately picking its initial conditions it can be “programmed” to “emulate” any other Turing machine, in the sense that its trajectory will track the trajectory of whatever Turing machine it’s emulating. What does “track” mean? Basically, that there’s some fixed scheme that allows one to go from the states of the universal Turing machine to the states of the machine it’s emulating. The “scheme” will correspond to some translation in the rulial multiway graph, and the requirement is that this translation is somehow always limited. In other words, the trajectory of a universal machine will “flail around” as its starting point (i.e. initial condition) moves in the rulial multiway graph, and this “flailing” will be diverse enough that the trajectory can get close to the trajectory of any given other machine.

If, on the other hand, the machine one's dealing with isn't universal, then its trajectory won't "flail around" enough for this to work; the trajectory will in a sense be too constrained to successfully "track" all possible other deterministic Turing machine trajectories.

What about non-deterministic Turing machines? What universality can be interpreted to mean in this case is that given a particular initial condition, the output one wants occurs somewhere on the different paths followed by the non-deterministic Turing machine. (If one’s trying to do a decision problem—as in the computational complexity class NP—then one can arrange to “signal that one’s got an answer” through some feature of the Turing machine state.) In the case of “extreme non-determinism”—as used to construct the rulial multiway graph—computation universality in a sense becomes a statement purely about the structure of the rulial multiway graph. And basically it just requires that the rulial multiway graph is sufficiently connected—which is guaranteed if there’s causal invariance (so there’s nothing like an “event horizon” anywhere).

But does one have to allow “extreme non-determinism” to get universality? With s = 2, k = 3 we know that there’s a purely deterministic Turing machine that achieves it. And my guess is that among s = 2, k = 2 there are “slightly multiway” rules that also do. In a standard deterministic s = 2, k = 2 Turing machine, there are 4 cases of the rule that are each specified to have a unique outcome. But what if even just one of those cases in the rule has two outcomes? The rule is non-deterministic, but it can be thought of as just being the result of specifying 5 defining cases for the rule. And it’s my guess that even this will be sufficient to get universality in the system.

A non-deterministic rule like this will not trace out a single path in the rulial multiway graph. Instead, it’ll give a bundle of paths, with different paths corresponding to different non-deterministic choices. But the story of universality is very similar to the deterministic case: one simply has to ask whether anything in the bundle successfully manages to track the trajectory of the machine one’s emulating.
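
Here is a minimal sketch of such a "slightly non-deterministic" machine, evolved as a small multiway system (the particular 5 cases, the 9-cell tape and the 4 steps, as well as the names ndRules and ndTMStep, are all just illustrative choices):

(* 5 cases for an s = 2, k = 2 machine: the {1, 0} case has two possible outcomes *)
ndRules = {{1, 0} -> {1, 1, 1}, {1, 0} -> {2, 1, -1}, {1, 1} -> {2, 0, -1}, 
   {2, 0} -> {1, 1, -1}, {2, 1} -> {1, 0, 1}};

(* one non-deterministic step applied to a configuration {{state, position}, tape} *)
ndTMStep[{{s_, p_}, tape_List}] := 
  With[{a = tape[[p]]}, 
   Union@Cases[ndRules, ({s, a} -> {sp_, ap_, d_}) :> 
      {{sp, p + d}, ReplacePart[tape, p -> ap]} /; 1 <= p + d <= Length[tape]]];

(* the bundle of paths traced out in 4 steps from a blank 9-cell tape *)
NestGraph[ndTMStep, {{{1, 5}, ConstantArray[0, 9]}}, 4, VertexSize -> Tiny]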

It’s worth remembering that any given rule won’t typically be following geodesics in rulial space. It’ll be following some more circuitous path (or bundle of paths). But let’s say one has some rule that traces out some trajectory—corresponding to performing some computation. The Principle of Computational Equivalence implies that across different possible rules, there’s a standard “maximum computational sophistication” for these computations, and many rules achieve it. But then the principle also implies that there’s equivalence between these maximally sophisticated computations, in the sense that there’s always a limited computation that translates between them.

Let’s think about this in rulial space. Let’s say we have two rules, starting from the same initial condition. Well, then, at the beginning it’s trivial to translate between the rules. But after t steps, the states reached by these rules can have diverged—making it potentially progressively more difficult to translate between them. A key idea of the rulial multiway graph—and rulial space—is that it lets one talk about both computations and translations between them in uniform ways. Let’s say that the trajectories of two rules have gone a certain distance in rulial space. Then one can look at their divergence, and see how long a “translation computation” one has to do to get from one to the other.

In ordinary spacetime, let's say a certain time t has elapsed. Then we know that the maximum spatial distance that can have been traversed is c t, where c is the speed of light. In rulial space, there's something directly analogous: in time t, there's a maximum rulial distance that can be traversed, which we can call ρ t. But here the Principle of Computational Equivalence makes a crucial contribution: it implies that throughout rulial space, and in all situations, ρ is fixed. It can take an irreducible amount of computational work to successfully translate from the outcome of one rule to another. But this always scales the same way. There's in effect one scale of computational irreducibility, and it's characterized by ρ. Just as the constancy of the speed of light uniformly limits physical motion, so the constancy of ρ uniformly limits rulial motion.

But let’s say you’re trying to get from one point in rulial space to another. If from your starting point you can follow the path of an irreducible—and effectively universal—computation, then you’ll successfully be able to reach the other point. But if from your starting point you can only follow a reducible computation this won’t generally be true. And what this means is that “pockets of computational reducibility” in rulial space act a bit like black holes in physical space. You can get into them from regions of irreducibility, but you can’t get out of them.

There are probably signs of phenomena like this even in the rulial space for simple Turing machines that we’ve explored here. But there’s considerably more that needs to be worked out to be able to make all the necessary connections.

The Emerging Picture of Rulial Space

There are lots of analogies between physical space, branchial space, and rulial space. For example, in physical space, there are light cones that govern the maximum rate at which effects can propagate between different parts of physical space. In branchial space, there are entanglement cones that govern the maximum rate of quantum entanglement. And in rulial space, one can think of "emulation cones", which govern the maximum rate at which one description of behavior can be translated into another.

And when it comes to applying these things to modeling the physical universe, a crucial point is that the observer is necessarily part of the system, governed by the same rules as everything else. And that means that the observer can only be sensitive to certain “appropriately modded-out” aspects of the system. But in actually imagining how an observer “perceives” a system it’s almost always convenient to think about coordinatizing the system—by defining some appropriate foliation. In physical space, this involves foliating the causal graph using reference frames like in relativity. In branchial space, it involves our concept of quantum observation frames. And in rulial space, we can invent another such concept: a rulial description frame, or just a rulial frame.

Different rulial frames in effect correspond to describing the evolution of the universe as operating according to different rules. Causal invariance implies that in the end different rulial frames must give equivalent results. But the specific way one describes the time evolution of the universe will depend on what rulial frame one’s using. In one frame one would be describing the universe in one way; in another frame, another way. And the “story one tells” about how the universe evolves will be different in the different frames.

Much like with superposition in quantum mechanics, there’s probably some notion of regions in rulial space, in which one’s somehow viewing the universe as operating according to “rulially entangled” collections of rules.

But while our original motivation was understanding physics, a lot of what we’re studying about rulial space also applies to purely computational systems. For example, we can think of rulial space even without having any notion of physical space. And we can in effect imagine that rulial space is some kind of map of a space of possible rules for computational systems. (Notice that because of computation universality and the Principle of Computational Equivalence it ultimately doesn’t matter what particular type of rule—Turing machine, cellular automaton, combinator, whatever—is used to “parametrize” rulial space.)

Different places in rulial space in some sense correspond to different rules. Paths at different places in rulial space correspond to evolution according to different rules. So what is the analog of motion in rulial space? In effect it's having a frame which progressively changes the rule one's using. If one's trying to find out what happens in one's system, it's fundamentally most efficient to "stick with one rule". If one progressively changes the rule, one's going to have to keep "translating back" to the original rule, by somehow emulating this rule with whatever rule one's reached. And the result of this is that there'll be exactly the analog of relativistic time dilation. Faster motion in rulial space leads to slower progression in time. Of course, to discuss this properly, we really have to talk about the rulial multiway causal graph, etc.

But one thing is clear: motion faster than some maximum speed ρ is impossible. From within the system, you simply can’t keep the correct rulial causal connections if you’re changing your rulial location faster than ρ.

In an abstract study of Turing machines, ρ is just an arbitrary parameter. But in the actual physical universe, it must have a definite value, like the speed of light c, or our maximum entanglement speed ζ. It’s difficult even to estimate what ζ might be. And presumably estimating ρ will be even harder. But it’s interesting to discuss at least how we might start to think about estimating ρ.

The first key observation about rulial space is that in our models, it’s discrete. In other words, there’s a discrete space of possible rules. Or, put another way, theories are quantized, and there’s somehow an elementary distance between theories—or a minimum distance in “theory space” between neighboring theories.

But what units is this distance in? Basically it's in units of rule—or program—size. Given any program—or rule—we can imagine writing that program out in some language (say in Wolfram Language, or as a program for a particular universal Turing machine, or whatever). And now we can characterize the size of the program by just looking at how many tokens it takes to write the program out.

Of course, with different languages, that number will be different—at the simplest level just like the number of decimal digits necessary to represent a number is different from the number of binary digits, or the length of its representation in terms of primes. But it’s just like measuring a length in feet or meters: even though the numerical value is different, we’re still describing the same length.
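
As a concrete (and deliberately crude) illustration, LeafCount gives one possible "token count" for a Wolfram Language expression, and IntegerDigits shows the decimal-versus-binary point just mentioned (the particular program here is arbitrary):

prog = Hold[CellularAutomaton[30, {{1}, 0}, 100]];
LeafCount[prog]

(* the same number has 31 digits in base 10, but 101 digits in base 2 *)
{Length[IntegerDigits[2^100, 10]], Length[IntegerDigits[2^100, 2]]}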

It’s important to point out that it’s not enough to just measure things in terms of “raw information content”, or ordinary bits, as discussed in information theory. Rather, we want some kind of measure of “semantic information content”: information content that directly tells us what computation to do.

It's also important to point out that what we need is different from what's discussed in algorithmic information theory. Once one has a computation-universal system, one can always use it to translate from any one language to any other. And in algorithmic information theory the concept is that one can measure the length of a program only up to an additive constant, because one can always include an "emulation program" that adapts to whatever language one's measuring the length in. But in the usual formalism of algorithmic information theory one doesn't worry about how long it's going to take for the emulation to be done; it's just a question of whether there's ultimately enough information to do it.
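
(For reference, this is the usual invariance theorem: writing K_U(x) for the length of the shortest program that produces x in language U, one has K_U(x) ≤ K_V(x) + c for any two universal languages U and V, where the constant c is essentially the length of a program in U that emulates V. Nothing in this statement constrains how long that emulation takes to run.)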

In our setup, however, it does matter how long the emulation takes, because that process of emulation is actually part of our system. And basically we need the number of steps needed for the emulation to be in some sense bounded by a constant.

So, OK, what does this mean for the value of ρ? Its units are presumably program size per unit time. And so to define its value, we’ll have to say how we’re measuring program size. Perhaps we could imagine we write our rules in the Wolfram Language. Then there should be a definite value of ρ for our universe, measured in Wolfram-Language-tokens per second. If we chose to use (2,3)-Turing-machine-tape-values per year then we’d get a different numerical value. But assuming we used the correct conversions, the value would be the same. (And, yes, there’s all sorts of subtlety about constant-time or not emulation, etc.)

In some sense, we may be able to think of ρ as the ultimate “processor speed for the universe”: how fast tokens in whatever language we’re using are being “interpreted” and actually “executed” to determine the behavior of the universe.

Can we estimate the value of ρ? If our units are Wolfram-Language-tokens per second we could start by imagining just computing in the Wolfram Language some piece of the rulial multiway graph for our models and seeing how many operations it takes. To allow “all possible rules” we’d have to increase the possible left- (and right-) hand sides of our rules to reflect the size of the hypergraph representing the universe at each step. But now we’d need to divide by the “number of parallel threads” in the rulial multiway graph. So we can argue that all we’d be left with is something like (size of spatial hypergraph represented in Wolfram Language) / (elementary time).

So, based on our previous estimates (which I don’t consider anything more than vaguely indicative yet) we might conclude that perhaps:

ρ ~ 10^450 Wolfram-Language-tokens/second

The number of "parallel threads" in the rulial multiway graph (the rulial analog of Ξ) might then be related to the number of possible hypergraphs that contain about the number of nodes in the universe, or very roughly (10^350)^(10^350) ≈ 10^(3.5 × 10^352). If we ask the total number of Wolfram Language tokens processed by the universe, there'll be another factor ~10^467, but this "parallelism" will completely dominate, and the result will be about:

~10^(3.5 × 10^352) Wolfram-Language-tokens

OK, so given a value of ρ, how might we conceivably observe it? Presumably there’s an analog of quantum uncertainty in rulial space, that’s somehow proportional to the value of ρ. It’s not completely clear how this would show up, but one possibility is that it would lead to intrinsic limits on inductive inference. For example, given only a limited observation time, it might be fundamentally impossible to determine beyond a certain (“rulial”) accuracy what rule the universe is following in your description language. There’d be a minimum rate of divergence of behaviors from different rules, associated with the minimum distance between theories—and it would take a certain time to distinguish theories at this rate of divergence.

In our models, just as every causal edge in physical and branchial space is associated with energy, so should every causal edge in rulial space be. In other words, the more processing that happens in a particular part of rulial space, the more physical energy one should consider exists there. And just as with the Einstein equations in physical space, or the Feynman path integral in branchial space, we should expect that the presence of energy in a particular region of rulial space should be reflected in a deflection of geodesics there.

Geodesics in rulial space are the shortest paths from one configuration of the universe to another, using whatever sequences of rules are needed. But although that’s somewhat like what’s considered at a theoretical level in non-deterministic computation, it’s not something we’re usually familiar with: we’re used to picking a particular description language and sticking with it. So exactly what the interpretation of deflections of geodesics in rulial space should be isn’t clear.

But there are a few other things we can consider. For example, presumably the universe is expanding in rulial space, perhaps implying that in some sense more descriptions of it are becoming possible over time. What about rulial black holes? As mentioned above, parts of rulial space that correspond to computational reducibility should behave like black holes, where in effect “time stops”. Or, in other words, while in most of the universe time is progressing, and irreducible computation is going on, computational reducibility will cause that process to stop in a rulial black hole.

Presumably geodesics near the rulial black hole will be pulled towards it. Somehow when there’s a description language that leads to computational reducibility, languages near it will tend to also stop being able to successfully describe computationally irreducible processes.

Can we estimate the density of rulial black holes? Let’s consider the Turing machine case. In effect we’re going to want to know what the density of non-universality is among all possible non-deterministic Turing machines. Imagine we emulate all Turing machines using a single universal machine. Then this is effectively equivalent to asking what fraction of initial conditions for that machine lead to reducible behaviors—or, in essence, in the traditional characterization of Turing machines, halt. But the probability across all possible inputs that a universal Turing machine will halt is exactly Greg Chaitin’s Ω. In other words, the density of rulial black holes in rulial space is governed by Ω.
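
(For reference: for a prefix-free universal machine U, Ω is defined as the sum of 2^(-|p|) over all programs p on which U halts; in other words, it is the probability that U halts when the successive bits of its input program are chosen by coin tosses.)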

But Ω isn’t just a number like π that we can compute as accurately as we want; it’s noncomputable, in the sense that it can’t be computed to arbitrary accuracy in any finite time by a Turing machine. Now, in a sense it’s not too surprising that the density of rulial black holes is noncomputable—because, given computational irreducibility, to determine for certain whether something is truly a rulial black hole from which nothing can escape one might have to watch it for an unbounded amount of time.

But for me there’s something personally interesting about Greg Chaitin’s Ω showing up in any way in a potential description of anything to do with the universe. You see, I’ve had a nearly 40-year-long debate with Greg about whether the universe is “like π” or “like Ω”. In other words, is it possible to have a rule that will let us compute what the universe does just like we can compute (say, in principle, with a Turing machine) the digits of π? Or will we have to go beyond a Turing machine to describe our universe? I’ve always thought that our universe is “like π”; Greg has thought that it might be “like Ω”. But now it looks as if we might both be right!

In our models, we’re saying that we can compute what the universe does, in principle with a Turing machine. But what we’re now finding out is that in the full rulial space, general limiting statements pull in Ω. In a particular rulial observation frame, we’re able to analyze the universe “like π”. But if we want to know about all possible rulial observation frames—or in a sense the space of all possible descriptions of the universe—we’ll be confronted with Ω.

In our models, the actual operation of the universe, traced in a particular rulial observation frame, is assumed never to correspond to anything computationally more powerful than a Turing machine. But let’s say there was a hypercomputer in our universe. What would it look like? It’d be a place where the effective ρ is infinite—a place where rulial geodesics infinitely diverge—a kind of white hole in rulial space (or perhaps a cosmic event horizon). (We can also think about the hypercomputer as introducing infinite-shortcut paths in the rulial multiway graph which ordinary Turing-machine paths can never “catch up to” and therefore causally affect.)

But given a universe that does hypercomputation, we can then imagine defining a rulial multiway graph for it. And then our universe will show up as a black hole in this higher-level rulial space.

But OK, so if there's one level of hypercomputer, why not consider all possible levels? In other words, why not define a hyperrulial multiway graph in which the possible rules that are used include not only ordinary computational ones, but also hypercomputational ones?

And once we're dealing with hypercomputational systems, we can just keep going, adding in effect more and more levels of oracles—and progressively ascending the arithmetic hierarchy. (The concept of "intermediate degrees" might be thought to lead to something that isn't a perfect hierarchy—but I suspect that it's not robust enough to apply to complete rulial spaces.) Within the hyperrulial multiway graph at a particular level, levels below it will presumably appear as rulial black holes, while ones above will appear as rulial white holes.

And there’s nothing to keep us only considering finite levels of the arithmetic hierarchy; we can also imagine ascending to transfinite levels, and then just keeping going to higher and higher levels of infinity. Of course, according to our models, none of this is relevant to our particular physical universe. But at a theoretical level, we can still at least to some extent “symbolically describe it”, even in our universe.

A Burst of Physics Progress at the 2020 Wolfram Summer School


And We’re Off and Running…

We recently wrapped up the four weeks of our first-ever “Physics track” Wolfram Summer School—and the results were spectacular! More than 30 projects potentially destined to turn into academic papers—reporting all kinds of progress on the Wolfram Physics Project.

When we launched the Wolfram Physics Project just three months ago one of the things I was looking forward to was seeing other people begin to seriously contribute to the project. Well, it turns out I didn’t have to wait long! Because—despite the pandemic and everything—things are already very much off and running!

Six weeks ago we made a list of questions we thought we were ready to explore in the Wolfram Physics Project. And in the past five weeks I’m excited to say that through projects at the Summer School lots of these are already well on their way to being answered. If we ever wondered whether there was a way for physicists (and physics students) to get involved in the project, we can now give a resounding answer, “yes”.

So what was figured out at the Summer School? I’m not going to get even close to covering everything here; that’ll have to await the finishing of papers (that I’ll be most interested to read!). But I’ll talk here about a few things that I think are good examples of what was done, and on which I can perhaps provide useful commentary.

I should explain that we’ve been doing our Wolfram Summer School for 18 years now (i.e. since just after the 2002 publication of A New Kind of Science), always focusing on having each student do a unique original project. This year—for the first time—we did the Summer School virtually, with 79 college/graduate/postdoc/… students from 21 countries around the world (and, yes, 13 time zones). We had 30 students officially on the “Physics track”, but at least 35 projects ended up being about the Wolfram Physics Project. (Simultaneous with the last two weeks of the Summer School we also had our High School Summer Camp, with another 44 students—and several physics projects.)

My most important role in the Summer School (and Summer Camp) is in defining projects. For the Physics track Jonathan Gorard was the academic director, assisted by some very able mentors and TAs. Given how new the Wolfram Physics Project is, there aren’t many people who yet know it well, but one of the things we wanted to achieve at the Summer School was to fix that!

Nailing Down Quantum Mechanics

One of the remarkable features of our models is that they basically imply the inevitability of quantum mechanics. But what is the precise correspondence between our models and all the traditional formalism of quantum mechanics? Some projects at the Summer School helped the ongoing process of nailing that down.

The starting point for any discussion of quantum mechanics in our models is the notion of multiway systems, and the concept that there can be many possible paths of evolution, represented by a multiway graph. The nodes in the multiway graph represent quantum (eigen)states. Common ancestry among these states defines entanglements between them. The branchial graph then in effect gives a map of the entanglements of quantum states—and in the large-scale limit one can think of this as corresponding to a “branchial space”:

Branchial graph

The full picture of multiway systems for transformations between hypergraphs is quite complicated. But a key point that has become increasingly clear is that many of the core phenomena of quantum mechanics are actually quite generic to multiway systems, independent of the details of the underlying rules for transitions between states. And as a result, it’s possible to study quantum formalism just by looking at string substitution systems, without the full complexity of hypergraph transformations.

A quantum state corresponds to a collection of nodes in the multiway graph. Transitions between states through time can be studied by looking at the paths of bundles of geodesics through the multiway graph from the nodes of one state to another.

In traditional quantum formalism different states are assigned quantum amplitudes that are specified by complex numbers. One of our realizations has been that this “packaging” of amplitudes into complex numbers is quite misleading. In our models it’s much better to think about the magnitude and phase of the amplitude separately. The magnitude is obtained by looking at path weights associated with multiplicity of possible paths that reach a given state. The phase is associated with location in branchial space.
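
As a small concrete illustration (just reusing the “MultiwaySystem” resource function and the options that appear later in this piece, for a toy string substitution system), the magnitudes show up as path weights on the states graph, and the branchial graph shows where the states end up in branchial space:

(* Path weights (amplitude magnitudes) shown as vertex weights on the states graph *)
ResourceFunction["MultiwaySystem"][{"A" -> "AB", "A" -> "BA"}, "AA", 4,
 "StatesGraph", "IncludeStateWeights" -> True,
 VertexLabels -> "VertexWeight"]

(* The corresponding branchial graph: relative positions in branchial space (phases) *)
ResourceFunction["MultiwaySystem"][{"A" -> "AB", "A" -> "BA"}, "AA", 4,
 "BranchialGraphStructure"]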

One of the most elegant results of our models so far is that geodesic paths in branchial space are deflected by the presence of relativistic energy density represented by the multiway causal graph—and therefore that the path integral of quantum mechanics is just the analog in branchial space of the Einstein equations in physical space.

To connect with the traditional formalism of quantum mechanics we must discuss how measurement works. The basic point is that to obtain a definite “measured result” we must somehow get something that no longer shows “quantum branches”. Assuming that our underlying system is causal invariant, this will eventually always “happen naturally”. But it’s also something that can be achieved by the way an observer (who is inevitably themselves embedded in the multiway system) samples the multiway graph. And as emphasized by Jonathan Gorard this is conveniently parametrized by thinking of the observer as effectively adding certain “completions” to the transition rules used to construct the multiway system.

It looks as if it’s then straightforward to understand things like the Born rule for quantum probabilities. (To project one state onto another involves a “rectangle” of transformations that have path weights corresponding to the product of those for the sides.) It also seems possible to understand things like destructive interference—essentially as the result of geodesics for different cases ending up at sufficiently distant points in branchial space that any “spanning completion” must pull in a large number of “randomly canceling” path weights.

Local versus Global Multiway Systems

A standard “global” multiway system works by merging branches that lead to globally isomorphic hypergraphs. In Jonathan Gorard’s “completion interpretation of quantum mechanics”, some of these merges represent the results of applying rules that effectively get “added by the observer” as part of their interpretation of the universe. Max Piskunov has criticized the need to consider global hypergraph isomorphism (“Is one really going to compare complete universes?”)—and has suggested instead the idea of local multiway systems. He got the first implementation of local multiway systems done just in time for the Summer School.

Consider the rule:

{{x, y}, {x, z}} → {{x, z}, {x, w}, {y, w}, {z, w}}

Start from the initial state {{{1,1},{1,1}}}. Here’s its global multiway graph, showing both states and events:

ResourceFunction["MultiwaySystem"][
 "WolframModel" -> {{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z, 
      w}}}, {{{1, 1}, {1, 1}}}, 3, "EvolutionEventsGraph", 
 VertexSize -> 1]

But now imagine that we trace the fate of every single relation in each hypergraph, and show it as a separate node in our graph. What we get then is a local multiway system. In this case, here are the first few steps:

ResourceFunction[
   "WolframModel"][{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z, 
     w}}, Automatic, 3]["ExpressionsEventsGraph", 
 VertexLabels -> Automatic]

Continue for a few more steps:

ResourceFunction[
   "WolframModel"][{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z, 
     w}}, Automatic, 5]["ExpressionsEventsGraph"]

If we look only at events, we get exactly the same causal graph as for the global multiway system:

ResourceFunction[
   "WolframModel"][{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z, 
     w}}, Automatic, 5]["LayeredCausalGraph"]

But in the full local multiway graph every causal edge is “annotated” with the relation (or “expression”) that “carries” causal information between events.

In general, two events can be timelike, spacelike or branchlike separated. A local multiway system provides a definite criterion for distinguishing these. When two events are timelike separated, one can go from one to the other by following causal edges. When two events are spacelike separated, their most recent common ancestor in the local multiway system graph will be an event. But if they are branchlike separated, it will instead be an expression.

To reconstruct “complete states” (i.e. spatial hypergraphs) of the kind used in the global multiway system, one needs to assemble maximal collections of expressions that are spacelike separated (“maximal antichains” in poset jargon).
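
As a small graph-theoretic sketch (my own, not code from the project), assuming the expressions-events graph is available as an ordinary directed acyclic Graph g, the timelike and antichain parts of this can be written directly (the spacelike-vs-branchlike distinction additionally needs the type of the common ancestor, which isn’t captured here):

(* A minimal sketch, assuming g is a directed acyclic "ExpressionsEventsGraph"
   like the ones above: two vertices are timelike separated iff one is
   reachable from the other, and a candidate "spatial slice" is a set of
   expression vertices no two of which are timelike separated *)
timelikeSeparatedQ[g_Graph, u_, v_] := With[{tc = TransitiveClosureGraph[g]},
  EdgeQ[tc, DirectedEdge[u, v]] || EdgeQ[tc, DirectedEdge[v, u]]]

antichainQ[g_Graph, vs_List] := With[{tc = TransitiveClosureGraph[g]},
  NoneTrue[Subsets[vs, {2}],
   EdgeQ[tc, DirectedEdge @@ #] || EdgeQ[tc, DirectedEdge @@ Reverse[#]] &]]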

But is the “underlying physics” of local multiway systems the same as that of global ones? In a global multiway system one talks about applying rules to the collections of expressions that exist in spatial hypergraphs. But in a local multiway system one just applies rules to arbitrary collections of expressions (or relations). And a big difference is that the expressions in these collections can lie not just at different places in a spatial hypergraph, but on different multiway branches. Or, in other words, the evolution of the universe can pick pieces from different potential “branches of history”.

This might sound like it’d lead to completely different results. But the remarkable thing is that it doesn’t—and instead global and local multiway systems just seem to be different descriptions of what is ultimately the same thing. Let’s assume first that the underlying rules are causal invariant. Then in a global multiway system, branches must always reconverge. But this reconvergence means that even when there are states (and expressions) “on different branches” they can still be brought together into the same event, just like in a local multiway system. And when there isn’t immediate causal invariance, Jonathan’s “completion interpretation” posits that observers in effect add completions which lead to effective causal invariance, with the same reconvergence, and effective involvement of different branches in single events.

As Jonathan and Max debated global vs. local multiway systems I joked that it was a bit like Erwin Schrödinger debating Werner Heisenberg in the early days of quantum mechanics. And then we realized: actually it was just like that! Recall that in the Schrödinger picture of quantum mechanics, time evolution operators are fixed, but states evolve, whereas in the Heisenberg picture, states are fixed, but evolution operators evolve. Well, in a global multiway system one’s looking at complete states and seeing how they change as a result of the fixed set of events defined by the rules. But in a local multiway system one has a fixed basis of expressions, and then one’s looking at how the structure of the events that involve these expressions changes. So it’s just like the Schrödinger vs. Heisenberg pictures!

The Concept of Multispace

Right before the Summer School, I’d been doing quite a lot of work on what I was calling “multispace”. In a spatial hypergraph one’s representing the spatial relationships between elements. In a global multiway system one’s representing the branchial relationships between complete states. In a local multiway system spatial and branchial relationships are effectively mixed together.

So what is the analog of physical space when branchial relationships are included? I’m calling it multispace. In a case where there isn’t any branching—say an ordinary, deterministic Turing machine—multispace is just the same as ordinary space. But if there’s branching, it’s different.

Here’s an experiment I did just before the Summer School in the very simple case of a non-deterministic Turing machine:

Non-deterministic Turing machine

But I wasn’t really happy with this visualization; the most obvious structure is still the multiway system, and there are lots of “copies of space”, appearing in different states. What I wanted to figure out was how to visualize things so that ordinary space is somehow primary, and the branching is secondary. One could imagine that the elements of the system are basically laid out according to the relationships in ordinary space, merely “bulging out” in a different direction to represent branchial structure.

The practical problem is that branchial space may usually be much “bigger” than ordinary space, so the “bulging” may in effect “overpower” the ordinary spatial relationships. But one idea for visualizing multispace—explored by Nikolay Murzin at the Summer School—is to use machine-learning-like methods to create a 3D layout that shows spatial structure when viewed from one direction, and branchial structure when viewed from an orthogonal direction:

ResourceFunction["MultispacePlot3D"][
 ResourceFunction["MultiwayTuringMachine"][{1507, 2506, 
    3506}, {{1, 1, 0}, {0, 1, 0, 1}}, 4, ##] &, "Graph"]

Generational States, the Ontological Basis and Bohmian Mechanics

In our models, multiway graphs represent all possible “quantum paths of evolution” for a system. But is there a way to pick out at least an approximation to a “classical-like path”? Yes–it’s a path consisting of a sequence of what we call “generational states”. And in going from one generational state to another, the idea is to carry out not just one event, as in the multiway graph, but a maximal set of spacelike separated events. In other words, instead of allowing different “quantum branches” containing different orderings of events, we’re insisting that a maximal set of consistent events are all done together.

Here’s an example. Consider the rule:

{A → AB, B → BBA}

Here’s a “classical-like path” made from generational states:

ResourceFunction["GenerationalMultiwaySystem"][{"A" -> "AB", 
  "B" -> "BBA"}, "AA", 3, "StatesGraph"]

These states must appear in the multiway graph, though it typically takes several events (i.e. several edges) to go from one to another (and in general there may be multiple “generational paths”, corresponding to multiple possible “classical-like paths” in a system):

stripMetadata[expression_] := 
 If[Head[expression] === Rule, Last[expression], expression]; Graph[
 ResourceFunction["MultiwaySystem"][{"A" -> "AB", 
   "B" -> "BBA"}, {"AA"}, 3, "StatesGraph"], 
 VertexShapeFunction -> {Alternatives @@ 
     VertexList[
      ResourceFunction["GenerationalMultiwaySystem"][{"A" -> "AB", 
        "B" -> "BBA"}, {"AA"}, 3, "StatesGraph"]] -> (Text[
       Framed[Style[stripMetadata[#2], Hue[0, 1, 0.48]], 
        Background -> Directive[Opacity[.6], Hue[0, 0.45, 0.87]], 
        FrameMargins -> {{2, 2}, {0, 0}}, RoundingRadius -> 0, 
        FrameStyle -> 
         Directive[Opacity[0.5], 
          Hue[0, 0.52, 0.8200000000000001]]], #1, {0, 0}] &)}]

But what is the interpretation of generational states in previous discussions of quantum mechanics? Joseph Blazer’s project at the Summer School suggested that they are like an ontological basis.

In the standard formalism used for quantum mechanics one imagines that there are lots of quantum states that can form superpositions, etc.—and that classical results emerge only when measurements are done. But ever since the earliest days of quantum mechanics there has been an alternative formalism (rediscovered in the 1950s): so-called Bohmian mechanics, in which everything one considers is a “valid classical state”, but in which there are more elaborate rules of evolution than in the standard formalism.

Well, it seems as if generational states are just what Bohmian mechanics is talking about. The set of possible generational states can be thought of as forming an “ontological basis”, of states that “really can exist”, without any “quantum funny business”.

But what is the rule for evolution between generational states? One of the perhaps troubling features of Bohmian mechanics is that it implies correlations between spacelike separated events, or in other words, it implies that effects can propagate at arbitrarily high speeds.

But here’s the interesting thing: that’s just what happens in our generational states too! In our generational states, though, it isn’t some strange effect that seems to be arbitrarily added to the system: it’s simply a consequence of the consistency conditions we choose to impose in defining generational states.

Classic Quantum Systems and Effects

An obvious check on our models is to see whether they reproduce classic quantum systems and effects—and several projects at the Summer School were concerned with this. A crucial point (that I mentioned above) is that it’s becoming increasingly clear that at least most of these “classic quantum systems and effects” are quite generic features of our models—and of the multiway systems that appear in them. And this meant that many of the “quantum” projects at the Summer School could be done just in terms of string substitution systems, without having to deal with all the complexities of hypergraph rewriting.

Quantum Interference

Hatem Elshatlawy, for example, explored quantum interference in our models. He got some nice results—which Jonathan Gorard managed to simplify to an almost outrageous extent.

Let’s imagine just having a string in which o represents “empty space”, and X represents the position of some quantum thing, like a photon. Then let’s have a simple sorting rule that represents the photon going either left or right (a kind of minimal Huygens’ principle):

{Xo → oX, oX → Xo}

Now let’s construct a multiway system starting from a state “oooXooXooo” that we can think of as corresponding to photons going through two “slits” a certain distance apart:

ResourceFunction["MultiwaySystem"][{"Xo" -> "oX", 
  "oX" -> "Xo"}, "oooXooXooo", 2, "StatesGraph", 
 "IncludeStepNumber" -> True, "IncludeStateWeights" -> True, 
 VertexLabels -> "VertexWeight", 
 GraphLayout -> "LayeredDigraphEmbedding"]

The merging of states that we see here is ultimately going to correspond to “quantum interference”. The path weights correspond to the magnitudes of the amplitudes of different states. But the question is: “What final state corresponds to what final photon position?”

Different final photon positions effectively correspond to different quantum phases for the photon. But in our models these quantum phases are associated with positions in branchial space. And to get an idea of what’s going on, we can just use the sorting order of strings to give a sense of relative positions in branchial space. (Because of the details of the setup, we need to just use the right-hand half of the strings, then symmetrically repeat them.)

If we now do this, and plot the values of the weights (here after 6 steps) this is what we get:

MultiwayDiffractionTest[rules_List, initialCondition_String, stepCount_Integer] :=
 Module[{allStatesList, finalStatesCount, weights, sortedWeights},
  allStatesList =
   ResourceFunction["MultiwaySystem"][rules, initialCondition, stepCount,
    "AllStatesList", "IncludeStateWeights" -> True,
    VertexLabels -> "VertexWeight", "IncludeStepNumber" -> True];
  finalStatesCount = Length[Last[allStatesList]];
  weights =
   ResourceFunction["MultiwaySystem"][rules, initialCondition, stepCount,
    "StateWeights", "IncludeStateWeights" -> True];
  sortedWeights =
   Join[Reverse[Take[weights, -Ceiling[finalStatesCount/2]]],
    Take[weights, -Ceiling[finalStatesCount/2]]];
  Last /@ sortedWeights]

ListLinePlot[
 MultiwayDiffractionTest[{"Xo" -> "oX", "oX" -> "Xo"}, "oooXooXooo", 6],
 Mesh -> All, Frame -> True, Filling -> Axis, FillingStyle -> LightYellow]

Amazingly, this is starting to look a bit like a diffraction pattern. Let’s try “increasing the slit spacing”—by using the initial string “ooooooooXoooXoooooooo”. Now the multiway graph has the form

LayeredGraphPlot[
 ResourceFunction["MultiwaySystem"][{"Xo" -> "oX", "oX" -> "Xo"}, 
  "ooooooooXoooXoooooooo", 10, "EvolutionGraphStructure"]]

and plotting the weights we get

ListLinePlot[
 MultiwayDiffractionTest[{"Xo" -> "oX", "oX" -> "Xo"}, 
  "ooooooooXoooXoooooooo", 10], Mesh -> All, Frame -> True, 
 Filling -> Axis, FillingStyle -> LightYellow]

which is stunningly similar to the standard quantum mechanics result

Plot[((1/2)*ChebyshevU[1, Cos[x]]*Sinc[0.35*x])^2, {x, -10, 10}, 
 Filling -> Axis, FillingStyle -> LightYellow, Frame -> True]

complete with the expected destructive interference away from the central peak.

Computing the corresponding branchial graph we get

ResourceFunction["MultiwaySystem"][{"Xo" -> "oX", 
  "oX" -> "Xo"}, "ooooooooXoooXoooooooo", 10, \
"BranchialGraphStructure"]

which in effect shows the “concentrations of amplitude” into different parts of branchial space (AKA peaks in different regions of quantum phase).

(In a sense the fact that this all works is “unsurprising”, since in effect we’re just implementing a discrete version of Huygens’ principle. But it’s very satisfying to see everything come together.)

The Quantum Harmonic Oscillator

The quantum harmonic oscillator is one of the first kinds of quantum systems a typical quantum mechanics textbook will discuss. But how does the quantum harmonic oscillator work in our models? Patrick Geraghty’s project at the Summer School began the process of figuring it out.

A classical harmonic oscillator basically has something going back and forth in a certain region at a sequence of possible frequencies. The quantum harmonic oscillator picks up the same “modes”, but now represents them just as quantum eigenstates of a certain energy. In our models it’s actually possible to go back to something very close to the classical picture. We can set up a string substitution system in which something (here B or C) goes back and forth in a string of fixed length:

ResourceFunction["MultiwaySystem"][{"BA" -> "AB", "BY" -> "CY", 
  "AC" -> "CA", "XC" -> "XB"}, {"XBAAAY"}, 10, "StatesGraph"]

We can make it a bit more obvious what’s going on by changing the characters in the strings:

ResourceFunction["MultiwaySystem"][{"R-" -> "-R", "R]" -> "L]", 
  "-L" -> "L-", "[L" -> "[R"}, {"[R---]"}, 10, "StatesGraph"]

And it’s clear that this system will always go in a periodic cycle. If we were thinking about spacetime and relativity, it might trouble us that we’ve created a closed timelike curve, in which the future merges with the past. But that’s basically what we’re forced into by the idealization of a quantum harmonic oscillator.

Recall that in our models energy is associated with the flux of causal edges. Well, in this model of the harmonic oscillator, we can immediately figure out the causal edges:

ResourceFunction["MultiwaySystem"][{"R-" -> "-R", "R]" -> "L]", 
  "-L" -> "L-", "[L" -> "[R"}, {"[R---]"}, 10, "EvolutionCausalGraph"]

And we can see that as we change the length of the string, the number of causal edges (i.e. the energy) will linearly increase, as we’d expect for a quantum harmonic oscillator:

Table[ResourceFunction["MultiwaySystem"][{"R-" -> "-R", "R]" -> "L]", 
   "-L" -> "L-", "[L" -> "[R"}, {"[R" <> StringRepeat["-", n] <> "]"},
   10, "EvolutionCausalGraphStructure"], {n, 2, 4}]

Oh, and there’s even zero-point energy:

Table[ResourceFunction["MultiwaySystem"][{"R-" -> "-R", "R]" -> "L]", 
   "-L" -> "L-", "[L" -> "[R"}, {"[R" <> StringRepeat["-", n] <> "]"},
   10, "EvolutionCausalGraphStructure"], {n, 0, 2, 4}]

There’s a lot more to figure out even about the quantum harmonic oscillator, but this is a start.

Quantum Teleportation

One of the strange, but characteristic, phenomena that’s known to occur in quantum mechanics is what’s called quantum teleportation. In a physical quantum teleportation experiment, one creates a quantum-entangled pair of particles, then lets them travel apart. But now as soon as one measures the state of one of these particles, one immediately knows something about the state of the other particle—even though there hasn’t been time to get a light signal from that other particle.

At the Summer School, Taufiq Murtadho figured out a rather elegant way to understand this phenomenon in our models. I’ll not go through the details here, but here’s a representation of a key part of the construction:

rule = {"D" -> "AXA", "D" -> "BXB", "C" -> "A", "C" -> "B"};
InitialState = "DC";
BellCompletion = {"BA" -> "AA", "AA" -> "BA", "BA" -> "BB", "BB" -> "BA",
   "AB" -> "AA", "AA" -> "AB", "AB" -> "BB", "BB" -> "AB"};

(* Function to draw the graph with annotation *)
DrawGraph[] := Module[{a, b, c},
  (* Selecting the vertices to be colored *)
  EvolVertexList =
   VertexList[
    ResourceFunction["MultiwaySystem"][Join[rule, BellCompletion],
     InitialState, 4, "EvolutionGraph"]];
  TeleportInitState = FilterRules[EvolVertexList, 2];
  BellVertex1 =
   Map[(3 -> #) &,
    Flatten[StringCases[
      FilterRules[EvolVertexList, 3] /. Rule[a_, b_] :> b,
      {__ ~~ "AA", __ ~~ "BB"}]]];
  BellVertex2 =
   Map[(3 -> #) &,
    Flatten[StringCases[
      FilterRules[EvolVertexList, 3] /. Rule[a_, b_] :> b,
      {__ ~~ "AB", __ ~~ "BA"}]]];
  stripMetadata[expression_] :=
   If[Head[expression] === Rule, Last[expression], expression];
  (* Coloring BellVertex1 *)
  a = Graph[
    ResourceFunction["MultiwaySystem"][Join[rule, BellCompletion],
     InitialState, 3, "EvolutionGraph",
     "IncludeStatePathWeights" -> True,
     VertexLabels -> "VertexWeight"],
    VertexShapeFunction -> {Alternatives @@ BellVertex1 -> (Text[
         Framed[Style[stripMetadata[#2], Hue[0, 1, 0.1]],
          Background -> Directive[Opacity[0.6], Hue[0, 0.45, 0.87]],
          FrameMargins -> {{2, 2}, {0, 0}}, RoundingRadius -> 0,
          FrameStyle ->
           Directive[Opacity[0.5], Hue[0, 0.52, 0.82]]], #, {0, 0}] &)}];
  (* Coloring BellVertex2 *)
  b = Graph[a,
    VertexShapeFunction -> {Alternatives @@ BellVertex2 -> (Text[
         Framed[Style[stripMetadata[#2], Hue[0, 1, 0.1]],
          Background -> Directive[Opacity[.6], Hue[1/3, 1, 1, .5]],
          FrameMargins -> {{2, 2}, {0, 0}}, RoundingRadius -> 0,
          FrameStyle ->
           Directive[Opacity[0.5], Hue[0, 0.1, 0.82]]], #1, {0, 0}] &)}];
  (* Coloring the initial teleportation state *)
  c = Graph[b,
    VertexShapeFunction -> {Alternatives @@ TeleportInitState -> (Text[
         Framed[Style[stripMetadata[#2], Hue[0, 1, 0.1]],
          Background -> Directive[Opacity[0.6], Hue[0.1, 0.7, 3]],
          FrameMargins -> {{2, 2}, {0, 0}}, RoundingRadius -> 0,
          FrameStyle ->
           Directive[Opacity[0.5], Hue[0, 0.1, 0.82]]], #1, {0, 0}] &)}]]

(* Run the function *)
DrawGraph[]

A feature of quantum teleportation is that even though the protocol seems to be transmitting information faster than light, that isn’t really what’s happening when one traces everything through. And what Taufiq found is that in our models the multiway causal graph reveals how this works. In essence, the “teleportation” happens through causal edges that connect branchlike separated states—but these edges cannot transmit an actual measurable message.

Quantum Computing

How do we tell if our models correctly reproduce something like quantum computing? One approach is what I call “proof by compilation”. Just take a standard description of something—here quantum computing—and then systematically “compile” it to a representation in terms of our models.

Just before the Summer School, Jonathan Gorard put a function into the Wolfram Function Repository called QuantumToMultiwaySystem, which takes a description of a quantum circuit and “compiles it” to one of our multiway systems:

QuantumToMultiwaySystem

For example, here’s a Pauli-Z gate compiled to the rules for a multiway system:

ResourceFunction["QuantumToMultiwaySystem"][<|
  "Operator" -> {{1, 0}, {0, -1}}, "Basis" -> {{1, 0}, {0, 1}}|>]

Here now is the result of starting with a superposition of states and running two steps of root-NOT gates:

ResourceFunction["QuantumToMultiwaySystem"][<|
  "Operator" -> {{1 + I, 1 - I}, {1 - I, 1 + I}},
  "Basis" -> {{1, 0}, {0, 1}}|>, {1 + I, 1 - I}, 2, "EvolutionGraphFull"]

And, yes, we can understand entanglements through branchial graphs, etc.:

ResourceFunction["QuantumToMultiwaySystem"][<|
  "Operator" -> {{1 + I, 1 - I}, {1 - I, 1 + I}},
  "Basis" -> {{1, 0}, {0, 1}}|>, {1 + I, 1 - I}, 2, "BranchialGraph"]

But, OK, if we can do this kind of compilation, what happens if we compile a famous quantum algorithm, like Shor’s algorithm for factoring integers, to a multiway system? At the Summer School, Yoav Rabinovich looked at this, working with Jack Heimrath and Jonathan Gorard. The whole of Shor’s algorithm is pretty messy, with lots of not-very-quantum parts. But the core of factoring an integer n is to do a quantum Fourier transform on Mod[a^Range[n],n], and then to do measurements on the resulting superposition of states and detect peaks.
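
As a small classical sketch of the quantity involved (this is ordinary arithmetic only, not the multiway compilation itself), here are the modular powers whose period the quantum Fourier transform is used to detect, for n = 6 with the hypothetical choice a = 5:

(* Modular powers Mod[a^Range[n], n] for n = 6, a = 5: the sequence {5, 1, 5, 1, 5, 1}
   has period r = 2, and the period yields a factor of n via GCDs *)
n = 6; a = 5;
powers = Mod[a^Range[n], n]
r = 2;
{GCD[a^(r/2) - 1, n], GCD[a^(r/2) + 1, n]}   (* {2, 6}: 2 is a nontrivial factor of 6 *)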

Here’s a version of the quantum Fourier transform involved in factoring the integer n = 6, converted to one of our multiway systems:

Quantum Fourier transform

And, yes, there’s a lot going on here, but at least it’s happening “in parallel” in different branches of the quantum evolution—in just a few time steps. But the result here is just a superposition of quantum states; to actually find “the answer” we have to do measurements to find which quantum states have the highest amplitude, or largest path weight.
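
As a tiny sketch of that last step (assuming the “StateWeights” property returns the same kind of state-weight entries used in the MultiwayDiffractionTest code earlier), picking out the maximum-weight state for a toy rule might look like this:

(* Pick the state with the largest path weight after a few steps of a toy rule;
   the structure of the "StateWeights" output is assumed from the code above *)
weights = ResourceFunction["MultiwaySystem"][{"A" -> "AB", "A" -> "BA"},
   "AA", 4, "StateWeights", "IncludeStateWeights" -> True];
First[MaximalBy[weights, Last]]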

In the usual formalism of quantum mechanics, we’d just talk about “doing the measurement”; we wouldn’t discuss what goes on “inside the measurement”. But in our model we can analyze the actual process of measurement. And at least in Jonathan’s “completion interpretation” we can say that the measurement is achieved by a multiway system in which the observer effectively defines completions that merge branches:

Completions that merge branches

We’ve included path weights here, and “the answer” can effectively be read off by asking where in branchial space the maximum path weight occurs. But notice that lots of multiway edges (or events) had to be added to get the measurement done; that’s effectively the “cost of measurement” as revealed in our model.

So now the obvious question is: “How does this scale as one increases the number n? Including measurement, does the quantum computation ultimately succeed in factoring n in a polynomial number of steps?”

We don’t yet know the answer to this—but we’ve now more or less got the wherewithal to figure it out.

Here’s the basic picture. When we “do a quantum computation” we get to use the parallelism of having many different “threads” spread across branchial space. But when we want to measure what comes out, we have to “corral” all these threads back together to get a definite “observable” result. And the question is whether in the end we come out ahead compared to doing the purely classical computation.

I have to say that I’ve actually wondered about this for a very long time. And in fact, back in the early 1980s, when Richard Feynman and I worked on quantum computing, one of the main things we talked about was the “cost of measurement”. As an example, we looked at the “null” quantum computation of generating “random numbers” (e.g. from a process like radioactive decay)—and we ended up suspecting that there would be inevitable bounds on the “minimum cost of measurement”.

So it wouldn’t surprise me at all if in the end the “cost of measurement” wiped out any gains from “quantum parallelism”. But we don’t yet know for sure, and it will be interesting to continue the analysis and see what our models say.

I should emphasize that even if it turns out that there can’t be a “formal speed up” (e.g. polynomial vs. super-polynomial) from quantum mechanics, it still makes perfect sense to study “quantum computing”, because it’s basically inevitable that broadening the kinds of physics that are used to do computing will open up some large practical speed ups, even if they’re only by “constant factors”.

I might as well mention one slightly strange thought I had—just before the Summer School—about the power of quantum computing: it might be true that in an “isolated system” quantum parallelism would be offset by measurement cost, but that in the actual universe it might not be.

Here’s an analogy. Normally in physics one thinks that energy is conserved. But when one considers general relativity on cosmological scales, that’s no longer true. Imagine connecting a very long spring between the central black holes of two distant galaxies (and, yes, it’s very much a “thought experiment”). The overall expansion of the universe will make the galaxies get further apart, and so will continually impart energy to the spring. At some level we can think of this as “mining the energy of the Big Bang”, but on a local scale the result will be an apparent increase in available energy.

Well, in our models, the universe doesn’t just expand in physical space; it expands in branchial space too. So the speculation is that quantum computing might only “win” if it can “harvest” the expansion of branchial space. It seems completely unrealistic to get energy by harnessing the expansion of physical space. But it’s conceivable that there is so much more expansion in branchial space that it can be harnessed even locally—to deliver “true quantum power” to a quantum computer.

Corrections to the Einstein Equations

One of the important features of our models is that they provide a derivation of Einstein’s equations from something lower level—namely the dynamics of hypergraphs containing very large numbers of “atoms of space”. But if we can derive Einstein’s equations, what about corrections to Einstein’s equations? At the Summer School, Cameron Beetar and Jonathan Gorard began to explore this.

It’s immediately useful to think about an analogy. In standard physics, we know that on a microscopic scale fluids consist of large numbers of discrete molecules. But on a macroscopic scale the overall behavior of all these molecules gives us continuum fluid equations like the Navier–Stokes equations. Well, the same kind of thing happens in our models. Except that now we’re dealing with “atoms of space”, and the large-scale equations are the Einstein equations.

OK, so in our analogy of fluid mechanics, what are the higher-order corrections? As it happens, I looked at this back in 1986, when I was studying how fluid behavior could arise from simple cellular automata. The algebra was messy, and I worked it out using the system I had built that was the predecessor to Mathematica. But the end result was that there was a definite form for the corrections to the Navier–Stokes equations of fluid mechanics:

“Cellular Automaton Fluids 1: Basic Theory”

OK, so what’s the analog in our models? A key part of our derivation of the Einstein equations involves looking at volumes of small geodesic balls. On a d-dimensional manifold, the leading term is proportional to r^d. Then there’s a correction that’s proportional to the Ricci scalar curvature, from which, in essence, we derive the Einstein equations. But what comes after that?

It turns out that longtime Mathematica user Alfred Gray had done this computation (even before Mathematica):

“The Volume of a Small Geodesic Ball of a Riemannian Manifold”
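
For reference, the expansion in question (quoted here in its standard textbook form, rather than read off from the paper) is Vol(B_r) = v_d r^d (1 − R r^2/(6(d + 2)) + O(r^4)), where v_d is the volume of the unit ball in flat d-dimensional space and R is the Ricci scalar curvature at the center of the ball.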

And basically using this result it’s possible to compute the form that the next-order corrections to the Einstein equations should take—as Jonathan already did in his paper a few months ago:

Next-order corrections to the Einstein equations

But what determines the parameters α, β, γ that appear here? Einstein’s original equations have the nice feature that they don’t involve any free parameters (apart from the cosmological constant): so long as there’s no “external source” (like “matter”) of energy-momentum the equations in effect just express the “conservation of cross-sectional area” of bundles of geodesics in spacetime. And this is similar to what happens with the Euler equations for incompressible fluids without viscosity—that essentially just express conservation of volume and momentum for “bundles of moving fluid”.
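
(For reference, in standard notation the vacuum equations are just Rμν = 0, or Rμν = Λ gμν with a cosmological constant, while the full equations Rμν − (1/2) R gμν + Λ gμν = 8πG Tμν bring in the energy-momentum tensor Tμν as the “external source”, in units where c = 1.)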

But to go further one actually has to know at least something about the structure and interactions of the underlying molecules. The analogy isn’t perfect, but working out the full Einstein equations including matter is roughly like working out the full Navier–Stokes equations for a fluid.

But one can imagine going even further. In fluid mechanics, the higher-order corrections involve higher spatial derivatives of the velocity. In the case of our models, they involve higher derivatives of the spacetime metric. In fluid mechanics the basic expansion parameter is the Knudsen number (molecular mean free path vs. characteristic length). In our models, the corresponding parameter is the ratio of the elementary length to a length associated with changes in the metric. Or, in other words, the higher-order corrections are about situations where one ends up seeing signs of deviations from pure continuum behavior.

In fluid mechanics, dealing with rarefied gases with higher Knudsen number and working out the so-called Burnett equations (and the various quantities that appear in them) is difficult. But it’s the analog of this that has to be done to fill in the parameters for corrections to the Einstein equations. It’s not clear to what extent the results will depend on the precise details of underlying hypergraph rules, and to what extent they’ll be at least somewhat generic—but it’s somewhat encouraging that at least to first order there are only a limited number of possible parameters.

In general, though, one can say that higher-order corrections can get large when the “radius of curvature” approaches the elementary length—or in effect sufficiently close to a curvature singularity.

Gauge Groups Meet Hypergraphs

Local gauge invariance is an important feature of what we know about physics. So how does it work in our models? At the Summer School, Graham Van Goffrier came up with a nice analysis that made considerably more explicit what we’d imagined before.

In the standard formalism of mathematical physics, based on continuous mathematics, one imagines having a fiber bundle in which at each point in a base space one has a fiber containing a copy of the gauge group, which is normally assumed to be a Lie group. But as Graham pointed out, one can set up a direct discrete analog of this. Imagine having a base space that’s a graph like:

Graph3D[GridGraph[{5, 5, 5}]]

Now consider a discrete approximation to a Lie group, say a cyclic group like C6 approximating U(1):

Graph[ResourceFunction["TorusGraph"][{6}], 
 EdgeStyle -> Directive[Red, Thick]]

Now imagine inserting the vertices of this at every point of the “base lattice”. Here’s an example of what one can get:

Base lattice

The red hexagons here are just visual guides; the true object simply has connections that knit together the “group elements” on each fiber. And the remarkable thing is that this can be thought of as a very direct discrete approximation to a fiber bundle—where the connections correspond quite literally to the so-called connections in the fiber bundle, that “align” the copies of the gauge group at each point, and in effect implement the covariant derivative.
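
As a purely illustrative sketch (my own construction, not code from Graham’s project), one can build such a “fiber bundle graph” directly: attach a copy of C6 at each vertex of a small grid “base space”, and tie corresponding group elements together along each base edge (i.e. the trivial connection):

(* A toy "fiber bundle graph": a 6-cycle (C6) fiber at each vertex of a 3x3 grid
   base space, with "connection" edges joining corresponding fiber elements at
   neighboring base points (the trivial connection) *)
base = GridGraph[{3, 3}];
fiber = CycleGraph[6];
vertices = Tuples[{VertexList[base], VertexList[fiber]}];
fiberEdges = Flatten[Table[
    UndirectedEdge[{b, First[e]}, {b, Last[e]}],
    {b, VertexList[base]}, {e, EdgeList[fiber]}]];
connectionEdges = Flatten[Table[
    UndirectedEdge[{First[e], k}, {Last[e], k}],
    {e, EdgeList[base]}, {k, VertexList[fiber]}]];
Graph[vertices, Join[fiberEdges, connectionEdges]]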

In our models the structure of the discrete analog of the fiber bundle has to emerge from the actual operation of underlying hypergraph rules. And most likely this happens because there are multiple ways in which a given rule can locally be applied to a hypergraph, effectively leading to the kind of local symmetry we see appearing at every point of the base space.

But even without knowing any of the details of this, we can already work some things out just from the structure of our “fiber bundle graph”. For example, consider tracing out “Wilson loops” which visit fibers around a closed loop—and ask what the “total group action” associated with this process is. By direct analogy with electromagnetism we can then interpret this as the “magnetic flux through the Wilson loop”.
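
Here is a minimal toy illustration (my own, with made-up connection values): represent the C6 elements as integers mod 6, assign one to each edge of a closed loop, and read off the holonomy, i.e. the flux, as their composition:

(* Hypothetical C6 "connection" values on the four edges of a closed loop;
   composing them (adding mod 6) gives the holonomy around the loop *)
edgeElements = {1, 4, 2, 4};
wilsonLoopFlux = Mod[Total[edgeElements], 6]   (* 5: a nonzero, quantized flux *)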

But what happens if we look at the total flux emanating from a closed volume? For topological reasons, it’s inevitable that this is quantized. And even in the simple setup shown here we can start to interpret nonzero values as corresponding to the presence of “magnetic monopoles”.

Not Even Just Fundamental Physics

I developed what are now being called “Wolfram models” to be as minimal and general as possible. And—perhaps not too surprisingly therefore—the models are looking as if they’re also very relevant for all sorts of things beyond fundamental physics. Several of these things got studied at the Summer School, notably in mathematics, in biology and in other areas of physics.

The applications in mathematics look to be particularly deep, and we’ve actually been working quite intensively on them over the past couple of weeks—leading to some rather exciting conclusions that I’m hoping to write about soon.

When it comes to biology, it seems possible that our models may be able to provide a new approach to thinking about biological evolution, and at the Summer School Antonia Kaestner and Tali Beynon started trying to understand how graphs—and multiway systems—might be used to represent evolutionary processes:


Another project at the Summer School, by Matthew Kafker (working with Christopher Wolfram), concerned hard sphere gases. I have a long personal history with hard sphere gases: looking at them was what first got me interested—back in 1972—in questions about the emergence of complexity. So I was quite surprised that after all these years, there was something new to consider with them. But a feature of our models is that they suggest a different way to look at systems.

Model

So what if we think of the collisions in a hard sphere gas as events? Then—just like in our models—we can make a causal graph that shows the causal relationships between these events:

And—just like in our models—we can define light cones and so on. But what does this tell us about hard sphere gases? Standard statistical mechanics approaches look at local statistical properties—in a sense making a “molecular chaos” assumption that everything else is random. But the causal graph has the potential to give us much more global (and long-range) information—which is likely to be increasingly important as the density of the hard sphere gas increases.
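
Here is a minimal toy sketch of the construction (my own illustration with made-up collision data, not the actual project code): record each collision as a time together with the particles involved, and draw a causal edge whenever a later collision shares a particle with an earlier one:

(* Toy collision "events" as {time, {particle ids}}; a later event is causally
   connected to an earlier one if they share a particle *)
events = {{1.0, {1, 2}}, {1.5, {2, 3}}, {2.0, {1, 4}}, {2.5, {3, 4}}};
causalEdges = Select[Subsets[Range[Length[events]], {2}],
   First[events[[First[#]]]] < First[events[[Last[#]]]] &&
     IntersectingQ[Last[events[[First[#]]]], Last[events[[Last[#]]]]] &];
Graph[DirectedEdge @@@ causalEdges, VertexLabels -> Automatic]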

Hard sphere gases are based on classical physics. But given that our models naturally include quantum mechanics, does that give us a way to study quantum gases, or quantum fluids? At the Summer School Ivan Martinez studied a quantum generalization of my 1986 cellular automaton fluids model.

In that model discrete idealized molecules undergo 2- and 3-body collisions. And when I originally set this up, I just picked possible outcomes from these collisions consistent with momentum conservation. But there are several choices to make—and with the understanding we now have, the obvious thing to do is just to follow all choices, and make a multiway system. Here are the collisions and possible outcomes:

Collisions

A single branch of the multiway system produces a specific pattern of fluid flow. But the whole multiway system represents a whole collection of quantum states—or in effect a quantum fluid (and in the most obvious version of the model, a Fermi fluid). So now we can start to ask questions about the quantum fluid, studying branchial graphs, event horizons, etc.
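
As a generic illustration of the multiway idea (and not Ivan Martinez’s actual fluid model), the “MultiwaySystem” function in the Wolfram Function Repository builds a states graph by following every possible rule application, here just for a simple string substitution system:

ResourceFunction["MultiwaySystem"][{"A" -> "AB", "B" -> "A"}, {"A"}, 5, "StatesGraph"]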

And Lots of Other Projects Too…


I’ve talked about 11 projects so far here—but that’s less than a third of all the Wolfram Physics–related projects at the Summer School.

There were projects about the large-scale structure of hypergraphs, and phenomena like the spatial distribution of dimension, time variation of dimension and possible overall growth rates of hypergraphs. There was a project about characterizing overall structures of hypergraphs by finding PDE modes on them (“Weyl’s law for graphs”).

What happens if you look at the space of all possible hypergraphs, and for example form state transition graphs by applying rules? One project explored subgraphs excluded by evolution (“the approach to attractor states”). Another project explored the structure of the space of possible hypergraphs, and the mathematical analysis of ergodicity in it.

One of the upcoming challenges in our models is about identifying “particles” and their properties. One project started directly hunting for particles by looking at the effects of perturbations in hypergraphs. Another studied the dynamics of specific kinds of “topologically stable defects” in hypergraphs. There was a project looking for global conservation laws in hypergraph rewriting, and another studying local graph invariants. There was also a project that started to make a rather direct detector of gravitational waves in our models.

There were projects that analyzed the global behavior of our models. One continued the enumeration of cases in which black holes arise. Another looked at termination and completion in multiway systems. Still another compared growth in physical vs. branchial space.

I mentioned above the concept of “proof by compilation” and its use in validating the quantum features of our models. One project at the Summer School began the process of using our models as a foundation for practical “numerical general relativity” (in much the same way as my cellular automata fluids have become the basis for practical fluid computations).

There are lots of interesting questions about how our models relate to known features of physics. And there were projects at the Summer School about understanding the emergence of rotational invariance and CPT invariance as well as the AdS/CFT correspondence (and things like the Bekenstein bound).

There were projects about the Wolfram Physics Project not only at our Summer School, but also at our High School Summer Camp. One explored the emergent differential geometry of a particular one of our models that makes something like a manifold with curvature. Others explored fundamental aspects of models like ours. One searched for multiway systems with intermediate growth. Another explored multiway systems based on cyclic string substitutions.

There were still other projects at both the Summer School and Summer Camp that explored systems from the computational universe—now informed by ideas from the Wolfram Physics Project. One looked at non-deterministic Turing machines; another looked at combinators.

I suggested most of the projects I’ve discussed here, and that makes it particularly satisfying for me to see how well they’ve progressed. Few are yet “finished”, but they’re all off and running, beginning to build up a serious corpus of work around the Wolfram Physics Project. And I’m looking forward to seeing how they develop, what they discover, how they turn into papers—and how they seed other work which will help explore the amazing basic science opportunity that’s opened up with the Wolfram Physics Project.

The Empirical Metamathematics of Euclid and Beyond


Towards a Science of Metamathematics

One of the many surprising things about our Wolfram Physics Project is that it seems to have implications even beyond physics. In our effort to develop a fundamental theory of physics it seems as if the tower of ideas and formalism that we’ve ended up inventing are actually quite general, and potentially applicable to all sorts of areas.

One area about which I’ve been particularly excited of late is metamathematics—where it’s looking as if it may be possible to use our formalism to make what might be thought of as a “bulk theory of metamathematics”.

Mathematics itself is about what we establish about mathematical systems. Metamathematics is about the infrastructure of how we get there—the structure of proofs, the network of theorems, and so on. And what I’m hoping is that we’re going to be able to make an overall theory of how that has to work: a formal theory of the large-scale structure of metamathematics—that, among other things, can make statements about the general properties of “metamathematical space”.

Like with physical space, however, there’s not just pure underlying “geometry” to study. There’s also actual “geography”: in our human efforts to do mathematics over the last few millennia, where in metamathematical space have we gone, and “colonized”? There’ve been a few million mathematical theorems explicitly published in the history of human mathematics. What does the “empirical metamathematics” of them reveal? Some of it presumably reflects historical accidents, but some may instead reflect general features of metamathematics and metamathematical space.

I’ve wondered about empirical metamathematics for a long time, and tucked away on page 1176 at the end of the Notes for the section about “Implications for Mathematics and Its Foundations” in A New Kind of Science is something I wrote more than 20 years ago about it:

Empirical Metamathematics

This note is mostly about what a descriptive theory of empirical metamathematics might be like—for example characterizing what one might mean by a powerful theorem, a deep theorem, a surprising theorem and so on. But at the end of the note is a graph: an actual piece of quantitative empirical metamathematics, based on the best-known structured piece of mathematics in history—Euclid’s Elements.

The graph shows relationships between theorems in the Elements: a kind of causal graph of how different theorems make use of each other. As presented in A New Kind of Science, it’s a small “footnote item” that doesn’t look like much. But for more than 20 years, I’ve kept wondering what more there might be to learn from it. And now that I’m trying to make a general theory of metamathematics, it seemed like it was a good time to try to find out…

The Most Famous Math Book in History

Euclid’s Elements is an impressive achievement. Written in Greek around 300 BC (though presumably including many earlier results), the Elements in effect defined the way formal mathematics has been done for more than two thousand years. The basic idea is to start from certain axioms that are assumed to be true, then—without any further “input from outside”—use purely deductive methods to establish a collection of theorems.

Euclid effectively had 10 axioms (5 “postulates” and 5 “common notions”), like “one can draw a straight line from any point to any other point”, or “things which equal the same thing are also equal to one another”. (One of his axioms was his fifth postulate—that parallel lines never meet—which might seem obvious, but which actually turns out not to be true for physical curved space in our universe.)

On the basis of his axioms, Euclid then gave 465 theorems. Many were about 2D and 3D geometry; some were about arithmetic and numbers. Among them were many famous results, like the Pythagorean theorem, the triangle inequality, the fact that there are five Platonic solids, the irrationality of √2 and the fact that there are an infinite number of primes. But certainly not all of them are famous—and some seem to us now pretty obscure. And in what has remained a (sometimes frustrating) tradition of pure mathematics for more than two thousand years, Euclid never gives any narrative about why he’s choosing the theorems he does, out of all the infinitely many possibilities.

We don’t have any original Euclids, but versions from a few centuries later exist. They’re written in Greek, with each theorem explained in words, usually by referring to a diagram. Mathematical notation didn’t really start getting invented until the 1400s or so (i.e. a millennium and a half later)—and even the notation for numbers in Euclid’s time was pretty unwieldy. But Euclid had basically modern-looking diagrams, and he even labeled points and angles with (Greek) letters—despite the fact that the idea of variables standing for numbers wouldn’t be invented until the end of the 1500s.

There’s a stylized—almost “legalistic”—way that Euclid states his theorems. And so far as we can tell, in the original version, all that was done was to state theorems; there was no explanation for why a theorem might be true—no proof offered. But it didn’t take long before people started filling in proofs, and there was soon a standard set of proofs, in which each particular theorem was built up from others—and ultimately from the axioms.

There’ve been more than a thousand editions of Euclid printed (probably more than any other book except the Bible), and reading Euclid was until quite recently part of any serious education. (At Eton—where I went to high school—it was only in the 1960s that learning “mathematics” began to mean much other than reading Euclid, in the original Greek of course.) Here’s an edition of Euclid from the 1800s that I happen to own, with the proof of every theorem giving little references to other theorems that are used:

An edition of Euclid from the 1800s

But so what about the metamathematics of Euclid? Given all those theorems—and proofs—can we map out the structure of what Euclid did? That’s what the graph in A New Kind of Science was about. A few years ago, we put the data for that graph into our Wolfram Data Repository—and I looked at it again, but nothing immediately seemed to jump out about it; it still just seemed like a complicated mess:

Theorem Network from Euclid’s Elements
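
(Assuming the Data Repository entry really is named as in the caption above, which I have not verified here, the underlying data can be pulled in directly:)

ResourceData["Theorem Network from Euclid's Elements"]  (* assumed entry name, taken from the caption above *)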

What else happened? One thing is that we added automated theorem proving to Mathematica and the Wolfram Language. Enter a potential theorem, and axioms from which to derive it, and FindEquationalProof will try to generate a proof. This works well for “structurally simple” mathematical systems (like basic logic), and indeed one can generate proofs with complex networks of lemmas that go significantly beyond what humans can do (or readily understand):

FindEquationalProof
FindEquationalProof[p\[CenterDot]q == q\[CenterDot]p, 
 ForAll[{a, b, c}, 
  ((a\[CenterDot]b)\[CenterDot]c)\[CenterDot](a\[CenterDot]((a\[CenterDot]c)\[CenterDot]a)) == c]]["ProofGraph"]

It’s in principle possible to use these methods to prove theorems in Euclidean geometry too. But it’s a different problem to make the proofs readily understandable to humans (like the step-by-step solutions of Wolfram|Alpha). So at least for now—even after 2000 years—the most effective source of information about the empirical metamathematics of proofs of Euclid’s theorems is still basically going to be Euclid’s Elements.

But when it comes to representing Euclid’s theorems there’s something new. The whole third-of-a-century story of the Wolfram Language has been about finding ways to represent more and more things in the world computationally. I had long wondered what it would take to represent Euclid-style geometry computationally. And in April I was excited to announce that we’d managed to do it:

Computational Euclid
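
As a small illustration of my own (not taken from that announcement), here is the setup of Euclid 1.5, an isosceles triangle, stated as a GeometricScene; RandomInstance draws a concrete instance, and FindGeometricConjectures can then suggest facts that hold in it, such as the equality of the base angles:

(* hypotheses only: a triangle with two equal sides *)
scene = GeometricScene[{a, b, c},
   {Triangle[{a, b, c}],
    EuclideanDistance[a, b] == EuclideanDistance[a, c]}];
RandomInstance[scene]
FindGeometricConjectures[scene]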

Basic Statistics of Euclid

Euclid’s Elements is divided into 13 “books”, containing a total of 465 theorems (and 131 definitions):

Module
CloudGet["https://wolfr.am/PJKo9Lnq"]; Module[{g, h}, 
 g[expr_] := Style[Row[{"(", expr, ")"}], Italic, Gray, 10]; 
 h[expr_] := Style[expr, Italic, FontFamily -> "Source Sans Pro"]; 
 Text[Grid[
   MapIndexed[
    Prepend[#, 
      Style[{"subjects", "books", "theorems", "totals", "definitions",
          "totals"}[[First[#2]]], Italic, 
       FontFamily -> "Source Sans Pro"]] &, {{h[
       "2D geometry"], \[SpanFromLeft], \[SpanFromLeft], \
\[SpanFromLeft], \[SpanFromLeft], \[SpanFromLeft], 
      h["numbers"], \[SpanFromLeft], \[SpanFromLeft], \[SpanFromLeft],
       h["3D geometry"], \[SpanFromLeft], \[SpanFromLeft]}, 
     Style[#, Italic, Smaller] & /@ Range[13], {48, 14, 37, 16, 25, 
      33, 39, 27, 36, 115, 39, 18, 
      18}, {g[173], \[SpanFromLeft], \[SpanFromLeft], \
\[SpanFromLeft], \[SpanFromLeft], \[SpanFromLeft], 
      g[217], \[SpanFromLeft], \[SpanFromLeft], \[SpanFromLeft], 
      g[75], \[SpanFromLeft], \[SpanFromLeft]}, {23, 2, 11, 7, 18, 4, 
      22, 0, 0, 16, 28, 0, 
      0}, {g@
       65, \[SpanFromLeft], \[SpanFromLeft], \[SpanFromLeft], \
\[SpanFromLeft], \[SpanFromLeft], 
      g@38, \[SpanFromLeft], \[SpanFromLeft], \[SpanFromLeft], 
      g@28, \[SpanFromLeft], \[SpanFromLeft]}}], 
   Background -> {Prepend[
      Composition[Lighter[#, 0.4] &, bookColor] /@ Range[13], 
      GrayLevel[0.9]], None}, Frame -> All]]]

Stating the theorems takes 9589 words (about 60k characters) of Greek (about 13,000 words in a standard English translation). (The 10 axioms take another 115 words in Greek or about 140 in English, and the definitions another 2369 words in Greek or about 3300 in English.)

A typical theorem (or “proposition”)—in this case Book 1, Theorem 20—is stated as:

GreekEnglishShort
CloudGet["https://wolfr.am/PJKo9Lnq"]; 
					GreekEnglishShort[<|
  "Book" -> 1, "Theorem" -> 20|>]

(This is what we now call the triangle inequality. And of course, to make this statement we have to have defined what a triangle is, and Euclid does that earlier in Book 1.)

If we look at the statements of Euclid’s theorems in Greek (or in English), there’s a distribution of lengths (colored here by subjects, and reasonably fit by a Pascal distribution):

GraphicsRow
CloudGet["https://wolfr.am/PJKo9Lnq"]; GraphicsRow[
 MapThread[
  Function[{t, l}, Module[{dataE = WordCount[#[t]] & /@ eus, dataG},
    dataG = 
     GroupBy[If[MissingQ[#[[1]]["Book"]], 
         0 -> #[[2]], #[[1]]["Book"] -> #[[2]]] & /@ Normal[dataE], 
      First -> Last]; 
    Histogram[
     Flatten[Join[Values[dataG[[Key /@ #]]]]] & /@ {{0}, {1, 2, 3, 4, 
        5, 6}, {7, 8, 9, 10}, {11, 12, 13}}, {1}, Frame -> True, 
     PlotRange -> All, FrameLabel -> {l, None}, 
     FrameTicks -> {Automatic, None}, ChartLayout -> "Stacked", 
     ChartBaseStyle -> Opacity[1], 
     ChartStyle -> {bookColorIntense /@ {0, 6, 10, 13}, 
       EdgeForm[Directive[Thin, GrayLevel[0.15]]]}]]], {{"GreekText", 
    "Text"}, {"Greek words", "English words"}}]]

The “outlier” longest-to-state theorem (in both Greek and English) is the rather unremarkable 103-Greek-word 3.8:

GreekEnglish
CloudGet["https://wolfr.am/PJKo9Lnq"]; 
					GreekEnglish[<|"Book" -> 3, 
  "Theorem" -> 8|>, 12]

which can be illustrated as:

GeometricScene
GeometricScene[
  {A, B, C, D, E, F, G, H, K, L, M},
  {
   GeometricAssertion[{D}, {"Outside", CircleThrough[{A, B, C}, M]}],
   GeometricAssertion[{A, B, C, E, F, G, H, K, L}, "Distinct"],
   Line[{D, G, M, A}],
   Line[{{D, K, E}, {D, L, F}, {D, H, C}}],
   CircleThrough[{A, B, C, E, F, G, H, K, L}, M],
   GeometricAssertion[{A, E, F, C, H, L, K, G, B}, 
    "CyclicallyOrdered"],
   Style[{Line[{M, K}], Line[{M, L}], Line[{M, H}], Line[{M, C}], 
     Line[{M, F}], Line[{M, E}]}, Dashed],
   PlanarAngle[{D, M, B}] == PlanarAngle[{D, M, K}],
   Line[{D, B}]
   }(*,
  {
  EuclideanDistance[D,A]>EuclideanDistance[D,E]>EuclideanDistance[D,
  F]>EuclideanDistance[D,C],
  EuclideanDistance[D,G]<EuclideanDistance[D,K]<EuclideanDistance[D,
  L]<EuclideanDistance[D,H],
  EuclideanDistance[D,B]==EuclideanDistance[D,K]
  }*)
  ] // RandomInstance

(The runner-up, at about two-thirds the length, is the also rather unremarkable 11.35.)

The nominally shortest-to-state theorems are in Book 10, Theorems 85 through 90, and all have just 4 Greek words:

GreekEnglishShort
CloudGet["https://wolfr.am/PJKo9Lnq"];
GreekEnglish[<|"Book" -> 10, "Theorem" -> 85|>]
				

GreekEnglishShort
CloudGet["https://wolfr.am/PJKo9Lnq"];
					GreekEnglishShort[<|"Book" -> 10, "Theorem" -> 90|>]

The shortness of these theorems is a bit of a cheat, since the successive “apotomes” (pronounced /əˈpɒtəmi/, with the stress pattern of “hippopotamus”) actually have quite long definitions that are given elsewhere. And, yes, some emphasis in math has changed in the past 2000+ years; you don’t hear about apotomes these days. (An apotome is a number x – y where x/y isn’t rational, but (x/y)² is—as for x = √2, y = 1. It’s difficult enough to describe even this without math notation. But then for a “first apotome” Euclid added the conditions that both √(x² – y²) and x must be rational—all described in words.)
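
As a quick numerical sanity check of the conditions as reconstructed above (my own check, not Euclid’s), take x = √2 and y = 1:

With[{x = Sqrt[2], y = 1},
 {Element[x/y, Rationals], Element[(x/y)^2, Rationals]}]  (* {False, True}: x/y is irrational, its square is rational *)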

At five words, we’ve got one more familiar theorem (3.30) and another somewhat obscure one (10.26):

GreekEnglishShort
CloudGet["https://wolfr.am/PJKo9Lnq"];
					GreekEnglishShort[<|"Book" -> 3, "Theorem" -> 30|>]
GreekEnglishShort
CloudGet["https://wolfr.am/PJKo9Lnq"];
					GreekEnglishShort[<|"Book" -> 10, "Theorem" -> 26|>]

In our modern Wolfram Language representation, we’ve got a precise, symbolic way to state Euclid’s theorems. But Euclid had to rely on natural language (in his case, Greek). Some words he just assumed people would know the meanings of. But others he defined. Famously, he started at the beginning of Book 1 with his Definition 1—and in a sense changing how we think about this is what launched our whole Physics Project:

GreekEnglishShort
CloudGet["https://wolfr.am/PJKo9Lnq"];
					GreekEnglishShort[<|"Book" -> 1, "Definition" -> 1|>]

There is at least an implicit network of dependencies among Euclid’s definitions. Having started by defining points and lines, he moves on to defining things like triangles, and equilaterality, until eventually, for example, by Book 11 Definition 27 he’s saying things like “An icosahedron is a solid figure contained by twenty equal and equilateral triangles”.

Of course, Euclid didn’t ultimately have to set up definitions; he could just have repeated the content of each definition every time he wanted to refer to that concept. But like words in natural language—or functions in our computational language—definitions are an important form of compression for making statements. And, yes, you have to pick the right definitions to make the things you want to say easy to say. And, yes, your definitions will likely play at least some role in determining what kinds of things you choose to talk about. (Apotomes, anyone?)

The Interdependence of Theorems

All the theorems Euclid states represent less than 10,000 words of Greek. But the standard proofs of them are perhaps 150,000 words of Greek. (They’re undoubtedly not minimal proofs—but the fact that the same ones are being quoted after more than 2000 years presumably tells us at least something.)

Euclid is very systematic. Every theorem throughout the course of his Elements is proved in terms of earlier theorems (and ultimately in terms of his 10 axioms). Thus, for example, the proof of 1.14 (i.e. Book 1, Theorem 14) uses 1.13 as well as the axioms P2 (i.e. Postulate 2), P4, CN1 (i.e. Common Notion 1) and CN3. By the time one’s got to 12.18 the proof is written only in terms of other theorems (in this case 12.17, 12.2, 5.14 and 5.16) and not directly in terms of axioms.
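
As a minimal query of my own against the euc dependency graph loaded by the CloudGet calls in this post, those direct dependencies can be read off programmatically:

Rest[VertexOutComponent[euc, <|"Book" -> 1, "Theorem" -> 14|>, 1]]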

The total number of theorems (or axioms) directly referenced in a given proof varies from 0 (for axioms) to 21 (for 12.17, which is about inscribing polyhedra in spheres); the average is 4.3:

Histogram
CloudGet["https://wolfr.am/PJKo9Lnq"];
					Histogram[
 Module[{vod = # -> VertexOutDegree[euc, #] & /@ VertexList[euc], 
   dataG}, dataG = 
   GroupBy[If[MissingQ[#[[1]]["Book"]], 
       0 -> #[[2]], #[[1]]["Book"] -> #[[2]]] & /@ vod, First -> Last];
  Flatten[Join[Values[dataG[[Key /@ #]]]]] & /@ {{0}, {1, 2, 3, 4, 5, 
     6}, {7, 8, 9, 10}, {11, 12, 13}}
  ], {1}, Frame -> True, 
 FrameLabel -> {"number of theorems directly referenced"}, 
 ChartLayout -> "Stacked", ChartBaseStyle -> Opacity[1], 
 ChartStyle -> {bookColorIntense /@ {0, 6, 10, 13}, 
   EdgeForm[Directive[Thin, GrayLevel[0.15]]]}]

If we put Euclid’s axioms and theorems in order, we can represent which axioms or theorems occur in a given proof by an arrangement of dots across the page. For example, for 1.12 through 1.17 we have:

With
CloudGet["https://wolfr.am/PJKo9Lnq"];
					With[{axiomcol = bookColorDarker[0], geom2dcol = bookColor[1]}, 
 Grid[With[{head = 
     Composition[Text, Style[#, 13] &, EuclidVertexName] /@ 
      Take[SortBy[VertexList[euc], Length], 26]}, 
   Prepend[Table[
     Prepend[If[
         MemberQ[Rest[
           VertexOutComponent[euc, <|"Book" -> 1, "Theorem" -> n|>, 
            1]], #], Style["\[FilledCircle]", 10], ""] & /@ 
       Take[SortBy[VertexList[euc], Length], 25], 
      Text[Style[EuclidVertexName[<|"Book" -> 1, "Theorem" -> n|>], 
        13]]], {n, 12, 17}], Prepend[head, ""]]], 
  Background -> {1 -> GrayLevel[.9], 
    1 -> GrayLevel[.9], {{{2, -1}, {2, 11}} -> 
      axiomcol, {{2, -1}, {12, -1}} -> geom2dcol}}, Frame -> All, 
  FrameStyle -> GrayLevel[.7], ItemSize -> All, 
  Spacings -> {0.2, 0.2}]]

Doing this for all the theorems we get:

key = MapIndexed
CloudGet["https://wolfr.am/PJKo9Lnq"];
key = MapIndexed[# -> First[#2] &, GatherBy[VertexList[euc], Length][[1]]];
bookmarkers = 
  Append[First /@ 
    Table[FirstPosition[First /@ key, <|"Book" -> b, "Theorem" -> _|>], {b, 13}], 
   Length[key]];
ListPlot[{1, -1}*# & /@ 
  DeleteCases[Reverse /@ (List @@@ (EdgeList[euc] /. key)), {x_, x_}], 
 PlotStyle -> Black, AspectRatio -> 1, 
 GridLines -> {bookmarkers, -bookmarkers}, Ticks -> None, 
 PlotRange -> {{-10, 460}, {-468, -10}}, 
 Epilog -> {MapIndexed[{Style[{Text[#2[[1]], {Mean[#1], 5}], 
        Text[#2[[1]], {-5, -Mean[#1]}]}, 14], 
      bookColorDarker[#2[[1]]], Opacity[.3], 
      Rectangle[{#1[[1]], -465}, {#1[[2]], 0}], 
      Rectangle[{0, -#1[[2]]}, {465, -#1[[1]]}]} &, 
    Partition[bookmarkers, 2, 1]], 
   GrayLevel[.6], Opacity[.3], 
   Rectangle[{0, 0}, {465, 20}], Rectangle[{-10, -465}, {0, 0}]}]

We can see there’s lots of structure here. For example, there are clearly “popular” theorems near the beginning of Book 6 and Book 10, to which lots of at least “nearby” theorems refer. There are also “gaps”: ranges of theorems that no theorems in a given book refer to.
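
One simple way to quantify “popularity” (my own query, not part of the original analysis) is to rank theorems by how many proofs cite them directly, i.e. by their in-degree in the euc graph:

TakeLargestBy[Thread[VertexList[euc] -> VertexInDegree[euc]], Last, 5]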

At a coarse level, something we can do is to look at cross-referencing within and between books:

books
CloudGet["https://wolfr.am/PJKo9Lnq"];
books = {"Book" -> 1, "Book" -> 2, "Book" -> 3, "Book" -> 4, "Book" -> 5, 
   "Book" -> 6, "Book" -> 7, "Book" -> 8, "Book" -> 9, "Book" -> 10, 
   "Book" -> 11, "Book" -> 12, "Book" -> 13};
vertexweights = 
  Select[Tally[First[Normal[#]] & /@ VertexList[euc]], 
   MemberQ[books, First[#]] &];
edgeweights = 
  Select[Tally[{Normal[#][[1, 1]], Normal[#][[2, 1]]} & /@ EdgeList[euc]], 
   MemberQ[books, #[[1, 2]]] &];
bookweights = {Last[First[#]], Last[#]} & /@ vertexweights;
edgesout = {#[[1, 1, 1, 2]], Total[#[[2]]]} & /@ 
   (Transpose /@ GatherBy[edgeweights, #[[1, 1]] &]);
normalizededgeweights = 
  DirectedEdge[#[[1, 1, 2]], #[[1, 2, 2]]] -> 
     #[[2]]/edgesout[[#[[1, 1, 2]], 2]] & /@ edgeweights;
diskedLine[{line_, radii_}] := 
  {RegionIntersection[Line[line], Circle[line[[1]], radii[[1]]]][[1, 1]], 
   RegionIntersection[Line[line], Circle[line[[2]], radii[[2]]]][[1, 1]]};
weightedArrow[line_, weight_] := 
  Module[{len, start, end, angle, thick, rec, mid},
   start = line[[1]]; end = line[[2]]; mid = Mean[line];
   len = EuclideanDistance[start, end];
   angle = Arg[(start - end).{1, I}];
   thick = weight/len;
   rec = (# + mid &) /@ 
     (RotationMatrix[angle].# & /@ 
       {{-len/2, -thick/2}, {len/2, -thick/2}, {len/2, thick/2}, {-len/2, thick/2}});
   Polygon[rec]];
Labeled[
 Graph[Range[13], First /@ normalizededgeweights, 
  EdgeStyle -> 
   Thread[(First /@ normalizededgeweights) -> 
     ({AbsoluteThickness[15 Last[#]], bookColorIntense[First[First[#]]], 
        Arrowheads[Last[#]/15]} & /@ normalizededgeweights)], 
  VertexSize -> 
   Thread[(First /@ bookweights) -> 
     (1.5 Sqrt[#]/20 & /@ (Last /@ bookweights))], 
  VertexStyle -> (# -> {bookColorIntense[#], 
       EdgeForm[Darker[bookColorIntense[#], .2]]} & /@ Range[13]), 
  GraphLayout -> {"VertexLayout" -> "SpringElectricalEmbedding", 
    "SelfLoopRadius" -> 1}, 
  VertexLabels -> Placed[Automatic, Center], 
  PerformanceGoal -> "Quality", BaseStyle -> 13], 
 Row[Row[#, Spacer[0.005]] & /@ 
   Transpose[{bookColorIntense /@ {6, 10, 13}, 
     Style[#, FontFamily -> "Source Sans Pro", GrayLevel[0.3], 
        FontSize -> 16] & /@ {"2D geometry", "numbers", "3D geometry"}}], 
  Spacer[20]]]

The size of each node represents the number of theorems in each book. The thickness of each arrow represents the fraction of references in the proofs of those theorems going to different books. The self-loops are from theorems in a given book that refer to theorems in the same book. Needless to say, the self-loop is large for Book 1, since it doesn’t have any previous book to refer to. Book 7 again has a large self-loop, because it’s the first book about numbers, and doesn’t refer much to the earlier books (which are about 2D geometry).

It’s interesting to see that Books 7, 8 and 9—which are about numbers rather than geometry—“keep to themselves”, even though Book 10, which is also about numbers, is more central. It’s also interesting to see the interplay between the books on 2D and 3D geometry over on the right-hand side of the graph.

But, OK, what about individual theorems? What is their network of dependencies?

Here’s 1.5, whose proof is given in terms of 1.3 and 1.4, as well as the axioms P1, P2 and CN3:

EuclidGraphLarge
CloudGet["https://wolfr.am/PJKo9Lnq"];
					EuclidGraphLarge[
 Subgraph[euc, 
  VertexOutComponent[euc, <|"Book" -> 1, "Theorem" -> 5|>, 1]]]

But now we can continue this, and show what 1.3 and 1.4 depend on—all the way down to the axioms:

EuclidGraphLarge
CloudGet["https://wolfr.am/PJKo9Lnq"];
					EuclidGraphLarge[
 Subgraph[euc, 
  VertexOutComponent[euc, <|"Book" -> 1, "Theorem" -> 5|>, 2]]]

Later theorems depend on much more. Here are the direct dependencies for 12.18:

EuclidGraphLarge
CloudGet["https://wolfr.am/PJKo9Lnq"];
					EuclidGraphLarge[
 Subgraph[euc, 
  VertexOutComponent[euc, <|"Book" -> 12, "Theorem" -> 18|>, 1]]]

Here’s what happens if one goes another step:

EuclidGraphLarge
CloudGet["https://wolfr.am/PJKo9Lnq"];
					EuclidGraphLarge[
 Subgraph[euc, 
  VertexOutComponent[euc, <|"Book" -> 12, "Theorem" -> 18|>, 2]], 
 VertexSize -> .9, BaseStyle -> 8, AspectRatio -> 1/3]

Here’s 3 steps:

EuclidGraphSmall
CloudGet["https://wolfr.am/PJKo9Lnq"];
					EuclidGraphSmall[
 Subgraph[euc, 
  VertexOutComponent[euc, <|"Book" -> 12, "Theorem" -> 3|>, 
   3]], "Intense"]

And here’s what happens if one goes all the way down to the axioms (which in this case takes 5 steps):

EuclidGraphSmall
CloudGet["https://wolfr.am/PJKo9Lnq"];
					EuclidGraphSmall[
 Subgraph[euc, 
  VertexOutComponent[
   euc, <|"Book" -> 12, "Theorem" -> 3|>]], "Intense"]

Things look a little simpler if we consider the transitive reduction of this graph. We’re no longer faithfully representing what’s in the text of Euclid, but we’re still capturing the core dependency information. If theorem A in Euclid refers to B, and B refers to C, then even if in Euclid A also refers directly to C we won’t show that connection. And, yes, graph theoretically the connection from A to C is implied by the connections from A to B and from B to C. But it could still be that the pedagogical structure of the proof of theorem A makes it desirable to refer directly to theorem C, even if in principle everything it needs from C could be reached through theorem B.

Here’s the original 1-step graph for 12.18, along with its transitive reduction:

Row
CloudGet["https://wolfr.am/PJKo9Lnq"];
					Row[Riffle[
  EuclidGraphLarge[#[
      Subgraph[euc, 
       VertexOutComponent[euc, <|"Book" -> 12, "Theorem" -> 18|>, 
        1]]], ImageSize -> {Automatic, 180}] & /@ {Identity, 
    TransitiveReductionGraph}, Spacer[50]]]

And here, by the way, is also the “fully pedantic” transitive closure, including all indirect connections, whether they’re mentioned by Euclid or not:

EuclidGraphLarge
CloudGet["https://wolfr.am/PJKo9Lnq"];
					EuclidGraphLarge[
 TransitiveClosureGraph[
  Subgraph[euc, 
   VertexOutComponent[euc, <|"Book" -> 12, "Theorem" -> 18|>, 1]]], 
 ImageSize -> {Automatic, 200}]

And now here’s the transitive reduction of the full 12.18 dependency graph, all the way down to the axioms:

EuclidGraphSmall
CloudGet["https://wolfr.am/PJKo9Lnq"];
					EuclidGraphSmall[
 TransitiveReductionGraph[
  Subgraph[euc, 
   VertexOutComponent[
    euc, <|"Book" -> 12, "Theorem" -> 18|>]]], "Intense"]

And what all these graphs show is that even to prove one theorem, one’s making use of lots of other theorems. To make this quantitative, we can plot the total number of theorems that appear anywhere in the “full proof” of a given theorem, ultimately working all the way down to the axioms:

Module
CloudGet["https://wolfr.am/PJKo9Lnq"];
					Module[{dataA = 
   If[MissingQ[#[["Book"]]], 
      Nothing, #[["Book"]] -> Length[VertexOutComponent[euc, #]]] & /@
     VertexList[euc], vals, acc, xval},
 vals = CountsBy[dataA, First]; 
 acc = Association[
   MapIndexed[First[#2] -> #1 &, 
    Accumulate[Values[CountsBy[dataA, First]]]]];
 xval = Association[#[[1]] -> (#[[2]] - vals[#[[1]]]/2) & /@ 
    Normal[acc]];
 Show[{ListLinePlot[Values[dataA], Axes -> {False, True}, 
    Frame -> True, 
    FrameLabel -> {"theorems by book", "theorems in full proof"}, 
    FrameTicks -> {{True, 
       False}, {{#[[2]], #[[1]], {0, 0}} & /@ Normal[xval], False}}, 
    Filling -> Axis, ColorFunctionScaling -> False, 
    ColorFunction -> 
     Function[{x, y}, 
      Piecewise[{{bookColorIntense[6], 
         x <= acc[6]}, {bookColorIntense[10], 
         x <= acc[10]}, {bookColorIntense[13], x <= acc[13]}}]], 
    PlotRange -> All ],
   Graphics[{GrayLevel[0.5], 
     Line[{{#, -5}, {#, 300}} & /@ Values[acc]]}]
   }]]

At the beginnings of many of the books, there tend to be theorems that are proved more directly from the axioms, so they don’t depend on as much. But as one progresses through the books, one’s relying on more and more theorems—sometimes, as we saw above, in the same book, and sometimes in earlier books.

From the picture above, we can see that Euclid in a sense builds up to a “climax” at the end—with his very last theorem (13.18) depending on more theorems than anything else. We’ll be discussing “Euclid’s last theorem” some more below...
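
As a quick check of my own against the euc graph, here is the number of theorems and axioms that 13.18 ultimately relies on (VertexOutComponent includes the starting vertex, hence the -1):

Length[VertexOutComponent[euc, <|"Book" -> 13, "Theorem" -> 18|>]] - 1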

The Graph of All Theorems

OK, so what is the full interdependence graph for all the theorems in Euclid? It’s convenient to go the opposite way than in our previous graphs—and put the axioms at the top, and show how theorems below are derived from them. Here’s the graph one gets by doing that:

Labeled
CloudGet["https://wolfr.am/PJKo9Lnq"];
					Labeled[ReverseGraph[euc, 
  GraphLayout -> {"LayeredDigraphEmbedding", "RootVertex" -> axioms}, 
  AspectRatio -> 1/2, EdgeStyle -> GrayLevel[.5, .5], 
  VertexStyle -> (# -> EuclidVertexStyle[#, "Intense"] & /@ 
     VertexList[euc]), VertexSize -> 6, 
  VertexLabels -> (# -> EuclidVertexName[#] & /@ VertexList[euc])], 
 Row[Row[#, Spacer[0.005]] & /@ 
   Transpose[{bookColorIntense /@ {0, 6, 10, 13}, 
     Style[#, FontFamily -> "Source Sans Pro", GrayLevel[0.3], 
        FontSize -> 11] & /@ {"axioms", "2D geometry", "numbers", 
       "3D geometry"}}], Spacer[20]]]

One can considerably simplify this by looking just at the transitive reduction graph (the full graph has 2054 connections; this reduction has 974, while if one went “fully pedantic” with transitive closure, one would have 25,377 connections):

Graph
CloudGet["https://wolfr.am/PJKo9Lnq"];
					Graph[TransitiveReductionGraph[ReverseGraph[euc]], 
 GraphLayout -> {"LayeredDigraphEmbedding", "RootVertex" -> axioms}, 
 AspectRatio -> 1/2, EdgeStyle -> GrayLevel[.5, .5], 
 VertexStyle -> (# -> EuclidVertexStyle[#, "Intense"] & /@ 
    VertexList[euc]), VertexSize -> 1.7, 
 VertexLabels -> (# -> 
      Style[EuclidVertexName[#], Background -> Opacity[.4, White]] & /@
     VertexList[euc])]

What can we see from this? Probably the most obvious thing is that the graphs start fairly sparse, then become much denser. And what this effectively means is that one starts off by proving certain “preliminaries”, and then after one’s done that, it unlocks a mass of other theorems. Or, put another way, if we were exploring this metamathematical space starting from the axioms, progress might seem slow at first. But after proving a bunch of preliminary theorems, we’d be able to dramatically speed up.

Here’s another view of this, plotting how many subsequent theorems depend on each different theorem:

Module
CloudGet["https://wolfr.am/PJKo9Lnq"];
					Module[{dataA = 
   If[MissingQ[#[["Book"]]], 
      Nothing, #[["Book"]] -> Length[VertexInComponent[euc, #]]] & /@ 
    VertexList[euc], vals, acc, xval},
 vals = CountsBy[dataA, First]; 
 acc = Association[
   MapIndexed[First[#2] -> #1 &, 
    Accumulate[Values[CountsBy[dataA, First]]]]];
 xval = Association[#[[1]] -> (#[[2]] - vals[#[[1]]]/2) & /@ 
    Normal[acc]]; 
 Show[{ListLinePlot[Values[dataA], Axes -> {False, True}, 
    Frame -> True, 
    FrameLabel -> {"theorems by book", "dependent theorems"}, 
    Filling -> Axis, 
    FrameTicks -> {{True, 
       False}, {{#[[2]], #[[1]], {0, 0}} & /@ Normal[xval], False}}, 
    ColorFunctionScaling -> False, 
    ColorFunction -> 
     Function[{x, y}, 
      Piecewise[{{bookColorIntense[6], 
         x <= acc[6]}, {bookColorIntense[10], 
         x <= acc[10]}, {bookColorIntense[13], x <= acc[13]}}]], 
    PlotRange -> All ], 
   Graphics[{GrayLevel[0.5], 
     Line[{{#, -5}, {#, 400}} & /@ Values[acc]]}]}]]

In a sense, this is complementary to the plot we made above, that showed how many theorems a given theorem depends on. (From a graph-theoretical point of view they’re very directly complementary: this plot involves VertexInComponent; the previous one involved VertexOutComponent.)

And what the plot shows is that there are a bunch of early theorems (particularly in Book 1) that have lots of subsequent theorems depending on them—so that they’re effectively foundational to much of what follows. The plot also shows that in most of the books the early theorems are the most “foundational”, in the sense that the most subsequent theorems depend on them.
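
A simple way to list the most “foundational” theorems in this sense (again a query of my own) is to rank them by how many other theorems ultimately depend on them:

TakeLargestBy[
 # -> (Length[VertexInComponent[euc, #]] - 1) & /@ VertexList[euc],
 Last, 5]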

By the way, we can also look at the overall form of the basic dependency graph, not layering it starting from the axioms:

Graph
CloudGet["https://wolfr.am/PJKo9Lnq"];
					Graph[euc, 
 VertexStyle -> (# -> EuclidVertexStyle[#, "Intense"] & /@ 
    VertexList[euc]), VertexSize -> 3, EdgeStyle -> GrayLevel[.5, .5],
  VertexLabels -> (# -> 
      Style[EuclidVertexName[#], GrayLevel[.3], 
       Background -> Opacity[.4, White]] & /@ VertexList[euc]), 
 AspectRatio -> 1]

The transitive reduction is slightly easier to interpret:

Graph
CloudGet["https://wolfr.am/PJKo9Lnq"];
					Graph[ReverseGraph[TransitiveReductionGraph[ReverseGraph[euc]]], 
 VertexStyle -> (# -> EuclidVertexStyle[#, "Intense"] & /@ 
    VertexList[euc]), VertexSize -> 8, EdgeStyle -> GrayLevel[.5, .5],
  VertexLabels -> (# -> 
      Style[EuclidVertexName[#], GrayLevel[.1], 
       Background -> Opacity[.4, White]] & /@ VertexList[euc]), 
 AspectRatio -> 1]

And the main notable feature is the presence of “prongs” associated, for example, with Book 9 theorems about the properties of even and odd numbers.

The Causal Graph Analogy

Knowing about the Wolfram Physics Project, there’s an obvious analog of theorem dependency graphs: they’re like causal graphs. You start from a certain set of “initial events” (the “big bang”), corresponding to the axioms. Then each subsequent theorem is like an event, and the theorem dependency graph is tracing out the causal connections between these events.

Just like the causal graph, the theorem dependency graph defines a partial ordering: you can’t write down the proof of a given theorem until the theorems that will appear in it have been proved. Like in the causal graph, one can define light cones: there’s a certain set of “future” theorems that can be affected by any given theorem. Here is the “future light cone” of Book 1, Theorem 5:

HighlightGraph
CloudGet["https://wolfr.am/PJKo9Lnq"];
						HighlightGraph[ReverseGraph[euc, EdgeStyle -> GrayLevel[.5, .5]], 
 Subgraph[ReverseGraph[euc], 
  VertexOutComponent[
   ReverseGraph[euc], <|"Book" -> 1, "Theorem" -> 5|>]], 
 GraphLayout -> {"LayeredDigraphEmbedding", "RootVertex" -> axioms}, 
 AspectRatio -> 1/2]

And here is the corresponding transitive reduction graph:

TransitiveReductionGraph
TransitiveReductionGraph[%]

But now let’s think about the notion of time in the theorem dependency graph. Imagine you were rederiving the theorems in Euclid in a series of “time steps”. What would you have to do at each time step? The theorem dependency graph tells you what you will have to have done in order to derive a particular theorem. But just like for spacetime causal graphs, there are many different foliations one can use to define consistent time steps.

Here’s an obvious one, effectively corresponding to a “cosmological rest frame” in which at each step one “does as much as one consistently can at that step”:

ReverseGraph
CloudGet["https://wolfr.am/PJKo9Lnq"];
CloudGet["https://wolfr.am/KXgcRNRJ"];
GraphPlot[ReverseGraph[euc], 
 GraphLayout -> {"LayeredDigraphEmbedding", "RootVertex" -> axioms}, 
 AspectRatio -> 1/2, 
 VertexStyle -> (# -> EuclidVertexStyle[#, "Intense"] & /@ VertexList[euc]), 
 VertexSize -> 4, EdgeStyle -> GrayLevel[.5, .5], 
 Epilog -> Scale[straightFoliationLines[{0.43, 0}, {0, 0}, # &, {45, 3}], 4]]

And here are the number of theorems that appear on each slice (in effect each theorem appears on the slice determined by its longest path to any axiom):

longestpathlengths
CloudGet["https://wolfr.am/PJKo9Lnq"];
longestpathlengths = 
  ParallelMap[
   Function[t, t -> Max[Length[FindLongestPath[euc, t, #]] & /@ axioms] - 1], 
   VertexList[euc]];
layers = Map[First, 
   SortBy[GatherBy[longestpathlengths, Last], #[[1, 2]] &], {2}];
Module[{data, max = Length[layers]},
 data = Map[
   Function[u, 
    PadLeft[Reverse[If[MissingQ[#["Book"]], 100, #["Book"]] & /@ u], max]], 
   layers];
 Show[ArrayPlot[Transpose[data], AspectRatio -> 1/4, 
   ColorRules -> {1 | 2 | 3 | 4 | 5 | 6 -> bookColorIntense[6], 
     7 | 8 | 9 | 10 -> bookColorIntense[10], 
     11 | 12 | 13 -> bookColorIntense[13], 100 -> bookColorIntense[0]}], 
  Frame -> True, FrameTicks -> Automatic, 
  FrameLabel -> {"longest path to axioms", "number of theorems"}]]

But there are many other foliations that are possible, in which one for example concentrates first on a particular group of theorems, only doing others when one “needs to”.

Each choice of foliation can be thought of as corresponding to a different reference frame—and a different choice of how one explores the analog of spacetime in Euclid. But, OK, if the foliations define successive moments in time—or successive “simultaneity surfaces”—what is the analog of space? In effect, the “structure of space” is defined by the way that theorems are laid out on the slices defined by the foliations. And a convenient way to probe this is to look at branchial graphs, in which pairs of theorems on a given slice are connected by an edge if they have an immediate common ancestor on the slice before.
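
Here is a minimal sketch of that construction (it condenses the fuller code below, and assumes euc is the theorem dependency graph, with edges running from each theorem to its prerequisites, and layers is the list of foliation slices computed below): the theorems on slice t + 1 get joined whenever they share an immediate ancestor on slice t.

(* Condensed restatement of the branchial-graph construction used in the code below;
   euc, layers and the slice conventions are assumed from the surrounding code *)
branchialSlice[t_] := SimpleGraph[Flatten[
   Function[ancestor,
     With[{descendants = 
        Intersection[layers[[t + 1]], VertexInComponent[euc, ancestor, 1]]},
      Flatten[Outer[UndirectedEdge, descendants, descendants]]]] /@ 
    layers[[t]]]]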

So here are the branchial graphs for all successive slices of Euclid in the “cosmological rest frame”:

GraphicsGrid

CloudGet["https://wolfr.am/PJKo9Lnq"];

longestpathlengths = 
  ParallelMap[
   Function[t, 
    t -> Max[Length[FindLongestPath[euc, t, #]] & /@ axioms] - 1], 
   VertexList[euc]];

layers = Map[First, 
   SortBy[GatherBy[longestpathlengths, Last], #[[1, 2]] &], {2}];

GraphicsGrid[
 Partition[
  Table[SimpleGraph[#, ImageSize -> Tiny, 
      EdgeStyle -> 
       ResourceFunction["WolframPhysicsProjectStyleData"][
         "BranchialGraph"]["EdgeStyle"]] &@
    Flatten[Outer[UndirectedEdge, #, #] &[
        Intersection[layers[[t + 1]], 
         VertexInComponent[euc, #, 1]]] & /@ layers[[t]]], {t, 
    Length[layers] - 1}], UpTo[8]], Frame -> All]

And here are the branchial graphs specifically from slices 23 and 26:

Function

CloudGet["https://wolfr.am/PJKo9Lnq"];

longestpathlengths = 
  ParallelMap[
   Function[t, 
    t -> Max[Length[FindLongestPath[euc, t, #]] & /@ axioms] - 1], 
   VertexList[euc]];

layers = Map[First, 
   SortBy[GatherBy[longestpathlengths, Last], #[[1, 2]] &], {2}];

Function[t, 
   Framed[SimpleGraph[#, 
       EdgeStyle -> 
        ResourceFunction["WolframPhysicsProjectStyleData"][
          "BranchialGraph"]["EdgeStyle"], 
       VertexStyle -> (# -> EuclidVertexStyle[#] & /@ 
          VertexList[euc]), VertexSize -> .4, 
       ImageSize -> {400, Automatic}, 
       VertexLabels -> (# -> EuclidVertexName[#] & /@ 
          VertexList[euc])]] &@
    Flatten[Outer[UndirectedEdge, #, #] &[
        Intersection[layers[[t + 1]], 
         VertexInComponent[euc, #, 1]]] & /@ layers[[t]]]] /@ {23, 26}

How should we interpret these graphs? Just like in quantum mechanics, they effectively define a map of “entanglements”, but now these are “entanglements” not between quantum states but between theorems. But potentially we can also interpret these graphs as showing how theorems are laid out in a kind of “instantaneous metamathematical space”—or, in effect, we can use the graphs to define “distances between theorems”.

We can generalize our ordinary branchial graphs by connecting theorems that have common ancestors not just one slice back, but also up to δt slices back. Here are the results for slice 26 (in the cosmological rest frame):

Transpose
CloudGet["https://wolfr.am/PJKo9Lnq"];

longestpathlengths = 
  ParallelMap[
   Function[t, 
    t -> Max[Length[FindLongestPath[euc, t, #]] & /@ axioms] - 1], 
   VertexList[euc]];

layers = Map[First, 
   SortBy[GatherBy[longestpathlengths, Last], #[[1, 2]] &], {2}];

Transpose[
 Table[Function[t, 
     Labeled[Framed[SimpleGraph[#, 
          EdgeStyle -> 
           ResourceFunction["WolframPhysicsProjectStyleData"][
             "BranchialGraph"]["EdgeStyle"], ImageSize -> 300, 
          VertexLabels -> (# -> EuclidVertexName[#] & /@ 
             VertexList[euc]), VertexSize -> .4, 
          VertexStyle -> (# -> {EuclidVertexStyle[#]} & /@ 
             VertexList[euc])]], 
        Style["\[Delta]t = " <> ToString[dt], 
         FontFamily -> "Source Sans Pro"]] &@
      Flatten[Outer[UndirectedEdge, #, #] &[
          Intersection[
           Union @@ Table[layers[[t + i]], {i, dt}], 
           VertexInComponent[euc, #, dt]]] & /@ 
        layers[[t]]]] /@ {26}, {dt, 1, 3}]]

If we went all the way back to the axioms (the analog of the “big bang”) then we’d just get a complete graph, connecting all the theorems on slice 26. But here we’re seeing in effect “fuzzier and fuzzier” versions of how the theorems that exist at slice 26 can be thought of as being “metamathematically laid out”. The disconnected components in these branchial graphs represent theorems that have no recent shared history—so that in some sense they’re “causally disconnected”.

In thinking about “theorem search”, it’s interesting to try to imagine measures of “distance between theorems”—and in effect branchial distance captures some of this. And even for Euclid there are presumably things to learn about the “layout” of theorems, and what should count as “close to” what.

There are only 465 theorems in Euclid’s Elements. But what if there were many more? What might the “metamathematical space” they define be like? Just as for the hypergraphs—or, for that matter, the multiway graphs—in our models of physics we can ask questions about the limiting emergent geometry of this space. And—ironically enough—one thing we can immediately say is that it seems to be far from Euclidean!

But does it for example have some definite effective dimension? There isn’t enough data to say much about the branchial slices we just saw. But we can say a bit more about the complete theorem dependency graph—which is the analog of the multiway graph in our physics models. For example, starting with the axioms (the analog of the “big bang”) we can ask how many theorems are reached in successive steps. The result (counting the axioms) is:

Table
CloudGet["https://wolfr.am/PJKo9Lnq"];
					Table[Length[Union @@ (VertexInComponent[euc, #, i] & /@ axioms)], {i,
   0, 10}]

If we were dealing with something that approximated a d-dimensional manifold, we'd expect these numbers to be of order r^d. Computing their logarithmic differences to estimate d gives

ListLinePlot
CloudGet["https://wolfr.am/PJKo9Lnq"];
						ListLinePlot[
 ResourceFunction["LogDifferences"][
  MeanAround /@ 
   Transpose[
    Table[Length[VertexInComponent[euc, #, i]], {i, 0, 10}] & /@ 
     axioms]], Frame -> True, 
 FrameLabel -> {"graph distance", "effective dimension"}]

if one starts from the axioms, and

ListLinePlot
CloudGet["https://wolfr.am/PJKo9Lnq"];
					ListLinePlot[
 ResourceFunction["LogDifferences"][
  MeanAround /@ 
   Transpose[
    Table[Length[VertexInComponent[euc, #, i]], {i, 0, 10}] & /@ 
     VertexList[euc]]], Frame -> True, 
 FrameLabel -> {"graph distance", "effective dimension"}]

if one starts from all possible theorems in the network.
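
For reference, here is a minimal sketch of the kind of dimension estimate being used (a simplified stand-in for ResourceFunction["LogDifferences"], not necessarily identical to it): if the number of theorems within graph distance r grows roughly like r^d, then successive logarithmic differences of the counts give an estimate of d.

(* Rough log-difference dimension estimate (a stand-in for the
   ResourceFunction["LogDifferences"] used above); euc and axioms assumed as above *)
growth = Table[
   Length[Union @@ (VertexInComponent[euc, #, r] & /@ axioms)], {r, 1, 10}];
Differences[Log[N[growth]]]/Differences[Log[N[Range[10]]]]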

One gets somewhat different results if one deals not with the actual theorem dependency graph in Euclid, but instead with its transitive reduction—removing all “unnecessary” direct connections. Now the number of theorems reached on successive steps is:

TransitiveReductionGraph
CloudGet["https://wolfr.am/PJKo9Lnq"];
					Table[Length[
  Union @@ (VertexInComponent[TransitiveReductionGraph[euc], #, i] & /@
      axioms)], {i, 0, 10}]

The “dimension estimate” based on theorems reached starting from the axioms is

ListLinePlot
CloudGet["https://wolfr.am/PJKo9Lnq"];
					ListLinePlot[
 ResourceFunction["LogDifferences"][
  MeanAround /@ 
   Transpose[
    Table[Length[
        VertexInComponent[TransitiveReductionGraph[euc], #, i]], {i, 
        0, 20}] & /@ axioms]], Frame -> True]

while the corresponding result starting from all theorems is:

ListLinePlot
CloudGet["https://wolfr.am/PJKo9Lnq"];
					ListLinePlot[
 ResourceFunction["LogDifferences"][
  MeanAround /@ 
   Transpose[
    Table[Length[
        VertexInComponent[TransitiveReductionGraph[euc], #, i]], {i, 
        0, 20}] & /@ VertexList[euc]]], Frame -> True]

Euclid’s Elements represents far too little data to make a definite statement, but perhaps there’s a hint of 2-dimensional structure, with positive curvature.

The Most Difficult Theorem in Euclid

One way to assess the “difficulty” of a theorem is to look at what results have to have already been built up in order to prove the theorem. And by this measure, the most difficult theorem in Euclid's Elements is the very last theorem in the last book—what one might call “Euclid's last theorem”, the climax of the Elements: Book 13, Theorem 18, which amounts to the statement that there are five Platonic solids, or more specifically:

Style
CloudGet["https://wolfr.am/PJKo9Lnq"];
					Style[
 Text[
  Style[eus[<|"Book" -> 13, "Theorem" -> 18|>]["GreekText"], 
   RGBColor["#333333"],
   FontSize -> 13]]]

This theorem uses all 10 axioms, and 219 of the 464 previous theorems. Here’s its graph of dependencies:

Labeled
CloudGet["https://wolfr.am/PJKo9Lnq"];
					Labeled[Subgraph[ReverseGraph[euc], 
  VertexInComponent[
   ReverseGraph[euc], <|"Book" -> 13, "Theorem" -> 18|>], 
  GraphLayout -> {"LayeredDigraphEmbedding", "RootVertex" -> axioms}, 
  AspectRatio -> 1/2, EdgeStyle -> GrayLevel[.5, .5], VertexSize -> 3,
   VertexLabels -> (# -> EuclidVertexName[#] & /@ VertexList[euc]), 
  VertexStyle -> (# -> EuclidVertexStyle[#, "Intense"] & /@ 
     VertexList[euc])], 
 Row[Row[#, Spacer[0.005]] & /@ 
   Transpose[{bookColorIntense /@ {0, 6, 10, 13}, 
     Style[#, FontFamily -> "Source Sans Pro", GrayLevel[0.3], 
        FontSize -> 12] & /@ {"axioms", "2D geometry", "numbers", 
       "3D geometry"}}], Spacer[20]]]

And here is the transitive reduction of this—notably with different subject areas being more obviously separated:

TransitiveReductionGraph
CloudGet["https://wolfr.am/PJKo9Lnq"];
					TransitiveReductionGraph[
 Subgraph[ReverseGraph[euc], 
  VertexInComponent[
   ReverseGraph[euc], <|"Book" -> 13, "Theorem" -> 18|>], 
  GraphLayout -> {"LayeredDigraphEmbedding", "RootVertex" -> axioms}, 
  AspectRatio -> 1/2, EdgeStyle -> GrayLevel[.5, .5], 
  VertexSize -> .8, 
  VertexLabels -> (# -> EuclidVertexName[#] & /@ VertexList[euc]), 
  VertexStyle -> (# -> EuclidVertexStyle[#, "Intense"] & /@ 
     VertexList[euc])]]

This shows how 13.18 and its prerequisites (its “past light cone”) sit inside the whole theorem dependency graph:

HighlightGraph
CloudGet["https://wolfr.am/PJKo9Lnq"];
					HighlightGraph[ReverseGraph[euc, EdgeStyle -> GrayLevel[.5, .5]], 
 Subgraph[ReverseGraph[euc], 
  VertexInComponent[
   ReverseGraph[euc], <|"Book" -> 13, "Theorem" -> 18|>]], 
 GraphLayout -> {"LayeredDigraphEmbedding", "RootVertex" -> axioms}, 
 AspectRatio -> 1/2]

If we started from the axioms, the longest chains of theorems we'd have to prove to get to 13.18 are:

Text
CloudGet["https://wolfr.am/PJKo9Lnq"];
					Text[Column[(Style[
      RightArrow @@ (Module[{text = EuclidVertexName[#]}, 
           Framed[Style[text, 10, Black], RoundingRadius -> 4, 
            ImageSize -> {Automatic, 20}, 
            Background -> 
             If[StringMatchQ[First[StringSplit[text, "."]], 
               NumberString], 
              bookColor[ToExpression[First[StringSplit[text, "."]]]], 
              bookColor[0]], 
            FrameStyle -> 
             If[StringMatchQ[First[StringSplit[text, "."]], 
               NumberString], 
              bookColor[ToExpression[First[StringSplit[text, "."]]]], 
              bookColorDarker[0]]]] & /@ 
         FindLongestPath[
          ReverseGraph[euc], #, <|"Book" -> 13, "Theorem" -> 18|>]), 
      Gray]) & /@ axioms, Frame -> All, FrameStyle -> GrayLevel[.7]]]

Or in other words, from CN1 and from P1 and P3 we’d have to go 33 steps to reach 13.18. If we actually look at the paths, however, we see that after different segments at the beginning, they all merge at Book 6, Theorem 1, and then are the same for the last 14 steps:

HighlightGraph
CloudGet["https://wolfr.am/PJKo9Lnq"];
					HighlightGraph[ReverseGraph[euc, EdgeStyle -> GrayLevel[.5, .5]], 
 Style[PathGraph[
     FindLongestPath[
      ReverseGraph[euc], #, <|"Book" -> 13, "Theorem" -> 18|>], 
     DirectedEdges -> True], Red, Thick] & /@ axioms, 
 GraphLayout -> {"LayeredDigraphEmbedding", "RootVertex" -> axioms}, 
 AspectRatio -> 1/2]

(Theorem 6.1 is the statement that both triangles and parallelograms that have the same base and same height have the same area, i.e. one can skew a triangle or parallelogram without changing its area.)

How much more difficult than other theorems is 13.18? Here’s a histogram of maximum path lengths for all theorems (ignoring cases to be discussed later where a particular theorem does not use a given axiom at all):

tlens = ParallelMap
CloudGet["https://wolfr.am/PJKo9Lnq"];

tlens = ParallelMap[
   Function[t, Max[Length[FindLongestPath[euc, t, #]] & /@ axioms]], 
   VertexList[euc]];

Histogram[
 Module[{vod = 
     ParallelMap[
      Function[t, 
       t -> Max[Length[FindLongestPath[euc, t, #]] & /@ axioms]], 
      VertexList[euc]], dataG},
  dataG = GroupBy[
    If[MissingQ[#[[1]]["Book"]], 0 -> #[[2]], #[[1]]["Book"] -> #[[2]]] & /@ 
     vod, First -> Last];
  Flatten[Join[Values[dataG[[Key /@ #]]]]] & /@ {{1, 2, 3, 4, 5, 
     6}, {7, 8, 9, 10}, {11, 12, 13}}], {1}, Frame -> True, 
 ChartLayout -> "Stacked", ChartBaseStyle -> Opacity[1], 
 ChartStyle -> {bookColorIntense /@ {6, 10, 13}, 
   EdgeForm[Directive[Thin, GrayLevel[0.15]]]}, 
 FrameLabel -> {"maximum path length", "number of theorems"}]

And here’s how the maximum path length varies through the sequence of all 465 theorems:

Module
CloudGet["https://wolfr.am/PJKo9Lnq"];
					Module[{dataA = 
   ParallelMap[
    Function[t, 
     If[MissingQ[t["Book"]], Nothing, 
      t["Book"] -> 
       Max[Length[FindLongestPath[euc, t, #]] & /@ axioms] ]], 
    VertexList[euc]], vals, acc, xval},
 vals = CountsBy[dataA, First]; 
 acc = Association[
   MapIndexed[First[#2] -> #1 &, 
    Accumulate[Values[CountsBy[dataA, First]]]]];
 xval = Association[#[[1]] -> (#[[2]] - vals[#[[1]]]/2) & /@ 
    Normal[acc]]; 
 Show[{ListLinePlot[Values[dataA], Axes -> {False, True}, 
    Frame -> True, 
    FrameTicks -> {{True, 
       False}, {{#[[2]], #[[1]], {0, 0}} & /@ Normal[xval], False}}, 
    FrameLabel -> {"theorems by book", "maximum path length"}, 
    Filling -> Axis, ColorFunctionScaling -> False, 
    ColorFunction -> 
     Function[{x, y}, 
      Piecewise[{{bookColorIntense[6], 
         x <= acc[6]}, {bookColorIntense[10], 
         x <= acc[10]}, {bookColorIntense[13], x <= acc[13]}}]], 
    PlotRange -> All], 
   Graphics[{GrayLevel[0.5], 
     Line[{{#, -5}, {#, 35}} & /@ Values[acc]]}]}]]

In the causal graph interpretation, and using the “flat foliation” (i.e. the “cosmological rest frame”), what this basically shows is at what “time slice” a given theorem first emerges from Euclid's proofs. Or, in other words, if one imagines exploring the “metamathematical space of Euclid” by going “one level of theorems at a time”, the order in which one will encounter theorems is:

tlens = ParallelMap
CloudGet["https://wolfr.am/PJKo9Lnq"];

tlens = ParallelMap[
   Function[t, Max[Length[FindLongestPath[euc, t, #]] & /@ axioms]], 
   VertexList[euc]];

Column[
 Row /@ Map[
   Text[Style[
       StringJoin["\[ThinSpace]", EuclidVertexName[First[#]], 
        "\[ThinSpace]"], 11, LineSpacing -> {1, 0}], 
     Background -> 
      bookColorDarker[Lookup[First[#], "Book", 0]]] &, 
   SplitBy[SortBy[Transpose[{VertexList[euc], tlens}], Last], 
    Last], {2}], Frame -> All, FrameStyle -> GrayLevel[.8]]

A question one might ask is whether “short-to-state” theorems are somehow “easier to prove” than longer-to-state ones. This shows the maximum path length to prove theorems as a function of the length of their statements in Euclid’s Greek. Remarkably little correlation is seen.

Module
CloudGet["https://wolfr.am/PJKo9Lnq"];
					Module[{dataA = 
   GroupBy[ParallelMap[
     Function[t, 
      t["Book"] ->  
       Callout[{StringLength[eus[t]["GreekText"]], 
         Max[Length[FindLongestPath[euc, t, #]] & /@ axioms]}, 
        EuclidVertexName[t]]], Complement[VertexList[euc], axioms]], 
    First -> Last]},
 ListPlot[Values[dataA], ColorFunctionScaling -> False, 
  PlotStyle -> Table[bookColorIntense[i], {i, 1, 13}], Frame -> True, 
  FrameLabel -> {Style["Greek statement length", GrayLevel[.5]], 
    Style["maximum path", GrayLevel[.5]]} ]]

This plot shows instead the number of “prerequisite theorems” as a function of statement length:

Module
CloudGet["https://wolfr.am/PJKo9Lnq"];
					Module[{dataA = 
   GroupBy[ParallelMap[
     Function[t, 
      t["Book"] ->  
       Callout[{StringLength[eus[t]["GreekText"]], 
         Length[VertexOutComponent[euc, t]]}, EuclidVertexName[t]]], 
     Complement[VertexList[euc], axioms]], First -> Last]},
 ListPlot[Values[dataA], ColorFunctionScaling -> False, 
  PlotStyle -> Table[bookColorIntense[i], {i, 1, 13}], Frame -> True, 
  FrameLabel -> {Style["Greek statement length", GrayLevel[.5]], 
    Style["dependencies", GrayLevel[.5]]} ]]

Once again there is little correlation.

How often do particular theorems get used in the proofs of other theorems? The “most popular” theorems in terms of being directly quoted in the proofs of other theorems are:

Row
CloudGet["https://wolfr.am/PJKo9Lnq"];
					Row[Text[Grid[#, 
     Background -> {None, 
       MapIndexed[
        First[#2] -> 
          With[{bn = 
             StringCases[#1, b : (DigitCharacter ..) ~~ "." :> b]}, 
           bookColorDarker[
            If[Length[bn] == 1, FromDigits[First[bn]], 0]]] &, #[[All,
          1]]]}, Frame -> All]] & /@ 
  Partition[{EuclidVertexName[#], 
      Style[VertexInDegree[euc, #] - 1, Italic]} & /@ 
    TakeLargestBy[VertexList[euc], VertexInDegree[euc, #] &, 50], 10],
  Spacer[5]]

Notably, all but one of 10.11’s direct mentions are in other theorems in Book 10. Theorem 6.1 (which we already encountered above) is used in 4 books.
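
Claims like this are easy to check directly. Here is a small sketch (assuming euc as above, with edges running from each theorem to its prerequisites) that counts in how many distinct books a given theorem is directly quoted:

(* In how many distinct books is a theorem directly quoted? Its direct users are its
   in-neighbors in euc; assumes euc from the surrounding code *)
booksUsing[t_] := 
 Length[Union[#["Book"] & /@ DeleteCases[VertexInComponent[euc, t, 1], t]]]

booksUsing[<|"Book" -> 6, "Theorem" -> 1|>]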

By the way, there is some subtlety here, because 26 theorems reference a particular theorem more than once in their proofs: for example, 10.4 references 10.3 three times, while 13.18 references both 13.17 and 13.16 twice:

With
CloudGet["https://wolfr.am/PJKo9Lnq"];
With[{g = 
   Select[EdgeList[euc], 
    First[#] == <|"Book" -> 13, "Theorem" -> 18|> &]}, 
 Graph[g, VertexLabels -> (# -> 
       Placed[EuclidVertexName[#], Center] & /@ VertexList[g]), 
  VertexSize -> .75, EdgeStyle -> Gray, 
  VertexStyle -> (# -> EuclidVertexStyle[#] & /@ VertexList[g]), 
  GraphLayout -> "BalloonEmbedding", ImageSize -> 200]]

But looking simply at the distribution of the number of direct uses (here on a log scale), we see that the vast majority of theorems are very rarely used—with just a few being quite widely used:

Histogram
CloudGet["https://wolfr.am/PJKo9Lnq"];
					Histogram[
 Module[{vod = # -> Length[VertexInComponent[euc, #, 1]] & /@ 
     VertexList[euc], dataG}, 
  dataG = GroupBy[
    If[MissingQ[#[[1]]["Book"]], 
       0 -> #[[2]], #[[1]]["Book"] -> #[[2]]] & /@ vod, First -> Last];
  Flatten[Join[Values[dataG[[Key /@ #]]]]] & /@ {{0}, {1, 2, 3, 4, 5, 
     6}, {7, 8, 9, 10}, {11, 12, 13}}
  ], {1}, {"Log", "Count"}, PlotRange -> All, Frame -> True, 
 ChartLayout -> "Stacked", 
 FrameLabel -> {"number of direct uses", "number of theorems"}, 
 ChartBaseStyle -> Opacity[1], 
 ChartStyle -> {bookColorIntense /@ {0, 6, 10, 13}, 
   EdgeForm[Directive[Thin, GrayLevel[0.15]]]}]

Indicating the number of direct uses by size, here are the “directly popular” theorems:

With
CloudGet["https://wolfr.am/PJKo9Lnq"];
					With[{vl = VertexList[euc]}, 
 Labeled[Graph[ReverseGraph[euc], 
   VertexSize -> (# -> 4 Sqrt[VertexInDegree[euc, #]] & /@ 
      VertexList[euc]), 
   VertexStyle -> (# -> EuclidVertexStyle[#] & /@ vl), 
   EdgeStyle -> GrayLevel[.5, .5], 
   VertexLabels -> (# -> 
        If[VertexInDegree[euc, #] > 10 , 
         Placed[EuclidVertexName[#], Center], None] & /@ 
      VertexList[euc]), 
   GraphLayout -> {"LayeredDigraphEmbedding", "RootVertex" -> axioms},
    AspectRatio -> 1/2], 
  Row[Row[#, Spacer[0.005]] & /@ 
    Transpose[{bookColorIntense /@ {0, 6, 10, 13}, 
      Style[#, FontFamily -> "Source Sans Pro", GrayLevel[0.3], 
         FontSize -> 12] & /@ {"axioms", "2D geometry", "numbers", 
        "3D geometry"}}], Spacer[20]]]]

If we ask also about indirect uses, the results are as follows:

Row
CloudGet["https://wolfr.am/PJKo9Lnq"];
					Row[Text[Grid[#, 
     Background -> {None, 
       MapIndexed[
        First[#2] -> 
          With[{bn = 
             StringCases[#1, b : (DigitCharacter ..) ~~ "." :> b]}, 
           bookColorDarker[
            If[Length[bn] == 1, FromDigits[First[bn]], 0]]] &, #[[All,
          1]]]}, Frame -> All]] & /@ 
  Partition[{EuclidVertexName[#], 
      Style[Length[VertexInComponent[euc, #]] - 1, Italic]} & /@ 
    TakeLargestBy[VertexList[euc], 
     Length[VertexInComponent[euc, #]] &, 50], 10], Spacer[5]]

Not too surprisingly, the axioms and early theorems are the most popular. But overall, the distribution of total number of uses is somewhat broader than the distribution of direct uses:

Histogram
CloudGet["https://wolfr.am/PJKo9Lnq"];
					Histogram[
 Module[{vod = # -> Length[VertexInComponent[euc, #]] & /@ 
     VertexList[euc], dataG}, 
  dataG = GroupBy[
    If[MissingQ[#[[1]]["Book"]], 
       0 -> #[[2]], #[[1]]["Book"] -> #[[2]]] & /@ vod, First -> Last];
  Flatten[Join[Values[dataG[[Key /@ #]]]]] & /@ {{0}, {1, 2, 3, 4, 5, 
     6}, {7, 8, 9, 10}, {11, 12, 13}}
  ], {3}, {"Log", "Count"}, PlotRange -> All, Frame -> True, 
 FrameLabel -> {"number of indirect uses", "number of theorems"}, 
 ChartLayout -> "Stacked", ChartBaseStyle -> Opacity[1], 
 ChartStyle -> {bookColorIntense /@ {0, 6, 10, 13}, 
   EdgeForm[Directive[Thin, GrayLevel[0.15]]]}]

This shows all theorems, with their sizes in the graph essentially determined by the sizes of their “future light cone” in the theorem dependency graph:

With
CloudGet["https://wolfr.am/PJKo9Lnq"];
					With[{vl = VertexList[euc]}, 
 Graph[ReverseGraph[euc], 
  VertexSize -> (# -> (Length[VertexInComponent[euc, #]]/8) & /@ 
     VertexList[euc]), 
  VertexStyle -> (# -> EuclidVertexStyle[#] & /@ vl), 
  EdgeStyle -> GrayLevel[.5, .5], 
  VertexLabels -> (# -> 
       If[Length[VertexInComponent[euc, #]] > 10 , 
        Placed[EuclidVertexName[#], Center], None] & /@ 
     VertexList[euc]), 
  GraphLayout -> {"LayeredDigraphEmbedding", "RootVertex" -> axioms}, 
  AspectRatio -> 1/2]]

In addition to asking about direct and indirect uses, one can also assess the “centrality” of a given theorem by various graph-theoretical measures. One example is betweenness centrality (the fraction of shortest paths that pass through a given node):

With
CloudGet["https://wolfr.am/PJKo9Lnq"];
					With[{vl = VertexList[euc], bw = BetweennessCentrality[euc], 
  reuc = ReverseGraph[euc]}, 
 Graph[Part[VertexList[euc], Ordering[bw]], EdgeList[reuc], 
  VertexSize -> Thread[VertexList[euc] -> .05 bw], 
  VertexStyle -> (# -> EuclidVertexStyle[#] & /@ vl), 
  EdgeStyle -> GrayLevel[.5, .5], 
  VertexLabels -> 
   MapIndexed[# -> 
      If[bw[[First[#2]]] > 500 , Placed[EuclidVertexName[#], Center], 
       None] &, VertexList[euc]], 
  GraphLayout -> {"VertexLayout" -> {"LayeredDigraphEmbedding", 
      "RootVertex" -> axioms}, "RenderingOrder" -> "VertexFirst"}, 
  AspectRatio -> 1/2]]

The theorems with the highest betweenness centralities are 1.31 (construction of parallel lines), 10.12 (transitivity of commensurability), 10.9 (commensurability in squares), 8.4 (continued ratios in lowest terms), etc.

For closeness centrality (average inverse distance to all other nodes) one gets:

With
CloudGet["https://wolfr.am/PJKo9Lnq"];
					With[{vl = VertexList[euc], bw = ClosenessCentrality[euc]}, 
 Graph[ReverseGraph[euc], 
  VertexSize -> Thread[VertexList[euc] -> 30 bw], 
  VertexStyle -> (# -> EuclidVertexStyle[#] & /@ vl), 
  EdgeStyle -> GrayLevel[.5, .5], 
  VertexLabels -> 
   MapIndexed[# -> 
      If[bw[[First[#2]]] > .7 , Placed[EuclidVertexName[#], Center], 
       None] &, VertexList[euc]], 
  GraphLayout -> {"VertexLayout" -> {"LayeredDigraphEmbedding", 
      "RootVertex" -> axioms}, "RenderingOrder" -> "VertexFirst"}, 
  AspectRatio -> 1/2]]

What Really Depends on What?

Euclid’s Elements starts with 10 axioms, from which all the theorems it contains are derived. But what theorems really depend on what axioms? This shows how many of the 465 theorems depend on each of the Common Notions and Postulates according to the proofs given in Euclid:

Text
CloudGet["https://wolfr.am/PJKo9Lnq"];
					Text[Grid[
  Transpose[{EuclidVertexName[#], 
      Length[VertexInComponent[euc, #]] - 1} & /@ axioms], 
  Frame -> All, Background -> {None, {bookColorDarker[0], None}}]]

The famous fifth postulate (that parallel lines do not cross) has the fewest theorems depending on it. (And actually, for many centuries there was a suspicion that no theorems really depended on it—so people tried to find proofs that didn’t use it, although ultimately it became clear it actually was needed.)

Interestingly, at least according to Euclid, more than half (255 out of 465) of the theorems actually depend on all 10 axioms, though one sees definite variation through the course of the Elements:

Module
CloudGet["https://wolfr.am/PJKo9Lnq"];
					Module[{dataA = 
   If[MissingQ[#[["Book"]]], 
      Nothing, #[["Book"]] -> 
       Length[Intersection[VertexOutComponent[euc, #], axioms]]] & /@ 
    VertexList[euc], vals, acc, xval},
 vals = CountsBy[dataA, First]; 
 acc = Association[
   MapIndexed[First[#2] -> #1 &, 
    Accumulate[Values[CountsBy[dataA, First]]]]];
 xval = Association[#[[1]] -> (#[[2]] - vals[[#[[1]]]]/2) & /@ 
    Normal[acc]]; 
 Show[{ListLinePlot[Values[dataA], Axes -> {False, True}, 
    Frame -> True, 
    FrameLabel -> {"theorems by book", "number of axioms used"}, 
    FrameTicks -> {{True, 
       False}, {{#[[2]], #[[1]], {0, 0}} & /@ Normal[xval], False}}, 
    Filling -> Axis, ColorFunctionScaling -> False, 
    ColorFunction -> 
     Function[{x, y}, 
      Piecewise[{{bookColorIntense[6], 
         x <= acc[6]}, {bookColorIntense[10], 
         x <= acc[10]}, {bookColorIntense[13], x <= acc[13]}}]] ], 
   Graphics[{GrayLevel[0.5], 
     Line[{{#, -5}, {#, 11}} & /@ Values[acc]]}]}]]
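
As a quick cross-check of the 255-out-of-465 figure (a sketch, assuming euc and axioms as above):

(* Count the theorems whose dependency cone contains all 10 axioms *)
Count[Length[Intersection[VertexOutComponent[euc, #], axioms]] & /@ 
  Complement[VertexList[euc], axioms], 10]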

The number of theorems depending on different numbers of axioms is:

Histogram
CloudGet["https://wolfr.am/PJKo9Lnq"];
					Histogram[
 Module[{vod = # -> 
       Length[Intersection[VertexOutComponent[euc, #], axioms]] & /@ 
     Complement[VertexList[euc], axioms], dataG}, 
  dataG = GroupBy[
    If[MissingQ[#[[1]]["Book"]], 
       0 -> #[[2]], #[[1]]["Book"] -> #[[2]]] & /@ vod, First -> Last];
  Flatten[Join[Values[dataG[[Key /@ #]]]]] & /@ {{0}, {1, 2, 3, 4, 5, 
     6}, {7, 8, 9, 10}, {11, 12, 13}}
  ], {1}, 
 FrameLabel -> {"number of axioms used", "number of theorems"}, 
 PlotRange -> All, Frame -> True, ChartLayout -> "Stacked", 
 ChartBaseStyle -> Opacity[1], 
 ChartStyle -> {bookColorIntense /@ {0, 6, 10, 13}, 
   EdgeForm[Directive[Thin, GrayLevel[0.15]]]}]

Scattered through the Elements there are 86 theorems that depend only on one axiom, most often CN1 (which is transitivity of equality):

Grid
CloudGet["https://wolfr.am/PJKo9Lnq"];
					Grid[Transpose[
  List @@@ Normal[
    KeyMap[EuclidVertexName, 
     ReverseSort[
      Counts[Flatten[
        Intersection[VertexOutComponent[euc, #], axioms] & /@ 
         Select[Complement[VertexList[euc], axioms], 
          Length[Intersection[VertexOutComponent[euc, #], axioms]] == 
            1 &]]]]]]], Frame -> All, 
 Background -> {None, {bookColorDarker[0], None}}]

In most cases, the dependence is quite direct, but there are cases in which it is actually quite elaborate, such as:

TakeLargestBy
CloudGet["https://wolfr.am/PJKo9Lnq"];
					TakeLargestBy[
  With[{g = Subgraph[euc, VertexOutComponent[euc, #]]}, 
     EuclidGraphLarge[g, BaseStyle -> 9, 
      ImageSize -> {300, Automatic}]] & /@ 
   Select[Complement[VertexList[euc], axioms], 
    Length[Intersection[VertexOutComponent[euc, #], axioms]] == 1 &], 
  VertexCount, 5][[{1, 2, 4}]]

These get slightly simpler after transitive reduction:

TransitiveReductionGraph /@ TakeLargestBy
CloudGet["https://wolfr.am/PJKo9Lnq"];
					TransitiveReductionGraph /@ 
 TakeLargestBy[
   With[{g = Subgraph[euc, VertexOutComponent[euc, #]]}, 
      EuclidGraphLarge[g, ImageSize -> {300, Automatic}]] & /@ 
    Select[Complement[VertexList[euc], axioms], 
     Length[Intersection[VertexOutComponent[euc, #], axioms]] == 1 &],
    VertexCount, 5][[{1, 2, 4}]]

We can now also ask the opposite question of how many theorems don’t depend on any given axiom (and, yes, this immediately follows from what we listed above):

Text
CloudGet["https://wolfr.am/PJKo9Lnq"];
					Text[Grid[
  Transpose[
   Function[
     a, {EuclidVertexName[a], 
      Length[Select[
        VertexList[
         euc], (! MemberQ[VertexOutComponent[euc, #], a]) &]]}] /@ 
    axioms], Frame -> All, 
  Background -> {None, {bookColorDarker[0], None}}]]

And in general we can ask what subsets of the axioms different theorems depend on. Interestingly, of the 1024 possible such subsets, only 19 actually occur, suggesting considerable correlation in which axioms get used together. Here is a representation of the partial ordering of the subsets that occur, indicating in each case for how many theorems that subset of dependencies occurs:

TransitiveReductionGraph
CloudGet["https://wolfr.am/PJKo9Lnq"];
					TransitiveReductionGraph[
 Module[{ss = (First[
         ToExpression[StringSplit[EuclidVertexName[#], "."]]] -> 
        Map[EuclidVertexName, 
         Intersection[VertexOutComponent[euc, #], axioms]] & /@ 
      Complement[VertexList[euc], axioms]), sss, pieSet, disp},
  sss = GroupBy[ss, Last -> First];
  pieSet = 
   Association[(#[[1]] -> {Total[Table[Count[#[[2]], i], {i, 6}]], 
         Total[Table[Count[#[[2]], i], {i, 7, 10}]], 
         Total[Table[Count[#[[2]], i], {i, 11, 13}]]}) & /@ 
     Normal[sss]];
  disp = #[[1]] -> 
      PieChart[pieSet[#[[1]]], 
       ChartStyle -> bookColorDarker /@ {6, 10, 13}] & /@ 
    Normal[sss];
  SimpleGraph[
   EuclidGraphLarge[Sort[Keys[sss]], 
    Catenate[
     Table[If[SubsetQ[a, b], a -> b, Nothing], {a, 
       Sort[Keys[sss]]}, {b, Sort[Keys[sss]]}]], 
    VertexWeight -> (Length[sss[#]] & /@ Sort[Keys[sss]])], 
   VertexShape -> disp, 
   VertexLabels -> 
    Placed[Automatic, Automatic, 
     Grid[{#}, Frame -> All, FrameStyle -> LightGray, 
       Background -> bookColor[0]] &]]], VertexSize -> "VertexWeight",
  AspectRatio -> 1/2, PerformanceGoal -> "Quality"]
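
The 19-subset count itself can be recovered directly (a sketch, assuming euc and axioms as above):

(* Distinct subsets of the axioms that occur as the complete axiom dependencies
   of some theorem *)
Length[Union[Intersection[VertexOutComponent[euc, #], axioms] & /@ 
   Complement[VertexList[euc], axioms]]]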

The Machine Code of Euclid: All the Way Down to Axioms

Any theorem in Euclid can ultimately be proved just by using Euclid's axioms enough times. In other words, the proofs Euclid gave were stated in terms of “intermediate theorems”—but we can always in principle “compile things down” so that we end up with just a sequence of axioms. And here for example is how that works for Book 1, Theorem 5:

LayeredDigraphEmbedding
CloudGet["https://wolfr.am/PJKo9Lnq"];

replacements = 
  (First[First[#]] -> Last /@ #) & /@ 
   GatherBy[Rule @@@ EdgeList[euc], First];

repx = replacements /. (a_Association :> EuclidVertexName[a]);

Module[{g, i = 1, vl, vs},
 vs[u_] := {bookColorDarker[u], 
   EdgeForm[Darker[bookColorDarker[u], .2]]};
 g = Map[
   If[MemberQ[EuclidVertexName /@ axioms, #], {#, i++}, #] &, 
   DeleteCases[
    Flatten[Last[
      Reap[Nest[(Sow[Thread[# -> (# /. repx)] & /@ #]; 
          Flatten[# /. repx]) &, {EuclidVertexName[<|"Book" -> 1, 
           "Theorem" -> 5|>]}, 4]]]], x_ -> x_], {-1}];
 vl = VertexList[g];
 EuclidGraphLarge[g, GraphLayout -> "LayeredDigraphEmbedding", 
  VertexLabels -> (# -> 
       Placed[If[StringQ[#], #, First[#]], Center] & /@ vl), 
  VertexSize -> .5, 
  VertexStyle -> (# -> 
       If[ListQ[#], vs[0], 
        vs[FromDigits[
          First[StringCases[#, 
            b : (DigitCharacter ..) ~~ "." :> b]]]]] & /@ vl)]]

Of course it’s much more efficient to “share the work” by using intermediate theorems:

EuclidGraphLarge
CloudGet["https://wolfr.am/PJKo9Lnq"];
					EuclidGraphLarge[
 Subgraph[euc, 
  VertexOutComponent[euc, <|"Book" -> 1, "Theorem" -> 5|>]]]

This doesn’t change the “depth”—i.e. the length of any given path to get to the axioms. But it reduces the number of independent paths that have to be followed, because every time one reaches the same theorem (or axiom) one just “uses what one already knows about it”.

But to get a sense of the “axiomatic machine code” of Euclid we can just “compile” the proof of every theorem down to its underlying sequence of axioms. And for example if we do this for 13.18 the final sequence of axioms we get has length 835,416. These are broken down among the various axioms according to:

rep = replacements;
CloudGet["https://wolfr.am/PJKo9Lnq"];

replacements = 
  (First[First[#]] -> Last /@ #) & /@ 
   GatherBy[Rule @@@ EdgeList[euc], First];

rep = replacements;

Monitor[
 Do[rep[[n, 2]] = Sort[Flatten[rep[[n, 2]] /. rep]], {n, 1, 
   Length[rep]}], n]

Text[Grid[
  Transpose[
   KeyValueMap[{EuclidVertexName[#1], #2} &, 
    KeySort[Counts[rep[[-1, 2]]]]]], Frame -> All, 
  Background -> {None, {bookColorDarker[0], None}}]]

Here is a plot of the lengths of axiom sequences for all the theorems, shown on a log scale:

Module
CloudGet["https://wolfr.am/PJKo9Lnq"];

replacements = 
  (First[First[#]] -> Last /@ #) & /@ 
   GatherBy[Rule @@@ EdgeList[euc], First];

rep = replacements;

Monitor[
 Do[rep[[n, 2]] = Sort[Flatten[rep[[n, 2]] /. rep]], {n, 1, 
   Length[rep]}], n]

Module[{dataA = 
    If[MissingQ[#[[1]]["Book"]], 
       Nothing, #[[1]]["Book"] -> Length[Last[#]]] & /@ rep, vals, 
   acc, xval},
 vals = CountsBy[dataA, First];
 acc = Association[
   MapIndexed[First[#2] -> #1 &, 
    Accumulate[Values[CountsBy[dataA, First]]]]];
 xval = Association[#[[1]] -> (#[[2]] - vals[#[[1]]]/2) & /@ 
    Normal[acc]];
 Show[{ListLogPlot[Values[dataA], Axes -> {False, True}, 
    Filling -> Axis, Frame -> True, 
    FrameLabel -> {"theorems by book", "length of axiom sequence"}, 
    FrameTicks -> {{True, 
       False}, {{#[[2]], #[[1]], {0, 0}} & /@ Normal[xval], False}}, 
    ColorFunctionScaling -> False, 
    ColorFunction -> 
     Function[{x, y}, 
      Piecewise[{{bookColorIntense[6], 
         x <= acc[6]}, {bookColorIntense[10], 
         x <= acc[10]}, {bookColorIntense[13], x <= acc[13]}}]], 
    Joined -> True], 
   Graphics[{GrayLevel[0.5], 
     Line[{{#, -5}, {#, 10^7}} & /@ Values[acc]]}]}]]

Interestingly, 3.18 isn’t the theorem with the longest axiom sequence; it’s only in 4th place. Here are the top 10 (results with intermediate theorems allowed are shown in gray):

CloudGet["https://wolfr.am/PJKo9Lnq"];

replacements = First[First[#]] -> (Last /@ #) & /@
   GatherBy[Rule @@@ EdgeList[euc], First];

rep = replacements;

Monitor[
 Do[rep[[n, 2]] = Sort[Flatten[rep[[n, 2]] /. rep]],
  {n, 1, Length[rep]}], n]

Text[Grid[
  Module[{uu = Length[Last[#]] & /@ rep, vv},
   vv = Complement[VertexList[euc], axioms][[
     Flatten[Position[uu, #] & /@ TakeLargest[uu, 10]]]];
   {EuclidVertexName /@ vv, TakeLargest[uu, 10],
    Style[Length[VertexOutComponent[euc, #]], GrayLevel[.6], Italic] & /@ vv}],
  Frame -> All,
  Background -> {None, None,
    MapIndexed[Flatten[{1, #2}] -> bookColorDarker[#1] &,
     {10, 12, 12, 13, 10, 10, 10, 10, 10, 10}]}]]

(10.72 is about the addition of incommensurable medial areas, and is never referenced anywhere; 12.14 says that the volumes of cones and cylinders with equal bases are proportional to their heights; 12.15 says that the heights and bases of cones and cylinders with equal volumes are inversely proportional; etc.)

Here’s the distribution of the lengths of axiom sequences across all theorems:

CloudGet["https://wolfr.am/PJKo9Lnq"];

replacements = First[First[#]] -> (Last /@ #) & /@
    GatherBy[Rule @@@ EdgeList[euc], First]; rep = replacements;

Monitor[
 Do[rep[[n, 2]] = Sort[Flatten[rep[[n, 2]] /. rep]],
  {n, 1, Length[rep]}], n]

Histogram[
 Module[{vod = #[[1]]["Book"] -> Log[10, Length[Last[#]]] & /@ rep, dataG},
  dataG = GroupBy[
    If[MissingQ[#[[1]]], 0 -> #[[2]], #[[1]] -> #[[2]]] & /@ vod,
    First -> Last];
  Flatten[Join[Values[dataG[[Key /@ #]]]]] & /@
   {{0}, {1, 2, 3, 4, 5, 6}, {7, 8, 9, 10}, {11, 12, 13}}],
 {.2}, Frame -> True,
 FrameLabel -> {"length of axiom sequence", "number of theorems"},
 FrameTicks -> {{None, None},
   {Table[{n, Superscript[10, n]}, {n, 0, 6}], Automatic}},
 ChartLayout -> "Stacked", ChartBaseStyle -> Opacity[1],
 ChartStyle -> {bookColorIntense /@ {0, 6, 10, 13},
   EdgeForm[Directive[Thin, GrayLevel[0.15]]]}]

We can get some sense of the dramatic value of “remembering intermediate theorems” by comparing the total number of “intermediate steps” obtained with and without merging different instances of the same theorem:
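To see why this difference is so dramatic, here is a minimal sketch on a small hypothetical dependency graph (not Euclid’s; the symbols v, a, b are just made-up theorem names). Without merging, every shared dependency gets re-expanded each time it is used, so the step count grows with the number of paths; with merging, each theorem is counted only once:

(* a hypothetical chain of "diamonds": v[k] is proved from a[k] and b[k],
   both of which are proved from v[k-1]; v[0] plays the role of an axiom *)
toy = Graph[Flatten[Table[
     {v[k] -> a[k], v[k] -> b[k], a[k] -> v[k - 1], b[k] -> v[k - 1]},
     {k, 1, 5}]]];
deps = Normal[GroupBy[Rule @@@ EdgeList[toy], First -> Last]];

(* "unmerged": repeatedly expand dependencies, keeping duplicates (gives 124) *)
Length[Flatten[Most[FixedPointList[Sort[Flatten[# /. deps]] &, v[5] /. deps]]]]

(* "merged": just count the distinct theorems reachable from v[5] (gives 15) *)
Length[VertexOutComponent[toy, v[5]]] - 1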

CloudGet["https://wolfr.am/PJKo9Lnq"];

(* unmerged = ResourceFunction["ParallelMapMonitored"][
    If[MissingQ[#[[1]]["Book"]], Nothing,
      #[[1]]["Book"] -> Length[Flatten[Most[FixedPointList[
           Sort[Flatten[# /. replacements]] &, #[[2]]]]]]] &,
    replacements]; *)

(* unmerged is the iconized precomputed result of the computation above: a list of
   465 rules of the form book -> total number of steps, running from 1 -> 3, 1 -> 14,
   1 -> 21, ... up to 13 -> 14412576 *)

merged = VertexCount[Subgraph[euc, VertexOutComponent[euc, #]]] & /@
   Complement[VertexList[euc], axioms];

Module[{dataA = unmerged, vals, acc, xval},
 vals = CountsBy[dataA, First];
 acc = Association[MapIndexed[First[#2] -> #1 &,
    Accumulate[Values[CountsBy[dataA, First]]]]];
 xval = Association[#[[1]] -> (#[[2]] - vals[#[[1]]]/2) & /@ Normal[acc]];
 Show[{
   ListLogPlot[Values[dataA], ColorFunctionScaling -> False, Filling -> 1,
    FrameLabel -> {"theorems by book", "number of intermediate steps"},
    FrameTicks -> {{True, False},
      {{#[[2]], #[[1]], {0, 0}} & /@ Normal[xval], False}},
    ColorFunction -> Function[{x, y},
      Piecewise[{{bookColorIntense[6], x <= acc[6]},
        {bookColorIntense[10], x <= acc[10]},
        {bookColorIntense[13], x <= acc[13]}}]],
    Joined -> True, Frame -> True],
   ListLogPlot[merged, Filling -> 1, FillingStyle -> LightGray,
    Frame -> True, Joined -> True],
   Graphics[{GrayLevel[0.5],
     Line[{{#, -5}, {#, 10^8}} & /@ Values[acc]]}]}]]

For example, for 8.13, 229 steps are needed when intermediate theorems are remembered, while 14,412,576 steps are needed otherwise. (For 10.72, it’s 184 vs. 23,921,481 steps.)

Superaxioms, or What Are the Most Powerful Theorems?

Euclid’s 10 axioms are ultimately all we need in order to prove all the 465 theorems in the Elements. But what if we supplement these axioms with some of the theorems? Are there small sets of theorems we can add that will make the proofs of many theorems much shorter? To get a full understanding of this, we’d have to redo all the proofs. But we can get some sense of it just from the theorem dependency graph.
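One crude way to read this off the dependency graph alone: the most that a theorem t can save in the proof of a theorem x is the part of x’s proof graph that lies inside t’s own proof graph, since that is all we could ever avoid re-proving. Here’s a minimal sketch of that estimate (the helper name is ours, and it gives only an upper bound; the more careful pruning actually used below is the PruneSubgraph function defined in the code further on):

(* rough upper bound on the shortening from taking t as a superaxiom in the proof of x:
   the vertices of x's proof graph that also occur in t's proof graph, excluding t
   itself (which would still be cited); assumes t actually appears in x's proof *)
superaxiomSavingBound[g_, x_, t_] :=
 Length[Intersection[VertexOutComponent[g, x], VertexOutComponent[g, t]]] - 1

(* e.g. superaxiomSavingBound[euc,
     <|"Book" -> 1, "Theorem" -> 12|>, <|"Book" -> 1, "Theorem" -> 7|>] *)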

Consider the graph representing the proof of 1.12, with 1.7 highlighted:

CloudGet["https://wolfr.am/PJKo9Lnq"];
gg = With[{g = 
    Subgraph[euc, 
     VertexOutComponent[euc, <|"Book" -> 1, "Theorem" -> 12|>]]}, 
  EuclidGraphLarge[g, GraphLayout -> "LayeredDigraphEmbedding", 
   VertexStyle -> (Flatten[{# -> {EuclidVertexStyle[#]} & /@ 
        Complement[
         VertexList[g], {<|"Book" -> 1, "Theorem" -> 7|>}], <|
         "Book" -> 1, "Theorem" -> 7|> -> Opacity[.6, Red]}])]]

Now imagine adding 1.7 as a “superaxiom”. Doing this gives a smaller proof graph for 1.12, with 4 fewer nodes (and 14 fewer connections):

CloudGet["https://wolfr.am/PJKo9Lnq"];

PruneSubgraph[graph_, subgraph_] :=
 Module[{subBranches, branches, prunecheck, prunable},
  subBranches = Sort[Tally[Last /@ Union[EdgeList[subgraph]]]];
  branches = Tally[Last /@ Union[EdgeList[graph]]];
  prunecheck =
   Sort[Select[branches, MemberQ[First /@ subBranches, #[[1]]] &]];
  prunable = Table[
    If[prunecheck[[n, 2]] == subBranches[[n, 2]], prunecheck[[n, 1]],
     Sequence @@ {}], {n, 1, Length[prunecheck]}];
  If[Length[prunable] == 0, "Same graph",
   Graph[Select[EdgeList[graph],
     Not[MemberQ[prunable, First[#]]] &&
       Not[First[#] == First[VertexList[subgraph]]] &]]]]

With[{g = PruneSubgraph[
     Subgraph[euc,
      VertexOutComponent[euc, <|"Book" -> 1, "Theorem" -> 12|>]],
     Subgraph[euc,
      VertexOutComponent[euc, <|"Book" -> 1, "Theorem" -> 7|>]]]},
 EuclidGraphLarge[g, GraphLayout -> "LayeredDigraphEmbedding",
  VertexStyle -> (Flatten[{# -> {EuclidVertexStyle[#]} & /@
        Complement[VertexList[g], {<|"Book" -> 1, "Theorem" -> 7|>}],
      <|"Book" -> 1, "Theorem" -> 7|> -> Opacity[.6, Red]}]),
  AspectRatio -> 1/3]]
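As a sanity check on the counts quoted above, the reduction can be computed directly with the PruneSubgraph function just defined; a sketch (the exact numbers depend on how isolated vertices in the pruned graph are counted):

With[{g0 = Subgraph[euc,
     VertexOutComponent[euc, <|"Book" -> 1, "Theorem" -> 12|>]]},
 With[{g1 = PruneSubgraph[g0,
     Subgraph[euc,
      VertexOutComponent[euc, <|"Book" -> 1, "Theorem" -> 7|>]]]},
  (* how many nodes and connections pruning removes *)
  {VertexCount[g0] - VertexCount[g1], EdgeCount[g0] - EdgeCount[g1]}]]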

What does adding 1.7 as a superaxiom do for the proofs of other theorems? Here’s how much it shortens each of them:

CloudGet["https://wolfr.am/PJKo9Lnq"];

PruneSubgraph[graph_, subgraph_] :=
 Module[{subBranches, branches, prunecheck, prunable},
  subBranches = Sort[Tally[Last /@ Union[EdgeList[subgraph]]]];
  branches = Tally[Last /@ Union[EdgeList[graph]]];
  prunecheck =
   Sort[Select[branches, MemberQ[First /@ subBranches, #[[1]]] &]];
  prunable = Table[
    If[prunecheck[[n, 2]] == subBranches[[n, 2]], prunecheck[[n, 1]],
     Sequence @@ {}], {n, 1, Length[prunecheck]}];
  If[Length[prunable] == 0, "Same graph",
   Graph[Select[EdgeList[graph],
     Not[MemberQ[prunable, First[#]]] &&
       Not[First[#] == First[VertexList[subgraph]]] &]]]]

Module[{dataA = Function[t,
     (#["Book"] ->
         If[Order[#, t] != -1, 0,
          With[{g = Subgraph[euc, VertexOutComponent[euc, #]]},
           Catch[VertexCount[g] -
             VertexCount[
              If[! GraphQ[#], Throw[0], #] &[
               PruneSubgraph[g,
                Subgraph[euc, VertexOutComponent[euc, t]]]]]]]]) & /@
      Complement[VertexList[euc], axioms]][<|"Book" -> 1,
     "Theorem" -> 7|>], vals, acc, xval},
 vals = CountsBy[dataA, First];
 acc = Association[MapIndexed[First[#2] -> #1 &,
    Accumulate[Values[CountsBy[dataA, First]]]]];
 xval = Association[#[[1]] -> (#[[2]] - vals[#[[1]]]/2) & /@ Normal[acc]];
 Show[{
   ListStepPlot[Values[dataA], Axes -> {False, True}, Joined -> True,
    Frame -> True, FrameLabel -> {"theorems by book", "shortening"},
    FrameTicks -> {{True, False},
      {{#[[2]], #[[1]], {0, 0}} & /@ Normal[xval], False}},
    Filling -> Axis, ColorFunctionScaling -> False,
    ColorFunction -> Function[{x, y},
      Piecewise[{{bookColorIntense[6], x <= acc[6]},
        {bookColorIntense[10], x <= acc[10]},
        {bookColorIntense[13], x <= acc[13]}}]],
    PlotRange -> All],
   Graphics[{GrayLevel[0.5],
     Line[{{#, -5}, {#, 12}} & /@ Values[acc]]}]}]]

(The largest shortening is for 1.8, followed by 4.1.)

So what are the “best” superaxioms to add? Here’s a plot of the average amount of shortening achieved by adding each possible individual theorem as a superaxiom:

CloudGet["https://wolfr.am/PJKo9Lnq"];

(* res = ResourceFunction["ParallelMapMonitored"][
    Function[t, t ->
      (If[Order[#, t] != -1, 0,
          With[{g = Subgraph[euc, VertexOutComponent[euc, #]]},
           Catch[VertexCount[g] -
             VertexCount[
              If[! GraphQ[#], Throw[0], #] &[
               PruneSubgraph[g,
                Subgraph[euc, VertexOutComponent[euc, t]]]]]]]] & /@
        Complement[VertexList[euc], axioms])],
    Complement[VertexList[euc], axioms]]; *)

(* res is the iconized precomputed result of the computation above: a list of 465
   rules, one for each theorem, giving that theorem's per-theorem shortening data
   (stored as CompressedData) when it is added as a superaxiom *)
       "], 
      Association["Book" -> 2, "Theorem" -> 13] -> CompressedData["
1:eJylUcENAzEIAxtIKt0SXakj3AKd9TaqyfVTVe3nrASsQIgh9/352N3MjjZX
AGemmQfluWVlRM4JBMjiTHK7TXHUyDACsMTPchEIlZIFxqjmMDDEqmIGL8r9
wv+CaKW5ZKGVQY20pgX0FkTadbiPV/ydtHohuosz5+zujDEX7aWr6zWXHqVo
nBbuClmX1B+xMreRGrZO/QOBHlMnYnhNp70AIrAFww==
       "], 
      Association["Book" -> 2, "Theorem" -> 14] -> CompressedData["
1:eJylUMsNQjEMy99Nt2AlRmABZmUjnAcICQkuWGqSNq5j5XS5ni8qIrcJf8EO
BRU1M01TNa1MMbfUECAr2SnYAu/kmLh9VQNWWHg0uh1oeLdUoyiElch/7X5i
/exyuEhM3ns8FOKFtaYGEYiqrmKfL3nEeBC7eWJewA+DUVjkPxk9OVlsDrLZ
iwq3Jy5QVeduI9wzfe+ducXTfcUbNhY2jUk6Z2ukhd4BqdwHNw==
       "], 
      Association["Book" -> 3, "Theorem" -> 1] -> CompressedData["
1:eJylUNsRgCAMa9J+eG7hSo7AAs7qRvYB6nmePwYoj5YQsrRtbRCRPcI/JIPK
PAGonXqnUCGDnWeZL+yDTB3Ugm/ImmNU+rfcB/iZHe8BoSTidTHXjMkbXaqn
szR/il4ULxAZE+XRcKqMQddhxk6bDppq3VUblr2pLdr7yQF9XQOY
       "], 
      Association["Book" -> 3, "Theorem" -> 2] -> CompressedData["
1:eJylUNERglAMa9o8BJnClRyBBZzVjUxbuFM/+CEfvaMJL2ke2+u5wczeOS5i
vplx9rGuC+w+jRFkcNBDEC8LlwI+lZx24ukC/BfWOyCJ63H/DM9ZbwmZQXKi
k1SWnsrWiYO50fG55y5CssShUScA0bNIBML7MKnVloQGHZ0K1mcaH4+F+Xd9
3pVpXX75I+wDM5MEaQ==
       "], 
      Association["Book" -> 3, "Theorem" -> 3] -> CompressedData["
1:eJy1UMENAjEMs+P0uAFYgJUY4RZgVjYicY+TTkLigYjaqErs2M1te9w3Anh2
+jWuQF6kdR2AUoI0Rg4sWjQRATAkixFfNN/tONdIxif436NUw8qdy0SwA3Y0
vU3HZa8b7h8gdjd4fNoIzuzypHh+L0imyMxEmpE1zY+6Om9v36+1sR+8AJNx
A44=
       "], 
      Association["Book" -> 3, "Theorem" -> 4] -> CompressedData["
1:eJy1UMENwkAMs+Nc2zVYiRG6ALOyEYmPVqqExAeiu+gUx7Evt/1x3wng2ekH
kau0bQNQSpDGyIFFiyYcAEOyGPFF84DjWiMZn9r/HqUaVu5cJoIdsKPpbTou
ew0YP5vYaPD8tDs4s8uT4vm9IJkiMxNpRtY0P+rqur1jI62N98ELdBwDeQ==

       "], Association["Book" -> 3, "Theorem" -> 5] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 3, "Theorem" -> 6] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 3, "Theorem" -> 7] -> CompressedData["
1:eJytUMkNwzAMo6RKcR55dYKu1BGyQGbtRiXloEAf7SsEJOuyTPOxH8/dALzk
rsBqXlUIN7NgWqOWjARTGFh1QAYE/PeaEDL7YOKuHdmx2rlcRPeDP1z0XtJR
ItL3SFoonhc9Tr4QO3Lsj7OulfOrzZpdb9/QberhetlUnur0/BjGKa0vdszu
tZlwm0VO8pSeXyARdzuFbrwBClUEVA==
       "], 
      Association["Book" -> 3, "Theorem" -> 8] -> CompressedData["
1:eJytUNsNwkAMs53rlTVYiRG6ALOyEXHuWvUDEEL4pCS6OA/nut1vGwE8bP6C
iN4vZEukiVhDrbcmgGLZaAtyHKH3XWhMFxVz//XPp9Lf8IV+WZxXkWVQmtug
VnWL0URRi1b+IBVR3DlwooT5KtMnW0OmkMkc40Bc8p0GJSdcC74Axml9aTwB
pdQDwg==
       "], 
      Association["Book" -> 3, "Theorem" -> 9] -> CompressedData["
1:eJy9UNsNgDAI5E4/XMOVHKELOKsbyQGNNTHxx0gaKMf71rZvDWZ2SH0jC+DN
KE1pGTJjMnBwkjO/zOxh3jFU+e9CzY4lYos8DuEnXBu7hy6ZgwQjsa7CIOwV
xZN/LJkLJudqfo2LjEf2MDw7ARBVAyI=
       "], 
      Association["Book" -> 3, "Theorem" -> 10] -> CompressedData["
1:eJytUMERwzAIkwSLZKWMkAU6azcKWHbau17zin0GWzIg2I7XfhDAu81DKzMA
1YUSE1lOMlWOtREgmfxftGgO87XgY/Y5ubPgPUt/iWghbSe2tFifFSsaUUTj
ccVzPFZXPZGA7SBrNqKT1diwOsaY08zdBAzqR7HTgp/IEzliA1s=
       "], 
      Association["Book" -> 3, "Theorem" -> 11] -> CompressedData["
1:eJytUEESwkAIg5Cs4/gBr37JJ/QDvrU/MlAPXvTUbJsuDIWQx/Z6bhkRe9NZ
uN8QqwCSWtdVoug05hGqpkod/QJZcxoHk07SPdGN60S5g/8NyaNkrRZi9nbl
F/2pgws+1lxUZyh1XlPUqjlB4bON7wsabqsAwf9WD5IicoY6ZOTKi6eTmTkl
3r7d+Jbs+/hkUxsB2t03n0AFSQ==
       "], 
      Association["Book" -> 3, "Theorem" -> 12] -> CompressedData["
1:eJytUNEVwkAIgxDOp1u4kiN0AWd1IxPOD3/0q7k2PXgUQu7H83FkRLxMp+GG
WAWQ7HVd1Wwqi3kaVVPUjn6BrDnGZlJJqifcuM6Ua/xvSO6StSxErO1KL/yp
zQUdaS62M+x2vqfIqjlB4bON7gs9bKuAhv4tD+qOyBmqkJErL5pOZuaUaHu7
8S1Z9/FJphoByt03ePkFMA==
       "], 
      Association["Book" -> 3, "Theorem" -> 13] -> CompressedData["
1:eJytUMERgCAMaxKOPVzJEVzAWd1I2oinH1+WIy2UljTLtq8bIuJI+NHQNaw1
jQVRzP4IRiig+kyf9cMKHhbezv5LN7l90/ETKYkkXneTi/mZMWs+KgeE7nrU
YU7FksJYyaEN4Wat4VYgfWd3TMuYMd/6cPrnPCdT2wN9
       "], 
      Association["Book" -> 3, "Theorem" -> 14] -> CompressedData["
1:eJytUMERwjAMsyXbKceLBwOwEiN0AWZlI5QQjvYBr+qhc07SxdZtfdxXN7Nn
pwNxZWWRRDAQABO8nBeQnqwwAm6Jn/lQSvGRjVahBwwMTVGUeuy6ZvyrAm8L
hOikQzTFHD7ghnfIJHInkRvzEKc2f4PpSnNaJ3k5GjNU5sL0sYh/ke6B1ntR
b2h+aoq+AJ39BTE=
       "], 
      Association["Book" -> 3, "Theorem" -> 15] -> CompressedData["
1:eJytUNsRgDAIS4BT13AlR+gCzupGQqmeH/auH80Hj5LQHHs5j0IAV4SZUDOj
QyAUz0pZt0U0OlPECDb+KVuI7Dsx2+6wh9dLRYfRFfOjZHY/uhj73dRjUhh1
0uSRa1uU8Gc0O3FYmN8INyjsAyk=
       "], 
      Association["Book" -> 3, "Theorem" -> 16] -> CompressedData["
1:eJytkLERhTAMQ2UrR5GKFViJEViAWf9GX7JTUFHhS8zFxNZTjus+rwDwc/o0
9jmZSYayvqpIIoGZ3NMX5lu3eyJXhEInpLNPGvk17vvA0tNmm6Epmiuar3kX
MYtR1lXnulQOGPF0pbehc/3UquES2rZ+rIRNQ3UWwxijtQKl9gB0h8cTpefG
wB/gTQQu
       "], 
      Association["Book" -> 3, "Theorem" -> 17] -> CompressedData["
1:eJytUEESAjEIIwGdXhyPnv2ST9gP+FZ/ZCDreNLT0k7aAg2B+/Z8bIiIV8Ox
drmCZKFwYiaCSRBY57pVx1fk78+QETaOw6v0LBEdLvc/4dSjZdky1VupwRY4
KL05ofh42AcH4eaNkxRfz0yGMbldqAou5u10XUpjHDIGmDuTBa5OYOwjSnHp
+gbVdAQS
       "], 
      Association["Book" -> 3, "Theorem" -> 18] -> CompressedData["
1:eJytUMERwzAIQxL99pl/V8oIWaCzdqMKcNNX+4rskzHGQsfjeO4HIuJVdDE2
SBSEpJIBxwQzdWc9Z/D3XxjEoMswS75aFZfb/S/Y/Ti2BjZRRlgmh+1X/RSf
DOtgM+z55C6Kb4ZN0bXVKBPTbPaUO0hplGuYay4Lt2VyRiRrOXwDqMED5g==

       "], 
      Association["Book" -> 3, "Theorem" -> 19] -> CompressedData["
1:eJytUMsVwzAMEoj03BW6UkfIAp21G1USTnrLKdgPf8A21mv/vHdExLfpbiCT
grAxlSgCCW35ZKuKvDhaIIxxw021VJZye9ZrtWU6llFfo0R2SHPlzZHi2GEP
HEZlPnlM8d/hUIy3H5Lgx9xtr4kyfTMDXHVZeKyQLlHZulo/nKgD4w==
       "], 
      Association["Book" -> 3, "Theorem" -> 20] -> CompressedData["
1:eJytT8kNAjEM9DmOIyHYD39aooRtgFrpCHsBIR7wWsenZuJMLuvtujIR3Tvt
bueTsPFCB5rhbiLiQ+aUQcIs5L8fzczhwKialgkgQcjUclXDsrfU+IsClbRr
RMLMXd8W0b2XaR2Hd7RGs4btRapQr8E36kZXaHM2RjTSXeuo/VQDjMQowXoU
KnJ9G5+VXP5lKGV15QmasdIDpmQGpA==
       "], 
      Association["Book" -> 3, "Theorem" -> 21] -> CompressedData["
1:eJytUDESwzAIA0uAr70M2bv0S3lCPtC39kcVuQxd2ik6DLaMjU7P/bXtbmbv
TtfjYUZbfPV1ZnKMkdOXhdPg7pa/h0YECVA1JgUQpmNGpID71Urj7y2gVIqS
HiiQWdVKJAgtSEgcBVHN8FBacTZpJZTPHjRTSTZfnRQs/a9B7jQM733ICbeb
KPkh/9hPCWc528MvMEo+ySTZ5hke9gFq8AaS
       "], 
      Association["Book" -> 3, "Theorem" -> 22] -> CompressedData["
1:eJy1UNsNAkEI5DHccIm5DzuwJUu4BqzVjhwuGr/0SyfswDKwS7jst+vuZnYf
+gtg7Sc/ryQigotvG9oUu3V87KoqIBPytUJIpOnKKpJL8tdj1lc1U9Rm1P+V
siS7OdBFnALzcFmcDI5Ju55FOpTyqslDJTD5HpKhu2cOd1iGs2asKrfFwrUP
7Q/TinS0wnjDJekBqFGrqmDF4g9bKQa7
       "], 
      Association["Book" -> 3, "Theorem" -> 23] -> CompressedData["
1:eJy1UNsNgDAI5GiNJPy4gis5QhdwVjeyUE3rI/1S0vIoBwed07okENFm6g/R
qKykR8REsBOyj9Ap40ZyABRrt6S/nrO/f+WzSUzXQvdhBpYFXpqVRwc6FHcE
WkeEaXDKyaij8EicExLOL3usjzLCte0OVh0Dbw==
       "], 
      Association["Book" -> 3, "Theorem" -> 24] -> CompressedData["
1:eJy1ULENwzAMI0ULGTt06dqXckIeyK35KLJQ24WBdGoImBJoi5L83vZ1I4Cj
0i0oLz7xAEEzGoMhCpLJdVnFQNIUwwBWs3+P+duw9eMXgFglpWnergzfvB3c
9vkoGJTHnfFHNVtqaXF66oqGacv+vEFtNsSvqigidAJ/wQOf
       "], 
      Association["Book" -> 3, "Theorem" -> 25] -> CompressedData["
1:eJy1UNsNwkAMy8MehJUYoQswazfCDqJUquhX69NZed3FyWN5PZeMiNV0D5o6
qExUiRvoThuJ/01JUIU08CGEDEW6W97VKnGedboty1LgEb7Y23YPEcOquT0Y
TMSlU73lPVjVXFITB6MgCrUtC4EXU9D2kD9on7Mn5PyYYHa8AW+eBWc=
       "], 
      Association["Book" -> 3, "Theorem" -> 26] -> CompressedData["
1:eJy1UEEOgzAMsxMHrrvwgH1pT+ADe+t+RJpRKiFtJ7BUp3Ib1+lzfb9WAvg0
ugm24AGCZjTSDU6HZD7pZw8TRaeaBrDmdXXI//OTI1ZHDmZf6ZT3UIZvnQ7u
8+wKBtWKYP5R282tVcEo3fPBsuVxvcN7NmiSQlmhDWSRA5U=
       "], 
      Association["Book" -> 3, "Theorem" -> 27] -> CompressedData["
1:eJy1UMsNQjEMs5v/EwNwQmIlRmABZmUjkl44wQncNHVrS7F6vT9udwJ4Tvsb
zkiPUHLFoZlMkBQcn4dWpXumV0OqIrISubk08vLriPldHVmByVRp5i6iKjtK
NhdvSC/v2GrzYmbd1bZpUpd4X3xbxzR8PCY6Yp+mqnuQElS6YQlq0U4Cuo88
u9Fmdr2xyIij/xijKqfwApWFBss=
       "], 
      Association["Book" -> 3, "Theorem" -> 28] -> CompressedData["
1:eJy1kNERwzAIQ3mgxgNkgqzUEbJAZ+1GxZzjXHrX/kUfMgYJg7f99dwxs3en
28AKhjsOyOVhEtHiv+vr7EgH4bcO+xOXV3MZInKdXijOuk/pkacoLznz5NFh
ZigyjkaSmypsXSeZyhYMCUN/4vzJpcVjEZmJDzBcA2Q=
       "], 
      Association["Book" -> 3, "Theorem" -> 29] -> CompressedData["
1:eJy1UMENAjEMc9rEORBPFmAlRrgFmJWNiNuKe8ELrDRN5aS2ctsf990APJX+
B7sw081antupAlY1ts+iEd7D6wg+Affw8Qq//tphfGVLHCCQKTuk7Mzsb3sH
BhGDX026uIiFGssxzMWERigftRxYR5agI5rZJgvJqjgVG2nsdqAo/aclaXOm
wAvohAYY
       "], 
      Association["Book" -> 3, "Theorem" -> 30] -> CompressedData["
1:eJy1UNsRgCAMS0qqc7iSI7CAs7qRyEPhw/OL3LUNNH1ct3jskQDO200EkQdY
YqRZgMTg+qkZY+WpweRlPzBMZUPhJc8uWzWoIhKPb5r2U2RgayQZlOmazKT6
CnxPwn4HdjdZXO6exAgX8f4DNA==
       "], 
      Association["Book" -> 3, "Theorem" -> 31] -> CompressedData["
1:eJy1ULsVAjEMsyPLse9WuIaVGOEWYFY2QuFR0EAFKpzETvTJ5bxdTzez+yr/
xF7FMUbNcRyYBvdh+Vk0kxFApLBTAGFgdmZ3J7Zf++PXKaAizSr5gdyge9t6
QYfuglB4LuBcnSCrqqnteiXTDXVed1aUro4QQW9rWNmxoknIXVkllstWwlVh
SQ53/UpkhFFkeMMQI4sMQ8iA7+nTHpPZB0E=
       "], 
      Association["Book" -> 3, "Theorem" -> 32] -> CompressedData["
1:eJy1UEEOwjAMi+0wbeKE4AN8iSfsA7yVH81Zq2oc4LZItRI3cd081/drRUR8
Ck6N24OZvF+xzJjjIiGkfwPoJ45dzqlk5JlWfwSPhUSQwO5yR5sdTgfPgir8
3YFdYTAsqIpkm5ZTmK03BaQ7mmCTsywnL+IrukFfetETvKUNkrQD1A==
       "], 
      Association["Book" -> 3, "Theorem" -> 33] -> CompressedData["
1:eJy9ULkRwzAMw6OcipSpUmYlj+AFPKs3CinqHLtJl0ASxaNICIfXui0rAewZ
fosnKT3u7B0N3Sbob/2cB+cuxZaY999x+TNVDEReESdVR12oLmqUQvlhdOUZ
iyrnNRkcaYwMA5y8HoT1HKSxHPZ9IHsaRdyk3ujGN189A58=
       "], 
      Association["Book" -> 3, "Theorem" -> 34] -> CompressedData["
1:eJy1UMERwjAMsyT74MWLBViJEboAs7IRllt68IBfdRfFseTYyW153BdExNN0
MEjxesH5hIqSENQ/O7YVn66OOaVHDvoDXz0zCRKYKYd72Hyre54mIyeViP2j
19hMiz6RW4/sEOxUO+a1nAtbpgXvFRpwkK7U2lpUFZR4AVXRA6o=
       "], 
      Association["Book" -> 3, "Theorem" -> 35] -> CompressedData["
1:eJy1kMENAjEQA3dt74YfHxqgJUq4Bqj1OsKBk7h7wAssxYo0jmL5utxvS0bE
Ou3fupwBZLEZJWVIH6MliKQdUHeVrwGqJTVLP6+L73RiRnQDhSnSxVzuZZu4
84PGIMYBkbvwE25s+w1eR5GMaWROmj6ewsoZV75VmYJH9Ui2wjj56QMvfgTz

       "], 
      Association["Book" -> 3, "Theorem" -> 36] -> CompressedData["
1:eJy1kMENw1AIQ7ENzRA5ZaWMkAU6azcq/pHaJFJ7ai2EkDDiwbLd1w0R8XD6
u2YCTN4YUxWj6qNTAi2oJ+gEBKCSVEzp12jf74fb7GiMhgEGnTQKvHQoz+pr
URcD33lv7vIimieZAYUCmeoXcOxS700PFXRQmcU47WbHZM8TsVAENw==
       "], 
      Association["Book" -> 3, "Theorem" -> 37] -> CompressedData["
1:eJy9UEEOgzAMs51UgMQn9qU9gQ/srfwIhyIx0LTb5rRWGketm8fyei4EsBb9
HhK9mjBPo6DxSyNYgOCkqJdD8k79x+4VLCsoR3Xo9nrpreMzMom8NfDkLp6K
76yfO8LEaB6Bw7ODmVlvNsQO7UhLhzNnmYMaYwM1bQN5
       "], 
      Association["Book" -> 4, "Theorem" -> 1] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCLoB+hr20gCyCELAJZUAnw=
       "], 
      Association["Book" -> 4, "Theorem" -> 2] -> CompressedData["
1:eJy9UEEOgCAMW1uIHP2CX/IJfsC3+iM3IEQPetMmNGN0W9my7esGMzuC/sCM
MiFblmDUmxL92FXlMZW89EuPD7jNlAgSqC4ru9nhdOQZFFBN+afHolsczHiM
G9lnJA9BTzUFjbUhWzt/MuZQD6SoVBstppLhWzoBK08Dgg==
       "], 
      Association["Book" -> 4, "Theorem" -> 3] -> CompressedData["
1:eJy9UEEOwCAIo+D8h1/aE/zA3rofraCJ0WTLLlsjHLAFSqnHXiEip6dfgLwh
i6qC6ZHYQ7AWQemHG74Cd4iYi/dsDD8TuTtclIm3AS/kEyzcojn3TvxgwUQx
kILZRhlfqC/aVwL5
       "], 
      Association["Book" -> 4, "Theorem" -> 4] -> CompressedData["
1:eJy9ULENgDAMix1VYmBm5yVO6APcykfEaSlMTAgrcpo4ldOudd8qzOwQ/QNy
SrPlbcjdCZKRHUJURjhaSf98q3dVMpRjJRFxoZ2zlRsrc/TBPpQv6MyhtQ4e
JKNSIDPAPMIkyB79L2LMWtyYQ6frevqlbidr2wPF
       "], 
      Association["Book" -> 4, "Theorem" -> 5] -> CompressedData["
1:eJy9kMsNAkEMQ5N1vrPUgERLlEAD1EpHOCDECU6Id7AyTqR4crpczxcVkdvI
nzhaCVQ3qc9LI9wMsCC7EzgEHntE78uxfh3Kv3YBSk8sgmGtzu7MZDTqWIl+
tOzlUGskM2Jl0OrEY3SGns6i01OmJ6oquEiVfxVJZjIJKMNtEmabqj1vIZbg
+w1r9/I5UsBKD64pdwD4BpY=
       "], 
      Association["Book" -> 4, "Theorem" -> 6] -> CompressedData["
1:eJy9kMEVAkEIQ2ECgV2rsCVLsAFrtSPD7smDnnzmwLwHA/lwvT9udzez54R/
aZXBfRk/m5IZAQSlS0pIGJJNdjex/5opv1YBBXlWiQdD1r33wFFok5OIwvFy
MhEDvx0bSH00lko4P1Wz1NwMnpOKMavJyF27mm01WIQr6ljkcs8ZG2FZK7He
pFm6kyFk4zud9gK8vwZY
       "], 
      Association["Book" -> 4, "Theorem" -> 7] -> CompressedData["
1:eJy9UNsRgDAIC4EP13AlR3ABZ3Ujgfj60S/PtJe2kAu5jvMyzQZgLfoNUTTA
nxWWoAnsglbkMzw7X0d6N+x5VCzBnYwgK6Q483q3cFRYB5stM5/cIlwVNqG1
NSjCNExb8ryEu5wJK+dbwKEExP5Fnl553QAxzAOc
       "], 
      Association["Book" -> 4, "Theorem" -> 8] -> CompressedData["
1:eJy9UMkNwzAME0klmaMrdYQs0FmzUSXa9a99FSFgQgch0Xqcr+eJiLiabgQB
xPF9abVZmpZBGIgRU9qQ/zb0+//oNm1rQiL70U7tk8spVp2mSkQsnhNWhSZY
60VkUMhaqHBZPsVnZufcoWzIKC9A+kjqK2Vp4w1G6APr
       "], 
      Association["Book" -> 4, "Theorem" -> 9] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCLoDNjpb+XQAIyMjGAMZIEwGwOMxQiLJdyxxYbEBgCj
JAKa
       "], 
      Association["Book" -> 4, "Theorem" -> 10] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCLoCWTFGBlZpHFKMzIywmhGZhDBwAjmMYDFmRiY6eNK
VDdBnIMWUih8JiYwwtTKCNaOaSAGkxFhJCMTiMvIwMTICvI4E8xuRjS9YIsZ
gaqZgXZDLAJyuBk5ADnDAz4=
       "], 
      Association["Book" -> 4, "Theorem" -> 11] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCLoCnQYGRlNccoyMkEcBFTEyMwEBAxMQB6QYgSJQ3hU
BUQYyAhyDhMTWkgxIvNB7sTmOJAizBBmxMJkQjWSEeRXRmawCCgs4GphAlDA
BETAQANiaJAxiDDzAgBbEgNx
       "], 
      Association["Book" -> 4, "Theorem" -> 12] -> CompressedData["
1:eJy9UMkNgDAMi50vH0bgwUIdoQswKxsRA604isQDYfWyY7VuhjylDDObtfwL
AONzkSgmOAPGYLFB+sa+TfPKApAXJ45cOVvhZLq/gMaR5yuhv8JXRb2o3iLs
YIxoWkypdFrv3QIHRwMx
       "], 
      Association["Book" -> 4, "Theorem" -> 13] -> CompressedData["
1:eJzNT8ERgDAII8HryyVcyRG6gLO6kQRa/fnyYY4rJaSEbv3YO8zs1PEjuDtB
MrJDiMoIR5Xk14bvA9MPyrGSDmKi7knlxsq8eZRQW3vyQzN6xUgwtTJqDTID
zCNsfFdTpswqHqzR56Ln6Zd9uwA8hAOp
       "], 
      Association["Book" -> 4, "Theorem" -> 14] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGweAAjIyMYAxkgTArA4zFCIsl3LHFhMQGAJkUAo4=

       "], 
      Association["Book" -> 4, "Theorem" -> 15] -> CompressedData["
1:eJy9UNsNhDAMSxOnrZAY4lZiBBa4WdkIu0jwgbgvdFbk5tE2Tj7rd1mLmW2i
vwOPlYhAzQTPyIPS6DR6jYXpbSnxW6iU1iGLAgiIagWoECdy8D3DFEe5wjMz
fuKgKtIYHjrcwi2hDYWXou5sWySEFxOO5GO/ELTQtqzPvc09FewpGAWI
       "], 
      Association["Book" -> 4, "Theorem" -> 16] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGEWBkgjiIEQiYmYCAgQnIA1KMIHEIj6qACAMZQc5h
YkILKUZkPsid2BwHUoQZwoxYmEyoRjKC/MrIDBYBhQVcLUwACpiACBhoQAwN
MgZeZm4A0RoDBA==
       "], 
      Association["Book" -> 5, "Theorem" -> 1] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGBgBtZmQEU6OAugAAmjwCfg==
       "], 
      Association["Book" -> 5, "Theorem" -> 2] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGAjAyMgyU1cMeAACY8gJ9
       "], 
      Association["Book" -> 5, "Theorem" -> 3] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGBDAxoFsN5TOCAJzPCIKMaEroAhjpax1ORzCg+J6R
EYpgYoxwBtzBAKqrApo=
       "], 
      Association["Book" -> 5, "Theorem" -> 4] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGCKBbzQQRYQIBCB9MACETIxQw0dF5QEuh9gKdSi5g
ACO88jAWdkfAwgEUXGAOODjgIcIACSqQJFAMpAIoCAD6HgM0
       "], 
      Association["Book" -> 5, "Theorem" -> 5] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGDDACbR9QBwxPAACY4gJ9
       "], 
      Association["Book" -> 5, "Theorem" -> 6] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 5, "Theorem" -> 7] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 5, "Theorem" -> 8] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGDjAxMoAgCDAykukURqhuRjCTEWweI1iAEWomVBIF
MEAxRAuacYxQxsAGDjKAuZEoFwEAzJcCtg==
       "], 
      Association["Book" -> 5, "Theorem" -> 9] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGEDAyoDmAkRFKARlggiBgAhNAAGYCKTCDCQqQJFEA
AxRDtKAZx4TFGnoDVJ8zMTGCxZjQJcB8RhQeAwD8LwLx
       "], 
      Association["Book" -> 5, "Theorem" -> 10] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGFDAyMDKCSAoMABGMUFOAFCPEUAhAkkQBDFAM0YJm
3IAHCgaAuZEolwEAxeQCsA==
       "], 
      Association["Book" -> 5, "Theorem" -> 11] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGOcDtRKAMIyNWFXARRkawCoi6EQIAoP4Chg==
       "], 
      Association["Book" -> 5, "Theorem" -> 12] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGGjBiZRJSOgqgACNMAJitAn4=
       "], 
      Association["Book" -> 5, "Theorem" -> 13] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 5, "Theorem" -> 14] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGGjAyM7AwMzICncKMKclEf/dAACMYDQ2AGkgArtoC
kQ==
       "], 
      Association["Book" -> 5, "Theorem" -> 15] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGHDCCERAzMjNiAKA4AjDT01VgN6G5AEkWLs2IJgLT
ywCVYWSEsWBsFBEoRLcdTsINgRiKrhIYJlAnMkDNAQDfGALi
       "], 
      Association["Book" -> 5, "Theorem" -> 16] -> CompressedData["
1:eJzNj8sNgDAMQ218YA5WYoQuwKxsRJImrVDviBysfJ5k52jX2QjgdvlBaTch
Ni4lgiPkp2klc5YsgLXvUHF1jaGWPSdZYhASQlH5R26m2fIZh2bX+XLiwHxU
rhjwA/bZAw8=
       "], 
      Association["Book" -> 5, "Theorem" -> 17] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGAWCEuGOQuGaIAwCYwQJ9
       "], 
      Association["Book" -> 5, "Theorem" -> 18] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGA2ABEcxYXMMIIRgZYexRAA4HRjBAEkAAAKXMAo8=

       "], 
      Association["Book" -> 5, "Theorem" -> 19] -> CompressedData["
1:eJzNj8sNgDAMQ22isAcrdYQuwKxsRL5SJcQV4UPquD68HPMckwAuH38RsRsO
Xbm6tnDsxodSNRa16VZE2Kpff2PtXBJ7gVdUCd2qOyJZz5F3DnaDz7BsQnR6
A+3xAwc=
       "], 
      Association["Book" -> 5, "Theorem" -> 20] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGB2BixS1FR2fAAQsYDQJAsu8BpmcCjQ==
       "], 
      Association["Book" -> 5, "Theorem" -> 21] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGCRhETgEBxkHnImIBAJecAn0=
       "], 
      Association["Book" -> 5, "Theorem" -> 22] -> CompressedData["
1:eJzNkFEOgCAIhmEw5zW6UkfwAp21G/XzQ1s110NPfSIiiArL2NahIrKH+g0W
ygMaqTCaFT5JYgUKaGKhocUleENqZsrjOjq6Wc9X8bGvCOU1flpTouDGEtAc
bBStcTdIZWWrIghfnIDzAMxHBKU=
       "], 
      Association["Book" -> 5, "Theorem" -> 23] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGF2ACATADRDAyAzETIxQw0dMhjIxgAkQBbWaCOAJJ
Fi4NFQQ6E0kcIgQ3iBHJUEZUETDE4TOQIqgUEy41EGVgeyEqANTtAtc=
       "], 
      Association["Book" -> 5, "Theorem" -> 24] -> CompressedData["
1:eJzNULENgDAMi2uHP3iJE/oAt/IRTdIOSMwID5bjWJWbvZ9Hh5ldQf+CB1IU
0UiXOwD5hz3atjUFaKYHyOgiFQvTRjrFTAmZYBldKZSfDmM5ofcS8WGWVA1x
jgEljwFrOS4UifHWDZerBEk=
       "], 
      Association["Book" -> 5, "Theorem" -> 25] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGEcB0DhMOcdoDFhYGRkYWIAliMjExMUIAXBbEAgtA
hRiZwAxGJLeyMEAVoXkAKoIkyETQOagqGFGYjCjGAwDJtALT
       "], 
      Association["Book" -> 6, "Theorem" -> 1] -> CompressedData["
1:eJzNTkuuAkEIBKuhYXq6x6TNy5vE8bfwQB7BC3hWbyTj5u1cuXi1qAAFVZzv
j9udiei50n/CqW/HWrfLflmGw2We20+h8XeWeScCr/7tvPJZXWUlmqber6NZ
cZEUAFCrBbubw8W9RDWsE1EJfhMwNUGDR+cY3rAiGGEmsHWpASVOVHsEpbSh
DUONktKRWTQsTIuriuYwzixZPPEfwNx3rfdM2TLis8KgFwDKCSk=
       "], 
      Association["Book" -> 6, "Theorem" -> 2] -> CompressedData["
1:eJzNjTsOwjAQRNfO/rJeQ4IV8ZFAQkJQcRmOkAtwVm6EDQUdFQWvmGJmNHOc
77c5AMCjyV+xXQ/jOF1O5/Nyv5umsnHIu5Me1sxoY//rO/+etlgAUirlkFXd
iPCNe1/VrDc0MnMzleYQt4a8a9kJHa1ahumFpuaoEmorOWJCEpFSjxAjxNAx
AxJcQyDupFNa9cxSZ1k4iHKm+KGLMefFMAho1nqOJWJ4AmA7CbY=
       "], 
      Association["Book" -> 6, "Theorem" -> 3] -> CompressedData["
1:eJzNUMkRwkAM8yE7dkiYtEBLlJAGqJWO0JIBXvx4oMfaI40ley/77bqriNzH
81+YIqLOp3Ut1swJYlOhygwI/Dpt/i6pCqCqJpJZlUa0vbAs7L0Jb6+x4NyD
QYS7Z7A9SliTaT/GCoN5IzhgSPozEGDUOFPEZX6qQpfyTPiwHpql6QdcEhs/
zLSTeeqtLg+BNQXR
       "], 
      Association["Book" -> 6, "Theorem" -> 4] -> CompressedData["
1:eJzNjTsOAjEMRPOzHSfZxChaEAUFEohPxVk4wl6As3IjvOnpKHiSRx6PLR+X
13Oxxpj3Kn/G9XZ+lFrKqdaU7mU2+7ZRmEku9Otn8WvirHW9a1N0K6bUdrV2
oQEAxCiqooBQa701dTrhnFWHqGECBsmkkQxAbwlq1YK8huo4ckz6yHtvnPNM
xqGZnAX0zSMirBUQtsHilCGhDsEPLAZOmTmb7WGGIIAu2A996Apf
       "], 
      Association["Book" -> 6, "Theorem" -> 5] -> CompressedData["
1:eJzNjTkOAjEMRbN4SWwcMkzBUiAhUdFRcBGOMBfgrNwIZ3o6Cl7xZX/b35fl
9VxiCOE95N943O5Va722pnqXOZzqxtQNnm/061/fA1OMaZq82PoWMevBbNeL
w4yItXbX7mBns8nMu+GrutoQRC0FC3ZlH7UVIEJCs4KGbQy9q6LC42OCkFOu
HBIGyREotUx+AAAeRg0jNAEhNzGvRILCIlTC8bwH8vgE8QMqcQnz
       "], 
      Association["Book" -> 6, "Theorem" -> 6] -> CompressedData["
1:eJzNjbkNAlEMRP8xtv9prT5arUAQIFIi6IQSaIBa6Qjv5mQETPB8aub8fD2e
3jn3XvF3ut5qr/UyTa3d2+KOVVvvKfG4ya+jvhsG78M8W9PtS1LqB9XdYBZh
BlDKMA4TBqvOqiLrpvRubCusZEbG6IyBaRMLIFBlKNp6tKnkkpMFxRhdCDGL
C+Ra8CRRIzOTBRIRK3m0gsK2pLjJm0WuVYrbnxaQggP8ByvOChE=
       "], 
      Association["Book" -> 6, "Theorem" -> 7] -> CompressedData["
1:eJzNTTFuAzEMsyVRsmX5jByKogWyFOjQ5Dl5Qj6Qt/ZH1WXv1qEERIgiIX7c
H7d7LaV8H/T/cPVw/1xrjMt4KWefM6J3O33pXzf9/pBqpX3PZcuUmsV7xGmZ
tWYGoPeVvBJYNuc+Z6q8NPfkeRDgGTUsz8F8QhRQRBgC8zBTNR9uWcTMhYhV
CkkZVGG8sbAIQOlJcKVwasqq4CeqSrPuauXt/CrYREjqD5fACUM=
       "], 
      Association["Book" -> 6, "Theorem" -> 8] -> CompressedData["
1:eJzNjksKAjEQRDtJdU/+kzAMggjiTtfexCN4Ac/qjexk786Fr6HoD3TV5fl6
PA0RvYf8IbnmfOu9lHve6JTXUqv3sl+XXxt9f2iNsb2PLERBqcdatyYTADE2
1aagSSkaVWRsQkqqU4AcBAEtCRrqhD3gtRdUpHHUKXgtNXLOkbUuLGSFijW8
uNWpG6shM0thgxwRRZfsJkYQfcoh0OG8A4XFwnwAo+QJgg==
       "], 
      Association["Book" -> 6, "Theorem" -> 9] -> CompressedData["
1:eJzNjcEKwjAQRHc7m23SJG21QhAr4qE3v8ZP6A/4rf6Rm4J48+TBNzBsMgNz
XR/3lYnoWe0fWW63vpRj2Z0TxXnWSwFcGMKvd/LXNCUztVaeplNU7Tze9H29
vQGTT3a09UdEzFW2UuyADt4eHmFDI5CgahWThYgQVZ1sSKShhrltSRwtzJBG
2Os+WO5MrbJ6l4U/gDkPaRwdheTr+IFBL1dJCHM=
       "], 
      Association["Book" -> 6, "Theorem" -> 10] -> CompressedData["
1:eJzNTVsKwkAMTJpHs+7W0m2xIAVR6IU8Qi/gWb2RSUH888sPZyCZZAbmuj3u
GwLAM8ZfYl2HeZ6mac5QTotdFhFKNf26Jn93w9bY43guqgejN0oJbQ5yWnbR
xoeZfSrvoa54kMwPo7RDc3xUPeJ0SZlYVUcvYm6gQTeBBW6IosRkMiT3xdkq
qnLH+AEh9v2xVgHrLLorErwAEM8IPw==
       "], 
      Association["Book" -> 6, "Theorem" -> 11] -> CompressedData["
1:eJzNjcEKwjAQRHc7m+2mSdraHoJYKIJXf8ZP8Af8Vv/IbUG8efLgGxg2mYE5
3x+3OxPRc7P/5NrXeqyHNVNaFl0rEOIQf71SvqY5u6m3yjyfkmpneNP3220O
XJbNYrv9iIi7yl5KHdDB/GGIO5qADFWvuDxEgqjq7EMiDTXMbUsS6MIMaYRN
p+h5cLXKaqEIfwBzGfI4BorZtvGJQS/3UAgj
       "], 
      Association["Book" -> 6, "Theorem" -> 12] -> CompressedData["
1:eJzNjVEKwjAQRHc7m03SJG01H0G0iOCJPIIX8KzeyE1B/PPLD9/AsMkMzOX+
uN2ZiJ7d/pSptUPbrZnSetJzA1yc469Hytc0ZzO1Vqn1mFTHgDfT1O9gwBSy
Hb7/iIi5ylZKIzAi2CMgbmgCMlStYrIQCaKq1YZEBhqYvSdxdGWGDMJBa7Tc
mbyyBleEP4C5zHlZHMUc+vieQS++jwf1
       "], 
      Association["Book" -> 6, "Theorem" -> 13] -> CompressedData["
1:eJzNTjtuQzEMk21JFG28jx/SZO6VeoRcoGftjSojc7YO5UBKIgTy8/n99Swi
8rPov6L3iJueMhEP4LrO+2x/naHvrVJkjNSHiLsqNoBxHOc+BjuBSI5Ej50c
JGxd3D35YxEZdtAYvjP4gmOj59tB8L7MVJ/XfPVoUksNz0CJnLTM2oy1taZm
SpN2DQ0okIUWCswiN5ftNgxboPbyC7uGCKo=
       "], 
      Association["Book" -> 6, "Theorem" -> 14] -> CompressedData["
1:eJzNTTkSwkAM8yKfG0LJQDpghgfxhHyAt/IjFDIUNFQUyB75kI/TfL/NTUQe
C/0trpdzP5b0abTt1gwx+q9f5Fe1imR0633s7hV4IyJWDtCiItKWjqqSTV9D
lUAiWARyxUAR7oAzUsQANd7nIy5IU3aZyU6bp/LQxjRTo8ytFtW0fWCfOR1M
3BwKHRvkCRlTBv0=
       "], 
      Association["Book" -> 6, "Theorem" -> 15] -> CompressedData["
1:eJzNTkkOwkAMyxBnmdDhBELqpeJLPKEf4K38CLcSBy6cOGCNPEmcOLmtj/va
ROS50f9iWWouOc7DpslMc/ivN8RXtYpkfFY1yr2HvuG+xUFoaGZlhm0VAGTD
3lRdtWswCe07oihymAb8KWopjP5cxAFpYJWRnNA8QaODIRPRza03OAztA+fM
68V4oikUo6m8AOuyBtY=
       "], 
      Association["Book" -> 6, "Theorem" -> 16] -> CompressedData["
1:eJzNjjEOQkEIRNllYAG/25qvlbW38Qj/Ap7VGwlqY2Nl4SOZEJgwnLfbdWtE
dC/5Yy7jGOSrsxmzWODXAbuv24gUITKb013V84s3EdVbwlkWZq41AZAqeJrK
n1uU+ouFWVm1VGrMC0NVZwb1DurIC9SZAo3rCNI2BoaLijcMCNoHh7DTKpSW
Ct83pgfKqgbU
       "], 
      Association["Book" -> 6, "Theorem" -> 17] -> CompressedData["
1:eJzNjrsNgjEMhJ34kXMeNSAkJFZihH8BZmUj7HQ0VBRc8flkO+fcj+fjKET0
Svyz2hk0Ts4AM7cuv873r1MgoERmc/ow6xBW5RT2j5AFu6BbdlRkcy95dGIq
ybHVV/rWmI01LS9Wi/w4VKtQzYdB6lI4vVSNdWmupl5EI7p86DJxuxqZIM+u
IvQGjjUGsQ==
       "], 
      Association["Book" -> 6, "Theorem" -> 18] -> CompressedData["
1:eJzNjTtuAzEMRClpOPqsREOh4y5B3OU8OYIv4LP6RtZu7y6FH8AHfoDh9Xb/
uwUReex6a/qn/HQbY5TCj9/83/GvA2MI0X01JlJKrfY958XJnEkArfmyL+A0
czNy32xmy2MX0CtR4UY4TgdagII5iYmxH9e0ta3V9SilJDGmmiWqjBg0p5lI
qipWcWhAb+hcS00HgWhl67XK5esM7WBEeALe/AkU
       "], 
      Association["Book" -> 6, "Theorem" -> 19] -> CompressedData["
1:eJzNjc0NQjEMg/vjuGnySh8jIDERI7wFmJWNSOHAjRMHvkiWlVjO5bjfjpxS
eiz5b2bar8beAc4Tf93ev/+e70hr7jZUNwNEsGgtPCyAwX26O9eGZGjnK7Q8
YaGG7YWOaIAqoDFxxACbNo9HtdaUUYFwaUMGK9asuIiwSYarSPmQS5nkeWey
6qiKUZCf5fMHOg==
       "], 
      Association["Book" -> 6, "Theorem" -> 20] -> CompressedData["
1:eJzNjc0NwjAMRu04duzGEWkPSBUSpTN0E0boAszKRri9c+PAk/XkH+nzur+e
OwLA+9C/c2+tVDer82y/zv4emBDTtkXjAKqt1av72FWnSVVEhqGHeyBdl2Vb
llKOjdYaPiViRaVIrypd1pMYo9xVXLzwOamZtXhElAATMgMxXBLmTIqaJTMT
B8pIjxs3IRGmkziaaS0OI1fOQy6U8QM+IAni
       "], 
      Association["Book" -> 6, "Theorem" -> 21] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGLAA6npERQmPKQBmMYBUQdSMEAACfzQKF
       "], 
      Association["Book" -> 6, "Theorem" -> 22] -> CompressedData["
1:eJzNjUsOwjAMRPObdJyW0kBpVQJILLkOR+gFOCs3wumeHQsszdPII4/v6+u5
WmPMu+Lvp9s3u47k9ZJ+Xc2vibPWzbMaFdn3/WNZbkUkJREAImdlKeeCIjnP
OUfUzTCdlGMFkBpBgzIJCsZtQD3Fsqgw1hAMQz7kXh9574x1NgTjo8nV+GTp
GdRoWSSsPw4YCbLZ+hEYU8uOO5MxILQhergPO5oKIw==
       "], 
      Association["Book" -> 6, "Theorem" -> 23] -> CompressedData["
1:eJzNTssRAlEIg0eAx/u46tiALVnCNmCtdiS7Ox69eTDMZCBkAvf1+ViZiF4b
/T9mjAFYP9uvk+PrpjCXOZnZiVrrvQ33HvggIg4OZMWMaLopqprsupuq5deI
HAJtR52b4p6WrGwxoZ7JeVFEiCFAdnQCw8QFEqgVbuZhjOZqUkTKARa5drtd
jBYbkIpRwG+WxAcG
       "], 
      Association["Book" -> 6, "Theorem" -> 24] -> CompressedData["
1:eJzVjTEOAjEMBB3Hdhyfj3CHFIkGiZqWl/CE+wBv5Uc4R01HwxSrlb3avW7P
x5YA4DXkD1juN1Vpvf66+HshpoTuYRSgtd7PzezgHBCFqA7vATubudlcfL/r
rjRy8Yyv69Dpw6p8ijyzcWFyppW1Wu0xRISxmkWAGC4psWTNymsVKdEpzqkM
g4IYqUFGavNhWQpMx4lYyZDTG3gKCIs=
       "], 
      Association["Book" -> 6, "Theorem" -> 25] -> CompressedData["
1:eJzVjdENAjEMQ5OLnbY56G2AhPhjHEa4BZiVjUiL+OWLn3Ol18hO3ev+fOwq
Iq+BI4i83S+l2L97fxSqCpB3iKwrQAKwiFZa61vPOWmpzRoyo32dZCDRuyM6
0mrd8sFcsuFg+qeBbucsDXy+zGPLIgpxVcvRTDkCdxpddK0IVrLUKWOBzQSV
4XSWzd/inwdp
       "], 
      Association["Book" -> 6, "Theorem" -> 26] -> CompressedData["
1:eJzVjbsRwkAQQ+8neW+9cHh8w4xDaqATSnAD1EpH7JmYjIQXKFhppdv+fOwx
hPAa8hfc64TLtf66Vr46KcZk9om01vvWVM8GpxSSIuZqDg2qpnqa7LjLoSAB
M9B9GTof2FrZqQoqJw8QK6Vq7T5USvLVTIaCsMUIZsmCpXJ0eQfi+EFiSp4a
5FSWdm7LFOY2A1I0l/gGOscIZA==
       "], 
      Association["Book" -> 6, "Theorem" -> 27] -> CompressedData["
1:eJzVjcERQjEIRIGwEPK/yegY77ZkCTZgrXYk0bM3L77DMrMsy/X+uN2ZiJ5L
/oNaLY7x89avG2GWutaFqPc5Z+y7hyeqAGqN1EiwzOo+3k5rDWtkRjUvsCOa
I/Jg0eeGCz6+uqac0cxt5iNVya/FjBR0YoYVLRVbmFlWWoCtMCAmkqlFER2H
PkaQb64oaqL8Ar0pB5k=
       "], 
      Association["Book" -> 6, "Theorem" -> 28] -> CompressedData["
1:eJzVTsEJw0AMsyPZd7lQCDS5exc6ROfoCFmgs3aj+lJofn31UxmEkI3ky/a4
byoiz05/gtvVHb8O/RI4DEKqahEZR9IYQM7JU2qtAngzGjLAEIaPX1hbXVdj
bgwrN+ynHeNqC3d/ypV1wRmlFO6VGoPoVYqrIiSg1hfwyHfRKWGyA2rWe+li
tDletNNsLx4lB/o=
       "], 
      Association["Book" -> 6, "Theorem" -> 29] -> CompressedData["
1:eJzVTcERAkEIg0vAY0/H5WEB9mAllnANWKsdyd7N6M+XHwMTmMAk1/VxX1VE
noP+BTd3/Nrzi6GqkDWbSARpLCDi4BGZHcDOSATAWgxvvbGod2MkS4rE9jrQ
0jo3fRmUOKO1xj2yCtMkSnFV1AqojQNnwmbRZcbRPkB15dKFbienWbvwBaQn
Bz0=
       "], 
      Association["Book" -> 6, "Theorem" -> 30] -> CompressedData["
1:eJzVTkkOAkEIhIGCHnpMekm8+yWfMB/wrf5IZmLiyZsXC8JWbLf9cd+ZiJ6H
+Rus0F+vtO8UMwHpa2p1NwNUI4qv65zDXedsmugaqkmJviu9t3qErUFiysip
rmdrNsk2MYBIux1kl5yP8PMgU4oIkRIyWWhhWSzrrAWCQlyLFP2AAcm7ZuSK
S35occULoOAHag==
       "], 
      Association["Book" -> 6, "Theorem" -> 31] -> CompressedData["
1:eJzVj8sNAkEMQ/NzPCOxRdASJWwD1EpHOMuZGxd8iEa286K5n8/H6Wb2mvE/
WvlrIr5HEUa6uyoAiequnFeNgNRMSWaVTKVyCFzpVYpAxSwpwkeCqNqiTSlm
jd1NHcx0C31ymbe1uzYPC24XRl1uGHg0kzrKS2IL2FR/3RIbjPI3vUoFtQ==

       "], 
      Association["Book" -> 6, "Theorem" -> 32] -> CompressedData["
1:eJzVjc0NwjAMRp34s52kaRMFUZUjKzECCzArG+H0zo0LT/KTf2T7/nw9noGI
3lN/RP71Qfs6iSHEffekEYnU2o7er8MsJTMAKQ33cDC0tb01s9kp6+quU8CS
DRljVQxcTtR3Db17YJtDr0ouufoj5kghMkAsNGJQxcaJDSIsziaBa5GirCp8
AkXOZbGFbvWAbLCI8AHuyAhD
       "], 
      Association["Book" -> 6, "Theorem" -> 33] -> CompressedData["
1:eJzljdsNw1AIQ3FsE27UJbpSR8gCmbUbFfLTvy5QI/HyQTzP63UiIt6T/ln1
2x1bEftetcrOJCWO1uqe2WJHVqY8G9ud5Rs6VoPMHvJGB5p+GFNjdrWk+5EQ
ENKxMY4NfjAwN5I1UKNEEl9tQK2jSjGUQIPxAbjoBWY=
       "], 
      Association["Book" -> 7, "Theorem" -> 1] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGFGBkRHcySIARLAxmMEC5EADVgiIG5oMxAyMDI6aB
GKYPMQAAxXUCrg==
       "], 
      Association["Book" -> 7, "Theorem" -> 2] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGFmBiwiLAxAQmwRwoFwIgCphQxMB8MGYEQiZUAxmh
GMJhhFjHCBYlAzCAEV55KIMywIggGBkBJVEDRw==
       "], 
      Association["Book" -> 7, "Theorem" -> 3] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGGGBC5zOBMJhkYGBkZAB6iZGJiREMwGpBDAiJAAxg
zMDECNGJGzAOvRACANOEAr0=
       "], 
      Association["Book" -> 7, "Theorem" -> 4] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGOGBmBmEwycDAyMgA9BIjMzMjGDCDFIAYEBIBGMCY
gZkJohM3YBx6IQQA4HUCyw==
       "], 
      Association["Book" -> 7, "Theorem" -> 5] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGHGBEB2ARoDhIClMWG2BAplHMBpuBbBedPUc5AADe
BQLH
       "], 
      Association["Book" -> 7, "Theorem" -> 6] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGHmAEOntoupweAACYNwJ9
       "], 
      Association["Book" -> 7, "Theorem" -> 7] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGIGACo1GAFQAAmVACfg==
       "], 
      Association["Book" -> 7, "Theorem" -> 8] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGJhi6Lqc1AACV/AJ7
       "], 
      Association["Book" -> 7, "Theorem" -> 9] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGKAA6nRGr87GLYjVgWAIAmv0CgA==
       "], 
      Association["Book" -> 7, "Theorem" -> 10] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGKmBhAWJGIGBhAJEMDCDECCGRAAucQAIMUAzhIAGI
GTAOI4QDJUgHDEi24JDHdAIZgBFBMDICABMDAzE=
       "], 
      Association["Book" -> 7, "Theorem" -> 11] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGASpgYhpoF1AGAJfgAn4=
       "], 
      Association["Book" -> 7, "Theorem" -> 12] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGLmBmJl4UizpqOmUQAQCj6AKJ
       "], 
      Association["Book" -> 7, "Theorem" -> 13] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGLGAFIiYgYGUAkQwMIMQEIZEAK5xAAgxQDOEgAYgZ
MA4ThAMlSAcMSLbgkMd0AhmACUEwMQEAeRAD0Q==
       "], 
      Association["Book" -> 7, "Theorem" -> 14] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGMmBjZmJkZGNgBAIGBhBiBJPMjFDAzMzMAaJZmRhR
AAMUg0lkAyFmwDhQWShBOmCA2YJbHsqgDDAiCEZGAB46Az0=
       "], 
      Association["Book" -> 7, "Theorem" -> 15] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGMGBhYMT0AQsLgs0Ik8fiURZMoWEBAKp7ApA=
       "], 
      Association["Book" -> 7, "Theorem" -> 16] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGNGBiYgD6gREIGCAMBkYWFjAfDJhA8iBJJkYUABJk
YYCqw2k4WB2EZmBgJANATMArT8AJxAFGBMHICAAFjgMk
       "], 
      Association["Book" -> 7, "Theorem" -> 17] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGNGBlZmTkYGAEAgYGEGIEk0yMUMDKysoGotmYGVEA
AxSDSWTzIGbAOFBZKEE6YIDZglseyqAMMCIIRkYAIcIDQQ==
       "], 
      Association["Book" -> 7, "Theorem" -> 18] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGNuBgZGJlYAICBgYQYgKTLExQwMnJyQyimTmYUAAD
FDMCIROycUAOIzxQgCywLFiEkQzAAEZ45aEMygAjgmBkBABXpgN8
       "], 
      Association["Book" -> 7, "Theorem" -> 19] -> CompressedData["
1:eJzVUIENgCAMa7c4lC98yRN8wFv9yG2IRr3AEjpWuiYwr9uyEsAe9HOQRHUA
E1BHFJEiZqWYQ1UZlTparA74dtM0kDF5wzxwuKOh6lUkycGOdk5JXkpykyIh
G7nG4nxyXGaX+a9XPTofSO3rCyObm+0zeAAetARy
       "], 
      Association["Book" -> 7, "Theorem" -> 20] -> CompressedData["
1:eJzVjMENgEAIBBduQ64NW7KEa8Ba7UgWNfczMb6cwAAhsIxtHQZgl/5OByJJ
AyRoxggWob2a0xONxhbsdTlpmX4P7npYVXoPKh73V/MNn3I/AMzbBC8=
       "], 
      Association["Book" -> 7, "Theorem" -> 21] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGAWAHAiDJwMDBysDKyMjKyc4KBuwgSRADQiIAiMvI
ysLOysXJzo5sEgsQwwOFmZkBpBVEgwjSAQMY4ZWHMigDzAiCmRkAJvUEww==

       "], 
      Association["Book" -> 7, "Theorem" -> 22] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGAWAEAiAJYoBJJkYoYGZmZgPRbHARCGCAYjCJYhID
kgBMFkqQDhhgtuCWhzIo9D+CYGQEAAf3Ayg=
       "], 
      Association["Book" -> 7, "Theorem" -> 23] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGAWAEAjQBhCCUZgRBGGCAYCwmQfEQBgCyEQKa
       "], 
      Association["Book" -> 7, "Theorem" -> 24] -> CompressedData["
1:eJzdUdsJgDAMvMZaGywW6l//XMkRuoCzupFNteADBH+9hEtyIYGQKS1zUgBW
oT8gxggEwDv0RNx5tpaZPRGpHJmagcUqIBrb4LQex/MiC1BbC61hzB6FvgPF
X/tHcsf1MfmKotHzciVe5oVU3rUBYSQGUA==
       "], 
      Association["Book" -> 7, "Theorem" -> 25] -> CompressedData["
1:eJzdUcENwjAQc+6sJgUKUis+/bESI3QBZmUj4itBpWyAEzm2o1xyym153JcE
4Cn6C8xXYAQuJxzN+jyUwGBmScL8XDQaoKzk8UBO07ZOBozNuKPr6koGVXjD
qiPiLgleI1UIw88x6TdrMxy5bwffH1O7iMx+G0+aHq+Vq3e8AAkOBdU=
       "], 
      Association["Book" -> 7, "Theorem" -> 26] -> CompressedData["
1:eJzdUYkNwyAQM3dWoE9SKVEH6EodIQt01mxUfClRmm5Qg4xtxMGJx/x6zgnA
IvoP3IERuF1xMTvlvgR6M0sS5kPRaICyksczOU37MhkwNuOOrqsrGVThDauO
iIckeI1UIQy3Y9If1mY48tgNvj+mdhGZ/fadND1eK1fveAPt4QW7
       "], 
      Association["Book" -> 7, "Theorem" -> 27] -> CompressedData["
1:eJzdUYkNwyAMPBwrJsJqJTJBVuoILNBZu1FtU6o8G+RAZ99ZGCy29n61BODj
dBtU4KkoRItUzVlVKxEli0rTQ30NwD0ttTCv675HASgPIYJ5tsgcZJCBnofF
Jye4W94hBP+Pef5jL4aK/kccP8amCI+uQyffEq91ZXd8AasvBwk=
       "], 
      Association["Book" -> 7, "Theorem" -> 28] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 7, "Theorem" -> 29] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 7, "Theorem" -> 30] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGE+DjZuBiYuJk52YHA24mJiZGEIOJmZcdBGGAASTG
zibJycIiKopsABsDAxMzjMPMzMDKCqFBBOmAAYzwykMZ6AA1YoC+AIsxYfqY
EYTA+kEEI9AsAJaGBU8=
       "], 
      Association["Book" -> 7, "Theorem" -> 31] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGESDPO8MsEACWtgJ8
       "], 
      Association["Book" -> 7, "Theorem" -> 32] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGPBhmgQAAlbECew==
       "], 
      Association["Book" -> 7, "Theorem" -> 33] -> CompressedData["
1:eJzdUdEVwyAIPCCBJ6Y/2SArdQQX6KzdqEJqXpNs0FNP7hTF59Zez0YA3kF/
hQcW5jrNXqu7zyJCfXaS1aMNoA/2opOI2W9+AViHUEXpBnKLBXRgj9Oyi5O8
W3FCCjvSIv5yLKY6l5A4fwwzpcf3B1P0rDiI+h0fU7EGwQ==
       "], 
      Association["Book" -> 7, "Theorem" -> 34] -> CompressedData["
1:eJzdUdsNwjAM9KMOQU6ohJCqfrISI3QBZmUj7CtFrdiAS3L2XeI8lPvyfCxM
RK+kP8MsMg3Wbld3N1XliM568WytOUAx5FxPg2qt++oQYpsohcYxYu+ggG1Y
c1iZlJ0DXheRGUH0b1nmH85JKOx/xPFjRBie/D6Xs+PGSRxnvAHc8Qem
       "], 
      Association["Book" -> 7, "Theorem" -> 35] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 7, "Theorem" -> 36] -> CompressedData["
1:eJzdUdsNwjAM9KMOQU6oVIkf/liJEboAs7IR9pVWrdigl+Tsu8R5KM/5/ZqZ
iD5JZ4PIY7B2n9zdVJUjOuvNs7XmAMWQa70MqrXui0OIraIUGseIvYMCtmLJ
YWVSdg54WURmBNG3ssx/nJNQ2P+I48eIMDz5fy1nx42TOM74AsfMB5E=
       "], 
      Association["Book" -> 7, "Theorem" -> 37] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGHWBBMBkZoT7E4lEWTKFhAQCgAgKG
       "], 
      Association["Book" -> 7, "Theorem" -> 38] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGM2BkhPoQi0dZ6OsUugEAnAYCgg==
       "], 
      Association["Book" -> 7, "Theorem" -> 39] -> CompressedData["
1:eJzdUYkNwyAM9BMTKluRKnWBrJQRskBn7Ua1L48SdYMecPYdmEfM63tZmYg+
RX+IweL1dHdTVc7orOFo4QDlkEcfB9Xer5UpxA7RGk1TxghQwg5sOaxK2sUB
b4vIjCDiLKt855qEwv533D9GhOHJ71O5Om5cxHnGF434B0I=
       "], 
      Association["Book" -> 8, "Theorem" -> 1] -> CompressedData["
1:eJzdkNENgDAIRA85GtZwJUfoAs7qRgJNmib6558v6bXAHR/d+3l0AXCl/BFv
lriIbPkQqdra6BvibEZvNPc1SEDmp5Dhi1u1JCD1CYuXgYKaOS6m4eMQTOM3
mFJLYukN8sIEnQ==
       "], 
      Association["Book" -> 8, "Theorem" -> 2] -> CompressedData["
1:eJzdkd0NwyAMhI1jOYEDoSp97UNWyghZoLN2o9qmrRp1g36C43z8CMR23Pcj
EdHD5S+5lNJ7uzHz1A2elrrUua5rDcg6o+SrSM7f+woRz++iNVK1USTEAHRg
fqiohWjAJwkdiwhKVkBFEQz/Up+MKs4/c/4Y5hQZ/z40eYM7v2lS6BN4owb0

       "], Association["Book" -> 8, "Theorem" -> 3] -> CompressedData["
1:eJzdUVsOgzAMc5K1Ql1BVX9giJ9daUfgApyVGy0JA4G4wazWtd2mD/U9L5+Z
AKxG/4ncTdMrigiNCpJcc33Wcej7qoB27trwEAnhXNYCnHZTCppGxxidFGnH
pj0yUU6J87YIKcFNPMpM/9gm3fn+V1w/hpk84/s7yZrf2Ij0jC9c/AhX
       "], 
      Association["Book" -> 8, "Theorem" -> 4] -> CompressedData["
1:eJzdkesNgzAMhP1oZKOAfyH43ZU6AgswazeqfaAK1A36KbncOS9FeW77a2Mi
epf8KWvE2FSVI2Gd+tTHHtEBZZfB7aHqft2VQdo3OC1LjvMMScxaswLeUCrj
lwr0WERmhHBkO/2pNYmE8+/cP0aEUZPfZ3I1K1fCeccHW9gI9A==
       "], 
      Association["Book" -> 8, "Theorem" -> 5] -> CompressedData["
1:eJzdkd0NgCAMhPuHBFPQFVzJEVjAWd3I0geiQRfwy+USjlxTwlaPvSIAnM3+
Sik6MTMWAzkbRddFNasq5Kw0JxFmkXspAVDoh9QEEKObE0J8Ib2FVnN90y+H
5Z8fQ4Se0fhKbPJ+M7RZF0ikBto=
       "], 
      Association["Book" -> 8, "Theorem" -> 6] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGLVCVYWVmZmZUAAJGZm4hbiEuIVERcXEhIGAAYiY+
XpA8KyuyHl4GBiYuGIeHh4GDA0izsYEJIOCCAQgbLARi8CCJgEmIIgYuLgYw
hw2uDcSGkiBJMA9sPipAjRgmJkawGBOmJxlBCOxiEMEItAMANhYIKw==
       "], 
      Association["Book" -> 8, "Theorem" -> 7] -> CompressedData["
1:eJzdUckNgDAMc1K1QqHwqEB8+LASI7AAs7IRSTgEYgOs1nWcpocyLeu8EIDN
6L8YYwiBJgWFutRFSt8NQ1FAJ7eN5WN8ljQAyxXkjKrSNSUnhVw4tFsm8sNx
PjZBBB6ku8z0yZb0yM9/490YZnKPv38kG/5iI9I7dhYdCAo=
       "], 
      Association["Book" -> 8, "Theorem" -> 8] -> CompressedData["
1:eJzdkdENwyAMRG0jULg4leo/PrtSR8gCnbUb1TZt1agb5AkOn20QiNv+uO9M
RM+QE1NLKTwcLqutBrPrGOaQT7lsUa/1d8dGJPiY3mlZfG0txQF04vHUpp5E
B76Z1NlEUHIDbYpkxm+NYro8/8jxY0Q4c/L/RI6RN9ZwCn0B9uYIDQ==
       "], 
      Association["Book" -> 8, "Theorem" -> 9] -> CompressedData["
1:eJzdkYEJwzAMBOWPlOYrCKQbZKWOkAU6azeqJLchIRv0sB+9/DYIr9vruTUR
eaf8MwCGOcAw+eQ3XxYvJDbu5EOVPF4Ig/FnIjemUS3RDFsn6q5q0aSTe6e0
h4QmYWhqLHr91TwsV++fOX8M0KqH64QtV41g6Yz2AdMKBho=
       "], 
      Association["Book" -> 8, "Theorem" -> 10] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGNeDmZmYCAmYOJhTAAMWMQMiErB7IYYQHCpAFlgWL
MJIBGMAIrzyUQRlgRBCMjAAb3gNB
       "], 
      Association["Book" -> 8, "Theorem" -> 11] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGN+BmZgICZg4mFMAAxYxAyISsHMhhhAcKkAWWBYsw
kgEYwAivPJRBGWBEEIyMABFtAzY=
       "], 
      Association["Book" -> 8, "Theorem" -> 12] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGOWBmAgJmDiYUwADFjEDIhKwayGGEBwqQBZYFizCS
ARjACK88lEEZYEQQjIwABwcDKw==
       "], 
      Association["Book" -> 8, "Theorem" -> 13] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGO2AEAlYmRhTAAMVgEkUxA5IATBZKkA4YYLbglocy
KPQigmBkBADl0gMF
       "], 
      Association["Book" -> 8, "Theorem" -> 14] -> CompressedData["
1:eJzdUdsNgCAMvBZCTAl+gPGfP+dxBBdwVjeS1kc0buCFHHfX8gp1WeeFAGxK
v0edKrlYYpE8DOOYG1By5j5555z3z94EsNwmoevaHIJRg1w4tEUq0iMxPpog
AjPhXqb6ZC2as/3feH8MM1nG38eRDruxErUzdtP4B8U=
       "], 
      Association["Book" -> 8, "Theorem" -> 15] -> CompressedData["
1:eJzdUdsNgCAMvBZCTEE/iPHfT9dxBBdwVjeyrY9o3MALHHdHeYVxWeeFAGxG
/8c4Ucg1V6l9PwxVAe3ctTGEEOOztAVYLlMKmkbHlJwUcuHQHpkoj8T5KIII
3KR7memTbdKd7//G+2OYyTP+vo2s+Y2NSM/YAbFtB58=
       "], 
      Association["Book" -> 8, "Theorem" -> 16] -> CompressedData["
1:eJzlUdsNgCAMvBZCTAl+gPHflRyBBZzVjaT1EYwjeCHH3bW8wlK3tRKAXekH
WMjFEovkaZrn3ICSM4/JO+e87zsTwPKYhGFocwhGDXLj1BapSF1ifDZBBGbC
s0z1xVo0Z/u/8f4YZrKMv08jHXZjJWpnHI/5B3w=
       "], 
      Association["Book" -> 8, "Theorem" -> 17] -> CompressedData["
1:eJzlT0EOgCAMK4MQM9EDId79kk/gA77VH7kNNWr8gQ2UtgwYc12X6gBsSn+A
833uM+dSpikLIJPGIXjvQ7gXDgDxaVJC18kao5GATzRtkYp0S4xbEZhhJl7H
VB+sm+bs/lfDD0fkLKOPn+mwjpWcvLEDbMoHVQ==
       "], 
      Association["Book" -> 8, "Theorem" -> 18] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGBGBiRAEMEAwUZWBiQlYHDBBGeKCA1TAwwAjSAcQE
vPJQBmWAEUEwMgIA3/IC/w==
       "], 
      Association["Book" -> 8, "Theorem" -> 19] -> CompressedData["
1:eJzlUNsRwCAMCmSSrtQRXKCzdqNCrPbx2d9yJwJBz3Np29oQEbvpH0gmBVNm
hjYvQnyvyWB+ipSq0Sus+kDXFfGVFPfIN5ThPGZ9soflnk/4AFwEHAvJA0M=

       "], 
      Association["Book" -> 8, "Theorem" -> 20] -> CompressedData["
1:eJzlUdsRgCAMC1WQdgA//HMlR2ABZ3UjaXwcnCOYg1ySltexln0rAcDh9BPo
orOaKYE6RXMah2Ga2q4MSHxMSjDzMJMq4oNLM3KRmoR8NSFG0OR3meubvUjH
/Xv0HyMSmMn3WcEHb+wU6hknl4EF8Q==
       "], 
      Association["Book" -> 8, "Theorem" -> 21] -> CompressedData["
1:eJzlkVEOAiEMREuBbBjT7GoAf90fD+QR9gKe1RvZFjVuPIIvZJgZGhLCut1v
WyCih8m/gOvhclx6P9daaW0n7i3FGFP6HmpEvLxDKTRNuufsogDzLIb6oVm0
RAE+jesYIghpgGSBM/xL7dCT379n/zHMwTv+fVWwBXNiSSBPsdYHyw==
       "], 
      Association["Book" -> 8, "Theorem" -> 22] -> CompressedData["
1:eJzlkdENgzAMRC8GH44X6HdXYgQWYNZuVNsBVNQReEpOvrMVJcp729etAfik
PIb+6u7eE8SWbpynaVl+ZwwQPQ0J9wytJFDlIOqhxgiVqldSOoagRBilxUAy
6kOzWa7Ov3P/GJFWmfw/quWqGzMdlV+KKgX1
       "], 
      Association["Book" -> 8, "Theorem" -> 23] -> CompressedData["
1:eJzlkVEKAjEMRNO0ZelIWBdp9Hc9kkfYC3hWb2SSqrh4BB9lOjMNhdJ1u9+2
REQPl//hsC5H1Uvvna564rOWnHMp3yNKxMs7tEbTZHutIQYwz+KYH1rFSjTg
04SOIYKQBUgVBMO/1A8jxf179h/DnKLj3zclX3AnngTyBJCYB6c=
       "], 
      Association["Book" -> 8, "Theorem" -> 24] -> CompressedData["
1:eJzlUcERgDAIo1TCUadwJUfoAs7qRgJV73qOYK4NJORBr1s/9l6I6Az6EWxt
zZqZkVduhqVW1SlBxPIIH3k0a5BDBAPeD1a4KX5fJ3mESEAuBApJjP7mGKaa
V0jMH8Nc0uPvk0qc3BihILgAYJoFtg==
       "], 
      Association["Book" -> 8, "Theorem" -> 25] -> CompressedData["
1:eJzlkdENwyAMRI0BBd2HEyUV/DYrZYQs0Fm7UWyTVI06Qp/QcXdYSIh1f217
IKK3yT8xTa09am30XBZuNcUYU/oeqEQ8X6EUGgbdc3ZRgHEUQ33XLFqiAJ/G
tQ8RhDRAssDp/lQ79OT337l/DHPwjn9fFGzBnFgSyAFz6geY
       "], 
      Association["Book" -> 8, "Theorem" -> 26] -> CompressedData["
1:eJzlUcENhDAMc4Mi8gDJ5dUHn1uJEViAWdmIJgUEYoSzKsd28kjU37otawKw
O/0V5mnKJJEzZRxUu0712R8AscuUgr6vNUbUQVpD1Y3VashC3klwGwIN1dDU
GGj6ZG+Ge68QeH+MSIpMvgclf3TlayejHRTRBwU=
       "], 
      Association["Book" -> 8, "Theorem" -> 27] -> CompressedData["
1:eJzlkYEJwzAMBGXFIjIJvB3wAF2pI2SBztqNKslJacgIPcz7/2UMxo/99dwT
Eb1d/ottq601qrXyumSZppx/xwsRlzP0TvNsu0iIAZSijvmholaiA98mdBwi
KFmAiiIY/lAfRor7r1w/hjlFx/f3JF9wp54U+gHqsQbJ
       "], 
      Association["Book" -> 9, "Theorem" -> 1] -> CompressedData["
1:eJzlkdEJwzAMRM8yIhiOCBzIf6ATZYQs0Fm7USW5LQkZoQ9z1p30IePteO5H
AfAK+TMe3cHau9istVbVc3cGhF/TGqbJ7xzRgFwWC7wequYhG/lLUscQaHBD
U2My6o9GM911heT6MSIlM7k/p8TJjS2c0d5JIweO
       "], 
      Association["Book" -> 9, "Theorem" -> 2] -> CompressedData["
1:eJzlUUEKwCAMixURoWziYfd9aU/YB/bW/Wg2OlD2hAVJk7Rgxf28jtMBuI3+
hlKBrRRZl+C9D2FsLoDoa1JCjLVyJBhUc1YDtTIykYaE3IagCprmtevO1qSb
VyDmjxFxzOT7GmeHGxu5escD94oG+w==
       "], 
      Association["Book" -> 9, "Theorem" -> 3] -> CompressedData["
1:eJzlkdENwyAMRA8DyiE3H8lHfvKVjtQRskBn7Ua1TVM1ygh9QsfdGSEhtv35
2BOAl8vfsa4blvsi81RyzqX8ziZAbkdoDcNge60hBjmO6pjvWtVKNvLbhPZD
oMICtSqD7j/qw0hx/5nzx4ik6OT6mOSL7tSTUt/lNAar
       "], 
      Association["Book" -> 9, "Theorem" -> 4] -> CompressedData["
1:eJzlUdsJgDAMvKYt5iOCgv74I365jyN0AWd1I5P4QHEEj3C9u4SS0qmsSwkA
NqP/YR7RDT21TYoxpvRsNQDJZZhRVXrm7KRgrmsxqD44i4aGO3E+hsACNSxZ
riHTJ1vTnd//xvtjiIJn9H1LsGJTtnYQlh3X9gas
       "], 
      Association["Book" -> 9, "Theorem" -> 5] -> CompressedData["
1:eJzlUUEKgDAMy7oNe6gwQS9exC/5BD/gW/2Rbaei+ARDyZK0jI7N67asAcBu
9ENM6MeBupJijCk9OwUguQwzmkbPnJ0UzG0rBtWVs2houBPnOgQWqGHJcg2Z
Ptma7vz+N94fQxQ8o+9TghWbsrWDsBy2JAaF
       "], 
      Association["Book" -> 9, "Theorem" -> 6] -> CompressedData["
1:eJzlUcENgCAMLAVCHyXBxI8PH67kCCzgrG5kW8RIHMFLc9xdG1LCVo+9OgA4
lX6JeV1wKsF7H8I7LwDI3RBBSnLGaCQgypkVohtHllDxJMZtCIhBDHHkPqT6
Zm2as/tHjB+D6CzD70OcFqnStR0TX6FTBm0=
       "], 
      Association["Book" -> 9, "Theorem" -> 7] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 9, "Theorem" -> 8] -> CompressedData["
1:eJzlkdEJgDAMRNO0RXoQ/FE/9MuVHMEFnNWNTFIVxRF8lOvdpRRK53Vb1kBE
u8k/GTqexhRjTOlZj0TcX6EUahrdc3ZRgLYVQ33VLFqiAHfjWg8RhDRAssCp
/lQbevL737w/hjl4x993BFswJ5YEcgDJ6QbG
       "], 
      Association["Book" -> 9, "Theorem" -> 9] -> CompressedData["
1:eJzlkdENgCAMREuBeKQxwT9/XckRXMBZ3ci2qJE4ghfyuDsIgbBs+7oFIjoM
P9XMU00xxpTebSViuUMpNAw65+xQAeMoJvWNWbREAZ7G2TYRhDRAssDV/EVb
9OTn9+o/hjl4x99nBBswZ9cOAjkBaIUGGg==
       "], 
      Association["Book" -> 9, "Theorem" -> 10] -> CompressedData["
1:eJzlkYsNgCAMRMun4UIanMGVHMEFnNWNbIsYiSP4Qo67oyEhrPux7YGITpO/
EpeWU0o5v8umfR0BoFJ0Z3ZRgFrFUN+VRUvjaVz7EEFIA4RlDJm/1Q49+f0z
88fEGLyL31cEWzAnlgRyAVgtBhU=
       "], 
      Association["Book" -> 9, "Theorem" -> 11] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 9, "Theorem" -> 12] -> CompressedData["
1:eJzlkesNgCAMhMtDjvvtArqSI7iAs7qRfagJcQQv5KN3bQiEdT+2PYnIafiv
ljqVUusQieT5MaT0rntrDhXAkNbBRg1B4E2cMSSgqAGbDpiivmlNd37+qPFj
ck6e5e8bki34bc0RvABNygXg
       "], 
      Association["Book" -> 9, "Theorem" -> 13] -> CompressedData["
1:eJzlkd0NgCAMhEuhudwMPrmAwziCCzirG9kWNRJH8As5ekdD+Jm3fd2KiBwh
P6a1Wlt7P8IiotNtSAF8NktxAHa87mr0EASeJLU3CShuQPOGoNeXxmK63H9k
/BjVkpl+r1BiIE8bjuAJLwAFrQ==
       "], 
      Association["Book" -> 9, "Theorem" -> 14] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGMuBkYREVRRZgY2BgYoZxmJkZWFkhNIggHTCAEV55
KAMdoEYMExMjWIwJ0weMIATWDyIYgWYBAGEgA/8=
       "], 
      Association["Book" -> 9, "Theorem" -> 15] -> CompressedData["
1:eJzlj7ENAzEMAyWKpv/xRZoAqbPSj/ALZNZsFMpdikyQs0FLoi3Bz+t1XhkR
75a/ppL8aUqRiT7nHAVwL+vicRdQknZKx9F76wqbGnRYTgrEJnFjLVRdsYkl
ROdmelpmRQJjBCrcKxSOCJFwo17rbX7Bm6f4F+2lbzM/o8cFAA==
       "], 
      Association["Book" -> 9, "Theorem" -> 16] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGNuDm5kbmsgAxPFCYmRlYWSE0iCAdMIARXnkogzLA
jCCYmQFFogPT
       "], 
      Association["Book" -> 9, "Theorem" -> 17] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGOODiQuaxADE8UJiZGVhZITSIIB0wgBFeeSiDMsCM
IJiZATr7A8Y=
       "], 
      Association["Book" -> 9, "Theorem" -> 18] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGOpBE5rAxMDAxwzjMzAysrBAaRJAOGMAIrzyUgQ5Q
I4aJiREsxoTpekYQAusHEYxAswBFMwPd
       "], 
      Association["Book" -> 9, "Theorem" -> 19] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGARJgY2BgYoZxmJkZWFkhNIggHTCAEV55KAMdoEYM
ExMjWIwJ08GMIATWDyIYgWYBADDjA8Q=
       "], 
      Association["Book" -> 9, "Theorem" -> 20] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 9, "Theorem" -> 21] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAQMjI4RkHA7hAQCZkQKA
       "], 
      Association["Book" -> 9, "Theorem" -> 22] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGARQMk8AAAJWpAns=
       "], 
      Association["Book" -> 9, "Theorem" -> 23] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAQwwMzMPtBOoAACbzwKD
       "], 
      Association["Book" -> 9, "Theorem" -> 24] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAQgwMg6XsAAAlz0CfQ==
       "], 
      Association["Book" -> 9, "Theorem" -> 25] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGARgwDpewAACWcwJ8
       "], 
      Association["Book" -> 9, "Theorem" -> 26] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAQQMl7AAAJWqAns=
       "], 
      Association["Book" -> 9, "Theorem" -> 27] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 9, "Theorem" -> 28] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 9, "Theorem" -> 29] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGARwwMw+0C6gAAJl9AoA=
       "], 
      Association["Book" -> 9, "Theorem" -> 30] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAQKwDLQDqAAAl/ICfg==
       "], 
      Association["Book" -> 9, "Theorem" -> 31] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 9, "Theorem" -> 32] -> CompressedData["
1:eJztkd0JgDAMhNNAOG4Gn7qSI3QBZ3Uj86NCcAU/yvUuCSXQuY59DRE5Q346
uj2OFMBvsxQHYOG+1OhFEHgrqTUkoHgAzQeC8rdGM1O+3+kfozqypt9tRxzk
tpEIXvkBBWg=
       "], 
      Association["Book" -> 9, "Theorem" -> 33] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 9, "Theorem" -> 34] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 9, "Theorem" -> 35] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGARpgHWgHUAYAmJ0Cfw==
       "], 
      Association["Book" -> 9, "Theorem" -> 36] -> CompressedData["
1:eJztUckJwDAM86EM0pU6QhborN2okepPCNkgwghbGMngqz93dzN7SQc7ACyz
TNEA0BoI9ZCUmBXxL8mBQ+1UXyz7lMkSPT8mwqXFeqOzoGs5jYwPQ4QD9Q==

       "], 
      Association["Book" -> 10, "Theorem" -> 1] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 10, "Theorem" -> 2] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGASZgHMIBAwCWWQJ8
       "], 
      Association["Book" -> 10, "Theorem" -> 3] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGARbANNAOIB8AAJZYAnw=
       "], 
      Association["Book" -> 10, "Theorem" -> 4] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 10, "Theorem" -> 5] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAVbAzMzNCAYMDIzkAgYwwisPY2EFLEDMxsAANgbE
YWRiYmBiApIwXUA+CAAlWVggKhgZAdrxAzA=
       "], 
      Association["Book" -> 10, "Theorem" -> 6] -> CompressedData["
1:eJztkdEJAzEMQy3bVTjCfXWCrtQRboHO2o0q5ejP0RH6IALJwgnkcbyeByLi
bfnzm3uS+5wdMc0wZHcVAHmpI4xe1OYkh7VuMDOIUIkYOEuZifwmHsopUvPC
+THUaTtgW7uqoiqrmIvQUzxlUCtV960fqYEEyw==
       "], 
      Association["Book" -> 10, "Theorem" -> 7] -> CompressedData["
1:eJztkdENwyAMRO847AYzRVfqCFkgs3aj2CRfkbpBn8QJni2DxHs/PjsBfCv+
/KC5zTkFRIwxvDDrXSIZ4Zml6H3RtjKpMmS1ZcCINEbn1bT0bVTFm/G8+/oY
z9XrRG7wbJcgNcnaAvmUqhr8lWVf805/MgR/
       "], 
      Association["Book" -> 10, "Theorem" -> 8] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAS7ACAYwmgzAAEZ45WEsrIAFiNnADmFgBHEYmZgY
mJiAJEwXkA92KAMLCwtEBSMjAM7BAx8=
       "], 
      Association["Book" -> 10, "Theorem" -> 9] -> CompressedData["
1:eJztzdEJAjEMBuD0mjSJnq1JFeHgPERwhQPHcIB7cAFndSOrPruB38MfCPnJ
6f643QMAPN/x99PSIW4ARHLuh1L2dZoul93O3UWqu9XG6jjPyzwLvzft0sz6
vngbzJOz13y0atcPlsHVS5m8+GY0bgUXVc3tWYwdhC4QQSTIXUCMGgQTEkUi
JKUQq6EmTEnwK0VRWfEZnPstKnI+rF8zFQ0Q
       "], 
      Association["Book" -> 10, "Theorem" -> 10] -> CompressedData["
1:eJztzdEJAjEMBuC0TdpEewlnEUFFq+IQBafw2adbwFndyJ4+u4EfJIGQn5ym
531yAPCa299vHlEBmFXz3mxdar2ex9HMRIqZlk7LtrVHa5zmTb/sch6sD07V
khU9atHbR+KdSY/XXsNBUw8Yi0h/AiF4cN4RQSBQ7xCDOMaIRIEISciFMqJE
jJHxKwYWXqQLrFIeUDAtN/IGzSoMYg==
       "], 
      Association["Book" -> 10, "Theorem" -> 11] -> CompressedData["
1:eJztkcERwyAMBE8+kDQyjwwdpCWX4AZSazqyRPzKIxVkB25gJQ0PnufrOAXA
u+LPDyIc2GPO6YVZa72THMMzS9FVteWKMqkyutWRA0akMTp1sfRtehVv5ve7
n4/R3K1uIg+oQFgjGxnbAhSpqkFNme3ZwAvbCAVU
       "], 
      Association["Book" -> 10, "Theorem" -> 12] -> CompressedData["
1:eJztkTEOwkAMBNdxyGkV3J4CbsilzWt4Qj7AW/kRdpKKghcw0lnavZFceN5e
z00AvHP8+cXaA8tC0hMzchwjteZnR7/u3B7ZVM853Zk0GBGS0XlItYZyNlN+
RqoBv9ceh7nE6zOJEEUgqlDtVIduByqSrqEMJQxIbP0A6XYJlA==
       "], 
      Association["Book" -> 10, "Theorem" -> 13] -> CompressedData["
1:eJztkcENwkAMBNcYcnbwPU8ByxKKriNKSAPUSkf4kh8PKmAeK3m90j523V7P
jQC8h/z5yRnoq6qGe4SZ6jyLSO+RGonEtVar9fYYTnNPXe4y6DCBhJuE1J3W
MiKejssynnm1RL9bj2EuRz+ISFEIxAzmE/N02gETjayhTEVQQNn6AbnxCTw=

       "], 
      Association["Book" -> 10, "Theorem" -> 14] -> CompressedData["
1:eJztjcENAkEIRWGHAYa/TlbjTS/GmzdbsYRtwFrtSGYasAEfyQt8SLjt79fO
RPQZ+vOL1twdwIrez8dt8wgzpDNDKgZWMXNLP4YiXLpLIE8xTwFtB7fc9Eye
Y+kt7Hq/eH5hZsqqlUgIOSy0skpkXhYp6kLlFNYkKTJhVdGq7rRqSPYQK1/J
xggD
       "], 
      Association["Book" -> 10, "Theorem" -> 15] -> CompressedData["
(… the compressed per-theorem data continues here, one entry of the form Association["Book" -> b, "Theorem" -> t] -> CompressedData["…"] for each remaining theorem of Books 10 through 13, ending with Book 13, Theorem 18 …)

(* plot the average shortening per theorem, grouped and colored by book *)
Module[{dataA = #[[1]]["Book"] -> N[Mean[#[[2]]]] & /@ res, vals, acc, xval},
 vals = CountsBy[dataA, First];
 acc = Association[
   MapIndexed[First[#2] -> #1 &,
    Accumulate[Values[CountsBy[dataA, First]]]]];
 xval = Association[#[[1]] -> (#[[2]] - vals[#[[1]]]/2) & /@ Normal[acc]];
 Show[{
   ListLinePlot[Values[dataA],
    Axes -> {False, True}, Filling -> Axis, Frame -> True,
    FrameLabel -> {"theorems by book", "average shortening"},
    FrameTicks -> {{True, False},
      {{#[[2]], #[[1]], {0, 0}} & /@ Normal[xval], False}},
    ColorFunctionScaling -> False,
    ColorFunction -> Function[{x, y},
      Piecewise[{{bookColorIntense[6], x <= acc[6]},
        {bookColorIntense[10], x <= acc[10]},
        {bookColorIntense[13], x <= acc[13]}}]]],
   Graphics[{GrayLevel[0.5],
     Line[{{#, -5}, {#, 10}} & /@ Values[acc]]}]}]]

The rather unimpressive best result—an average shortening of 7.2—is achieved with 10.33 (which says that it’s possible to come up with numbers x and y such that x² and y² are irrational, while x²y² and x² + y² are rational).
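As a quick illustration of that property (one possible pair, not Euclid’s own construction):

x = Sqrt[2 + Sqrt[2]]; y = Sqrt[2 - Sqrt[2]];
{Simplify[x^2 + y^2], Simplify[x^2 y^2]}   (* {4, 2}: both rational *)
Element[#, Rationals] & /@ {x^2, y^2}      (* {False, False}: the squares themselves are irrational *)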

The maximum shortenings are more impressive—with 10.41 and 10.78 both achieving a maximum shortening of 165.
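A minimal way to read off those maxima directly (a sketch, assuming res is the same list of Association["Book" -> b, "Theorem" -> t] -> shortening-list rules used for the plot above; maxShortening is just an illustrative name):

maxShortening = Max /@ Association[res];   (* largest shortening found for each theorem *)
TakeLargest[maxShortening, 3]              (* the theorems with the biggest maximum shortenings *)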

CloudGet["https://wolfr.am/PJKo9Lnq"];
res = {…};   (* iconized Sequence of 465 entries, about 5.4 MB: each is Association["Book" -> b, "Theorem" -> t] -> CompressedData["…"], spanning Books 1 through 13 *)
SBSEpJIBxwQzdWc9Z/D3XxjEoMswS75aFZfb/S/Y/Ti2BjZRRlgmh+1X/RSf
DOtgM+z55C6Kb4ZN0bXVKBPTbPaUO0hplGuYay4Lt2VyRiRrOXwDqMED5g==

       "], 
      Association["Book" -> 3, "Theorem" -> 19] -> CompressedData["
1:eJytUMsVwzAMEoj03BW6UkfIAp21G1USTnrLKdgPf8A21mv/vHdExLfpbiCT
grAxlSgCCW35ZKuKvDhaIIxxw021VJZye9ZrtWU6llFfo0R2SHPlzZHi2GEP
HEZlPnlM8d/hUIy3H5Lgx9xtr4kyfTMDXHVZeKyQLlHZulo/nKgD4w==
       "], 
      Association["Book" -> 3, "Theorem" -> 20] -> CompressedData["
1:eJytT8kNAjEM9DmOIyHYD39aooRtgFrpCHsBIR7wWsenZuJMLuvtujIR3Tvt
bueTsPFCB5rhbiLiQ+aUQcIs5L8fzczhwKialgkgQcjUclXDsrfU+IsClbRr
RMLMXd8W0b2XaR2Hd7RGs4btRapQr8E36kZXaHM2RjTSXeuo/VQDjMQowXoU
KnJ9G5+VXP5lKGV15QmasdIDpmQGpA==
       "], 
      Association["Book" -> 3, "Theorem" -> 21] -> CompressedData["
1:eJytUDESwzAIA0uAr70M2bv0S3lCPtC39kcVuQxd2ik6DLaMjU7P/bXtbmbv
TtfjYUZbfPV1ZnKMkdOXhdPg7pa/h0YECVA1JgUQpmNGpID71Urj7y2gVIqS
HiiQWdVKJAgtSEgcBVHN8FBacTZpJZTPHjRTSTZfnRQs/a9B7jQM733ICbeb
KPkh/9hPCWc528MvMEo+ySTZ5hke9gFq8AaS
       "], 
      Association["Book" -> 3, "Theorem" -> 22] -> CompressedData["
1:eJy1UNsNAkEI5DHccIm5DzuwJUu4BqzVjhwuGr/0SyfswDKwS7jst+vuZnYf
+gtg7Sc/ryQigotvG9oUu3V87KoqIBPytUJIpOnKKpJL8tdj1lc1U9Rm1P+V
siS7OdBFnALzcFmcDI5Ju55FOpTyqslDJTD5HpKhu2cOd1iGs2asKrfFwrUP
7Q/TinS0wnjDJekBqFGrqmDF4g9bKQa7
       "], 
      Association["Book" -> 3, "Theorem" -> 23] -> CompressedData["
1:eJy1UNsNgDAI5GiNJPy4gis5QhdwVjeyUE3rI/1S0vIoBwed07okENFm6g/R
qKykR8REsBOyj9Ap40ZyABRrt6S/nrO/f+WzSUzXQvdhBpYFXpqVRwc6FHcE
WkeEaXDKyaij8EicExLOL3usjzLCte0OVh0Dbw==
       "], 
      Association["Book" -> 3, "Theorem" -> 24] -> CompressedData["
1:eJy1ULENwzAMI0ULGTt06dqXckIeyK35KLJQ24WBdGoImBJoi5L83vZ1I4Cj
0i0oLz7xAEEzGoMhCpLJdVnFQNIUwwBWs3+P+duw9eMXgFglpWnergzfvB3c
9vkoGJTHnfFHNVtqaXF66oqGacv+vEFtNsSvqigidAJ/wQOf
       "], 
      Association["Book" -> 3, "Theorem" -> 25] -> CompressedData["
1:eJy1UNsNwkAMy8MehJUYoQswazfCDqJUquhX69NZed3FyWN5PZeMiNV0D5o6
qExUiRvoThuJ/01JUIU08CGEDEW6W97VKnGedboty1LgEb7Y23YPEcOquT0Y
TMSlU73lPVjVXFITB6MgCrUtC4EXU9D2kD9on7Mn5PyYYHa8AW+eBWc=
       "], 
      Association["Book" -> 3, "Theorem" -> 26] -> CompressedData["
1:eJy1UEEOgzAMsxMHrrvwgH1pT+ADe+t+RJpRKiFtJ7BUp3Ib1+lzfb9WAvg0
ugm24AGCZjTSDU6HZD7pZw8TRaeaBrDmdXXI//OTI1ZHDmZf6ZT3UIZvnQ7u
8+wKBtWKYP5R282tVcEo3fPBsuVxvcN7NmiSQlmhDWSRA5U=
       "], 
      Association["Book" -> 3, "Theorem" -> 27] -> CompressedData["
1:eJy1UMsNQjEMs5v/EwNwQmIlRmABZmUjkl44wQncNHVrS7F6vT9udwJ4Tvsb
zkiPUHLFoZlMkBQcn4dWpXumV0OqIrISubk08vLriPldHVmByVRp5i6iKjtK
NhdvSC/v2GrzYmbd1bZpUpd4X3xbxzR8PCY6Yp+mqnuQElS6YQlq0U4Cuo88
u9Fmdr2xyIij/xijKqfwApWFBss=
       "], 
      Association["Book" -> 3, "Theorem" -> 28] -> CompressedData["
1:eJy1kNERwzAIQ3mgxgNkgqzUEbJAZ+1GxZzjXHrX/kUfMgYJg7f99dwxs3en
28AKhjsOyOVhEtHiv+vr7EgH4bcO+xOXV3MZInKdXijOuk/pkacoLznz5NFh
ZigyjkaSmypsXSeZyhYMCUN/4vzJpcVjEZmJDzBcA2Q=
       "], 
      Association["Book" -> 3, "Theorem" -> 29] -> CompressedData["
1:eJy1UMENAjEMc9rEORBPFmAlRrgFmJWNiNuKe8ELrDRN5aS2ctsf990APJX+
B7sw081antupAlY1ts+iEd7D6wg+Affw8Qq//tphfGVLHCCQKTuk7Mzsb3sH
BhGDX026uIiFGssxzMWERigftRxYR5agI5rZJgvJqjgVG2nsdqAo/aclaXOm
wAvohAYY
       "], 
      Association["Book" -> 3, "Theorem" -> 30] -> CompressedData["
1:eJy1UNsRgCAMS0qqc7iSI7CAs7qRyEPhw/OL3LUNNH1ct3jskQDO200EkQdY
YqRZgMTg+qkZY+WpweRlPzBMZUPhJc8uWzWoIhKPb5r2U2RgayQZlOmazKT6
CnxPwn4HdjdZXO6exAgX8f4DNA==
       "], 
      Association["Book" -> 3, "Theorem" -> 31] -> CompressedData["
1:eJy1ULsVAjEMsyPLse9WuIaVGOEWYFY2QuFR0EAFKpzETvTJ5bxdTzez+yr/
xF7FMUbNcRyYBvdh+Vk0kxFApLBTAGFgdmZ3J7Zf++PXKaAizSr5gdyge9t6
QYfuglB4LuBcnSCrqqnteiXTDXVed1aUro4QQW9rWNmxoknIXVkllstWwlVh
SQ53/UpkhFFkeMMQI4sMQ8iA7+nTHpPZB0E=
       "], 
      Association["Book" -> 3, "Theorem" -> 32] -> CompressedData["
1:eJy1UEEOwjAMi+0wbeKE4AN8iSfsA7yVH81Zq2oc4LZItRI3cd081/drRUR8
Ck6N24OZvF+xzJjjIiGkfwPoJ45dzqlk5JlWfwSPhUSQwO5yR5sdTgfPgir8
3YFdYTAsqIpkm5ZTmK03BaQ7mmCTsywnL+IrukFfetETvKUNkrQD1A==
       "], 
      Association["Book" -> 3, "Theorem" -> 33] -> CompressedData["
1:eJy9ULkRwzAMw6OcipSpUmYlj+AFPKs3CinqHLtJl0ASxaNICIfXui0rAewZ
fosnKT3u7B0N3Sbob/2cB+cuxZaY999x+TNVDEReESdVR12oLmqUQvlhdOUZ
iyrnNRkcaYwMA5y8HoT1HKSxHPZ9IHsaRdyk3ujGN189A58=
       "], 
      Association["Book" -> 3, "Theorem" -> 34] -> CompressedData["
1:eJy1UMERwjAMsyT74MWLBViJEboAs7IRllt68IBfdRfFseTYyW153BdExNN0
MEjxesH5hIqSENQ/O7YVn66OOaVHDvoDXz0zCRKYKYd72Hyre54mIyeViP2j
19hMiz6RW4/sEOxUO+a1nAtbpgXvFRpwkK7U2lpUFZR4AVXRA6o=
       "], 
      Association["Book" -> 3, "Theorem" -> 35] -> CompressedData["
1:eJy1kMENAjEQA3dt74YfHxqgJUq4Bqj1OsKBk7h7wAssxYo0jmL5utxvS0bE
Ou3fupwBZLEZJWVIH6MliKQdUHeVrwGqJTVLP6+L73RiRnQDhSnSxVzuZZu4
84PGIMYBkbvwE25s+w1eR5GMaWROmj6ewsoZV75VmYJH9Ui2wjj56QMvfgTz

       "], 
      Association["Book" -> 3, "Theorem" -> 36] -> CompressedData["
1:eJy1kMENw1AIQ7ENzRA5ZaWMkAU6azcq/pHaJFJ7ai2EkDDiwbLd1w0R8XD6
u2YCTN4YUxWj6qNTAi2oJ+gEBKCSVEzp12jf74fb7GiMhgEGnTQKvHQoz+pr
URcD33lv7vIimieZAYUCmeoXcOxS700PFXRQmcU47WbHZM8TsVAENw==
       "], 
      Association["Book" -> 3, "Theorem" -> 37] -> CompressedData["
1:eJy9UEEOgzAMs51UgMQn9qU9gQ/srfwIhyIx0LTb5rRWGketm8fyei4EsBb9
HhK9mjBPo6DxSyNYgOCkqJdD8k79x+4VLCsoR3Xo9nrpreMzMom8NfDkLp6K
76yfO8LEaB6Bw7ODmVlvNsQO7UhLhzNnmYMaYwM1bQN5
       "], 
      Association["Book" -> 4, "Theorem" -> 1] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCLoB+hr20gCyCELAJZUAnw=
       "], 
      Association["Book" -> 4, "Theorem" -> 2] -> CompressedData["
1:eJy9UEEOgCAMW1uIHP2CX/IJfsC3+iM3IEQPetMmNGN0W9my7esGMzuC/sCM
MiFblmDUmxL92FXlMZW89EuPD7jNlAgSqC4ru9nhdOQZFFBN+afHolsczHiM
G9lnJA9BTzUFjbUhWzt/MuZQD6SoVBstppLhWzoBK08Dgg==
       "], 
      Association["Book" -> 4, "Theorem" -> 3] -> CompressedData["
1:eJy9UEEOwCAIo+D8h1/aE/zA3rofraCJ0WTLLlsjHLAFSqnHXiEip6dfgLwh
i6qC6ZHYQ7AWQemHG74Cd4iYi/dsDD8TuTtclIm3AS/kEyzcojn3TvxgwUQx
kILZRhlfqC/aVwL5
       "], 
      Association["Book" -> 4, "Theorem" -> 4] -> CompressedData["
1:eJy9ULENgDAMix1VYmBm5yVO6APcykfEaSlMTAgrcpo4ldOudd8qzOwQ/QNy
SrPlbcjdCZKRHUJURjhaSf98q3dVMpRjJRFxoZ2zlRsrc/TBPpQv6MyhtQ4e
JKNSIDPAPMIkyB79L2LMWtyYQ6frevqlbidr2wPF
       "], 
      Association["Book" -> 4, "Theorem" -> 5] -> CompressedData["
1:eJy9kMsNAkEMQ5N1vrPUgERLlEAD1EpHOCDECU6Id7AyTqR4crpczxcVkdvI
nzhaCVQ3qc9LI9wMsCC7EzgEHntE78uxfh3Kv3YBSk8sgmGtzu7MZDTqWIl+
tOzlUGskM2Jl0OrEY3SGns6i01OmJ6oquEiVfxVJZjIJKMNtEmabqj1vIZbg
+w1r9/I5UsBKD64pdwD4BpY=
       "], 
      Association["Book" -> 4, "Theorem" -> 6] -> CompressedData["
1:eJy9kMEVAkEIQ2ECgV2rsCVLsAFrtSPD7smDnnzmwLwHA/lwvT9udzez54R/
aZXBfRk/m5IZAQSlS0pIGJJNdjex/5opv1YBBXlWiQdD1r33wFFok5OIwvFy
MhEDvx0bSH00lko4P1Wz1NwMnpOKMavJyF27mm01WIQr6ljkcs8ZG2FZK7He
pFm6kyFk4zud9gK8vwZY
       "], 
      Association["Book" -> 4, "Theorem" -> 7] -> CompressedData["
1:eJy9UNsRgDAIC4EP13AlR3ABZ3Ujgfj60S/PtJe2kAu5jvMyzQZgLfoNUTTA
nxWWoAnsglbkMzw7X0d6N+x5VCzBnYwgK6Q483q3cFRYB5stM5/cIlwVNqG1
NSjCNExb8ryEu5wJK+dbwKEExP5Fnl553QAxzAOc
       "], 
      Association["Book" -> 4, "Theorem" -> 8] -> CompressedData["
1:eJy9UMkNwzAME0klmaMrdYQs0FmzUSXa9a99FSFgQgch0Xqcr+eJiLiabgQB
xPF9abVZmpZBGIgRU9qQ/zb0+//oNm1rQiL70U7tk8spVp2mSkQsnhNWhSZY
60VkUMhaqHBZPsVnZufcoWzIKC9A+kjqK2Vp4w1G6APr
       "], 
      Association["Book" -> 4, "Theorem" -> 9] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCLoDNjpb+XQAIyMjGAMZIEwGwOMxQiLJdyxxYbEBgCj
JAKa
       "], 
      Association["Book" -> 4, "Theorem" -> 10] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCLoCWTFGBlZpHFKMzIywmhGZhDBwAjmMYDFmRiY6eNK
VDdBnIMWUih8JiYwwtTKCNaOaSAGkxFhJCMTiMvIwMTICvI4E8xuRjS9YIsZ
gaqZgXZDLAJyuBk5ADnDAz4=
       "], 
      Association["Book" -> 4, "Theorem" -> 11] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCLoCnQYGRlNccoyMkEcBFTEyMwEBAxMQB6QYgSJQ3hU
BUQYyAhyDhMTWkgxIvNB7sTmOJAizBBmxMJkQjWSEeRXRmawCCgs4GphAlDA
BETAQANiaJAxiDDzAgBbEgNx
       "], 
      Association["Book" -> 4, "Theorem" -> 12] -> CompressedData["
1:eJy9UMkNgDAMi50vH0bgwUIdoQswKxsRA604isQDYfWyY7VuhjylDDObtfwL
AONzkSgmOAPGYLFB+sa+TfPKApAXJ45cOVvhZLq/gMaR5yuhv8JXRb2o3iLs
YIxoWkypdFrv3QIHRwMx
       "], 
      Association["Book" -> 4, "Theorem" -> 13] -> CompressedData["
1:eJzNT8ERgDAII8HryyVcyRG6gLO6kQRa/fnyYY4rJaSEbv3YO8zs1PEjuDtB
MrJDiMoIR5Xk14bvA9MPyrGSDmKi7knlxsq8eZRQW3vyQzN6xUgwtTJqDTID
zCNsfFdTpswqHqzR56Ln6Zd9uwA8hAOp
       "], 
      Association["Book" -> 4, "Theorem" -> 14] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGweAAjIyMYAxkgTArA4zFCIsl3LHFhMQGAJkUAo4=

       "], 
      Association["Book" -> 4, "Theorem" -> 15] -> CompressedData["
1:eJy9UNsNhDAMSxOnrZAY4lZiBBa4WdkIu0jwgbgvdFbk5tE2Tj7rd1mLmW2i
vwOPlYhAzQTPyIPS6DR6jYXpbSnxW6iU1iGLAgiIagWoECdy8D3DFEe5wjMz
fuKgKtIYHjrcwi2hDYWXou5sWySEFxOO5GO/ELTQtqzPvc09FewpGAWI
       "], 
      Association["Book" -> 4, "Theorem" -> 16] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGEWBkgjiIEQiYmYCAgQnIA1KMIHEIj6qACAMZQc5h
YkILKUZkPsid2BwHUoQZwoxYmEyoRjKC/MrIDBYBhQVcLUwACpiACBhoQAwN
MgZeZm4A0RoDBA==
       "], 
      Association["Book" -> 5, "Theorem" -> 1] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGBgBtZmQEU6OAugAAmjwCfg==
       "], 
      Association["Book" -> 5, "Theorem" -> 2] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGAjAyMgyU1cMeAACY8gJ9
       "], 
      Association["Book" -> 5, "Theorem" -> 3] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGBDAxoFsN5TOCAJzPCIKMaEroAhjpax1ORzCg+J6R
EYpgYoxwBtzBAKqrApo=
       "], 
      Association["Book" -> 5, "Theorem" -> 4] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGCKBbzQQRYQIBCB9MACETIxQw0dF5QEuh9gKdSi5g
ACO88jAWdkfAwgEUXGAOODjgIcIACSqQJFAMpAIoCAD6HgM0
       "], 
      Association["Book" -> 5, "Theorem" -> 5] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGDDACbR9QBwxPAACY4gJ9
       "], 
      Association["Book" -> 5, "Theorem" -> 6] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 5, "Theorem" -> 7] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 5, "Theorem" -> 8] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGDjAxMoAgCDAykukURqhuRjCTEWweI1iAEWomVBIF
MEAxRAuacYxQxsAGDjKAuZEoFwEAzJcCtg==
       "], 
      Association["Book" -> 5, "Theorem" -> 9] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGEDAyoDmAkRFKARlggiBgAhNAAGYCKTCDCQqQJFEA
AxRDtKAZx4TFGnoDVJ8zMTGCxZjQJcB8RhQeAwD8LwLx
       "], 
      Association["Book" -> 5, "Theorem" -> 10] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGFDAyMDKCSAoMABGMUFOAFCPEUAhAkkQBDFAM0YJm
3IAHCgaAuZEolwEAxeQCsA==
       "], 
      Association["Book" -> 5, "Theorem" -> 11] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGOcDtRKAMIyNWFXARRkawCoi6EQIAoP4Chg==
       "], 
      Association["Book" -> 5, "Theorem" -> 12] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGGjBiZRJSOgqgACNMAJitAn4=
       "], 
      Association["Book" -> 5, "Theorem" -> 13] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 5, "Theorem" -> 14] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGGjAyM7AwMzICncKMKclEf/dAACMYDQ2AGkgArtoC
kQ==
       "], 
      Association["Book" -> 5, "Theorem" -> 15] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGHDCCERAzMjNiAKA4AjDT01VgN6G5AEkWLs2IJgLT
ywCVYWSEsWBsFBEoRLcdTsINgRiKrhIYJlAnMkDNAQDfGALi
       "], 
      Association["Book" -> 5, "Theorem" -> 16] -> CompressedData["
1:eJzNj8sNgDAMQ218YA5WYoQuwKxsRJImrVDviBysfJ5k52jX2QjgdvlBaTch
Ni4lgiPkp2klc5YsgLXvUHF1jaGWPSdZYhASQlH5R26m2fIZh2bX+XLiwHxU
rhjwA/bZAw8=
       "], 
      Association["Book" -> 5, "Theorem" -> 17] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGAWCEuGOQuGaIAwCYwQJ9
       "], 
      Association["Book" -> 5, "Theorem" -> 18] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGA2ABEcxYXMMIIRgZYexRAA4HRjBAEkAAAKXMAo8=

       "], 
      Association["Book" -> 5, "Theorem" -> 19] -> CompressedData["
1:eJzNj8sNgDAMQ22isAcrdYQuwKxsRL5SJcQV4UPquD68HPMckwAuH38RsRsO
Xbm6tnDsxodSNRa16VZE2Kpff2PtXBJ7gVdUCd2qOyJZz5F3DnaDz7BsQnR6
A+3xAwc=
       "], 
      Association["Book" -> 5, "Theorem" -> 20] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGB2BixS1FR2fAAQsYDQJAsu8BpmcCjQ==
       "], 
      Association["Book" -> 5, "Theorem" -> 21] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGCRhETgEBxkHnImIBAJecAn0=
       "], 
      Association["Book" -> 5, "Theorem" -> 22] -> CompressedData["
1:eJzNkFEOgCAIhmEw5zW6UkfwAp21G/XzQ1s110NPfSIiiArL2NahIrKH+g0W
ygMaqTCaFT5JYgUKaGKhocUleENqZsrjOjq6Wc9X8bGvCOU1flpTouDGEtAc
bBStcTdIZWWrIghfnIDzAMxHBKU=
       "], 
      Association["Book" -> 5, "Theorem" -> 23] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGF2ACATADRDAyAzETIxQw0dMhjIxgAkQBbWaCOAJJ
Fi4NFQQ6E0kcIgQ3iBHJUEZUETDE4TOQIqgUEy41EGVgeyEqANTtAtc=
       "], 
      Association["Book" -> 5, "Theorem" -> 24] -> CompressedData["
1:eJzNULENgDAMi2uHP3iJE/oAt/IRTdIOSMwID5bjWJWbvZ9Hh5ldQf+CB1IU
0UiXOwD5hz3atjUFaKYHyOgiFQvTRjrFTAmZYBldKZSfDmM5ofcS8WGWVA1x
jgEljwFrOS4UifHWDZerBEk=
       "], 
      Association["Book" -> 5, "Theorem" -> 25] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGEcB0DhMOcdoDFhYGRkYWIAliMjExMUIAXBbEAgtA
hRiZwAxGJLeyMEAVoXkAKoIkyETQOagqGFGYjCjGAwDJtALT
       "], 
      Association["Book" -> 6, "Theorem" -> 1] -> CompressedData["
1:eJzNTkuuAkEIBKuhYXq6x6TNy5vE8bfwQB7BC3hWbyTj5u1cuXi1qAAFVZzv
j9udiei50n/CqW/HWrfLflmGw2We20+h8XeWeScCr/7tvPJZXWUlmqber6NZ
cZEUAFCrBbubw8W9RDWsE1EJfhMwNUGDR+cY3rAiGGEmsHWpASVOVHsEpbSh
DUONktKRWTQsTIuriuYwzixZPPEfwNx3rfdM2TLis8KgFwDKCSk=
       "], 
      Association["Book" -> 6, "Theorem" -> 2] -> CompressedData["
1:eJzNjTsOwjAQRNfO/rJeQ4IV8ZFAQkJQcRmOkAtwVm6EDQUdFQWvmGJmNHOc
77c5AMCjyV+xXQ/jOF1O5/Nyv5umsnHIu5Me1sxoY//rO/+etlgAUirlkFXd
iPCNe1/VrDc0MnMzleYQt4a8a9kJHa1ahumFpuaoEmorOWJCEpFSjxAjxNAx
AxJcQyDupFNa9cxSZ1k4iHKm+KGLMefFMAho1nqOJWJ4AmA7CbY=
       "], 
      Association["Book" -> 6, "Theorem" -> 3] -> CompressedData["
1:eJzNUMkRwkAM8yE7dkiYtEBLlJAGqJWO0JIBXvx4oMfaI40ley/77bqriNzH
81+YIqLOp3Ut1swJYlOhygwI/Dpt/i6pCqCqJpJZlUa0vbAs7L0Jb6+x4NyD
QYS7Z7A9SliTaT/GCoN5IzhgSPozEGDUOFPEZX6qQpfyTPiwHpql6QdcEhs/
zLSTeeqtLg+BNQXR
       "], 
      Association["Book" -> 6, "Theorem" -> 4] -> CompressedData["
1:eJzNjTsOAjEMRPOzHSfZxChaEAUFEohPxVk4wl6As3IjvOnpKHiSRx6PLR+X
13Oxxpj3Kn/G9XZ+lFrKqdaU7mU2+7ZRmEku9Otn8WvirHW9a1N0K6bUdrV2
oQEAxCiqooBQa701dTrhnFWHqGECBsmkkQxAbwlq1YK8huo4ckz6yHtvnPNM
xqGZnAX0zSMirBUQtsHilCGhDsEPLAZOmTmb7WGGIIAu2A996Apf
       "], 
      Association["Book" -> 6, "Theorem" -> 5] -> CompressedData["
1:eJzNjTkOAjEMRbN4SWwcMkzBUiAhUdFRcBGOMBfgrNwIZ3o6Cl7xZX/b35fl
9VxiCOE95N943O5Va722pnqXOZzqxtQNnm/061/fA1OMaZq82PoWMevBbNeL
w4yItXbX7mBns8nMu+GrutoQRC0FC3ZlH7UVIEJCs4KGbQy9q6LC42OCkFOu
HBIGyREotUx+AAAeRg0jNAEhNzGvRILCIlTC8bwH8vgE8QMqcQnz
       "], 
      Association["Book" -> 6, "Theorem" -> 6] -> CompressedData["
1:eJzNjbkNAlEMRP8xtv9prT5arUAQIFIi6IQSaIBa6Qjv5mQETPB8aub8fD2e
3jn3XvF3ut5qr/UyTa3d2+KOVVvvKfG4ya+jvhsG78M8W9PtS1LqB9XdYBZh
BlDKMA4TBqvOqiLrpvRubCusZEbG6IyBaRMLIFBlKNp6tKnkkpMFxRhdCDGL
C+Ra8CRRIzOTBRIRK3m0gsK2pLjJm0WuVYrbnxaQggP8ByvOChE=
       "], 
      Association["Book" -> 6, "Theorem" -> 7] -> CompressedData["
1:eJzNTTFuAzEMsyVRsmX5jByKogWyFOjQ5Dl5Qj6Qt/ZH1WXv1qEERIgiIX7c
H7d7LaV8H/T/cPVw/1xrjMt4KWefM6J3O33pXzf9/pBqpX3PZcuUmsV7xGmZ
tWYGoPeVvBJYNuc+Z6q8NPfkeRDgGTUsz8F8QhRQRBgC8zBTNR9uWcTMhYhV
CkkZVGG8sbAIQOlJcKVwasqq4CeqSrPuauXt/CrYREjqD5fACUM=
       "], 
      Association["Book" -> 6, "Theorem" -> 8] -> CompressedData["
1:eJzNjksKAjEQRDtJdU/+kzAMggjiTtfexCN4Ac/qjexk786Fr6HoD3TV5fl6
PA0RvYf8IbnmfOu9lHve6JTXUqv3sl+XXxt9f2iNsb2PLERBqcdatyYTADE2
1aagSSkaVWRsQkqqU4AcBAEtCRrqhD3gtRdUpHHUKXgtNXLOkbUuLGSFijW8
uNWpG6shM0thgxwRRZfsJkYQfcoh0OG8A4XFwnwAo+QJgg==
       "], 
      Association["Book" -> 6, "Theorem" -> 9] -> CompressedData["
1:eJzNjcEKwjAQRHc7m23SJG21QhAr4qE3v8ZP6A/4rf6Rm4J48+TBNzBsMgNz
XR/3lYnoWe0fWW63vpRj2Z0TxXnWSwFcGMKvd/LXNCUztVaeplNU7Tze9H29
vQGTT3a09UdEzFW2UuyADt4eHmFDI5CgahWThYgQVZ1sSKShhrltSRwtzJBG
2Os+WO5MrbJ6l4U/gDkPaRwdheTr+IFBL1dJCHM=
       "], 
      Association["Book" -> 6, "Theorem" -> 10] -> CompressedData["
1:eJzNTVsKwkAMTJpHs+7W0m2xIAVR6IU8Qi/gWb2RSUH888sPZyCZZAbmuj3u
GwLAM8ZfYl2HeZ6mac5QTotdFhFKNf26Jn93w9bY43guqgejN0oJbQ5yWnbR
xoeZfSrvoa54kMwPo7RDc3xUPeJ0SZlYVUcvYm6gQTeBBW6IosRkMiT3xdkq
qnLH+AEh9v2xVgHrLLorErwAEM8IPw==
       "], 
      Association["Book" -> 6, "Theorem" -> 11] -> CompressedData["
1:eJzNjcEKwjAQRHc7m+2mSdraHoJYKIJXf8ZP8Af8Vv/IbUG8efLgGxg2mYE5
3x+3OxPRc7P/5NrXeqyHNVNaFl0rEOIQf71SvqY5u6m3yjyfkmpneNP3220O
XJbNYrv9iIi7yl5KHdDB/GGIO5qADFWvuDxEgqjq7EMiDTXMbUsS6MIMaYRN
p+h5cLXKaqEIfwBzGfI4BorZtvGJQS/3UAgj
       "], 
      Association["Book" -> 6, "Theorem" -> 12] -> CompressedData["
1:eJzNjVEKwjAQRHc7m03SJG01H0G0iOCJPIIX8KzeyE1B/PPLD9/AsMkMzOX+
uN2ZiJ7d/pSptUPbrZnSetJzA1yc469Hytc0ZzO1Vqn1mFTHgDfT1O9gwBSy
Hb7/iIi5ylZKIzAi2CMgbmgCMlStYrIQCaKq1YZEBhqYvSdxdGWGDMJBa7Tc
mbyyBleEP4C5zHlZHMUc+vieQS++jwf1
       "], 
      Association["Book" -> 6, "Theorem" -> 13] -> CompressedData["
1:eJzNTjtuQzEMk21JFG28jx/SZO6VeoRcoGftjSojc7YO5UBKIgTy8/n99Swi
8rPov6L3iJueMhEP4LrO+2x/naHvrVJkjNSHiLsqNoBxHOc+BjuBSI5Ej50c
JGxd3D35YxEZdtAYvjP4gmOj59tB8L7MVJ/XfPVoUksNz0CJnLTM2oy1taZm
SpN2DQ0okIUWCswiN5ftNgxboPbyC7uGCKo=
       "], 
      Association["Book" -> 6, "Theorem" -> 14] -> CompressedData["
1:eJzNTTkSwkAM8yKfG0LJQDpghgfxhHyAt/IjFDIUNFQUyB75kI/TfL/NTUQe
C/0trpdzP5b0abTt1gwx+q9f5Fe1imR0633s7hV4IyJWDtCiItKWjqqSTV9D
lUAiWARyxUAR7oAzUsQANd7nIy5IU3aZyU6bp/LQxjRTo8ytFtW0fWCfOR1M
3BwKHRvkCRlTBv0=
       "], 
      Association["Book" -> 6, "Theorem" -> 15] -> CompressedData["
1:eJzNTkkOwkAMyxBnmdDhBELqpeJLPKEf4K38CLcSBy6cOGCNPEmcOLmtj/va
ROS50f9iWWouOc7DpslMc/ivN8RXtYpkfFY1yr2HvuG+xUFoaGZlhm0VAGTD
3lRdtWswCe07oihymAb8KWopjP5cxAFpYJWRnNA8QaODIRPRza03OAztA+fM
68V4oikUo6m8AOuyBtY=
       "], 
      Association["Book" -> 6, "Theorem" -> 16] -> CompressedData["
1:eJzNjjEOQkEIRNllYAG/25qvlbW38Qj/Ap7VGwlqY2Nl4SOZEJgwnLfbdWtE
dC/5Yy7jGOSrsxmzWODXAbuv24gUITKb013V84s3EdVbwlkWZq41AZAqeJrK
n1uU+ouFWVm1VGrMC0NVZwb1DurIC9SZAo3rCNI2BoaLijcMCNoHh7DTKpSW
Ct83pgfKqgbU
       "], 
      Association["Book" -> 6, "Theorem" -> 17] -> CompressedData["
1:eJzNjrsNgjEMhJ34kXMeNSAkJFZihH8BZmUj7HQ0VBRc8flkO+fcj+fjKET0
Svyz2hk0Ts4AM7cuv873r1MgoERmc/ow6xBW5RT2j5AFu6BbdlRkcy95dGIq
ybHVV/rWmI01LS9Wi/w4VKtQzYdB6lI4vVSNdWmupl5EI7p86DJxuxqZIM+u
IvQGjjUGsQ==
       "], 
      Association["Book" -> 6, "Theorem" -> 18] -> CompressedData["
1:eJzNjTtuAzEMRClpOPqsREOh4y5B3OU8OYIv4LP6RtZu7y6FH8AHfoDh9Xb/
uwUReex6a/qn/HQbY5TCj9/83/GvA2MI0X01JlJKrfY958XJnEkArfmyL+A0
czNy32xmy2MX0CtR4UY4TgdagII5iYmxH9e0ta3V9SilJDGmmiWqjBg0p5lI
qipWcWhAb+hcS00HgWhl67XK5esM7WBEeALe/AkU
       "], 
      Association["Book" -> 6, "Theorem" -> 19] -> CompressedData["
1:eJzNjc0NQjEMg/vjuGnySh8jIDERI7wFmJWNSOHAjRMHvkiWlVjO5bjfjpxS
eiz5b2bar8beAc4Tf93ev/+e70hr7jZUNwNEsGgtPCyAwX26O9eGZGjnK7Q8
YaGG7YWOaIAqoDFxxACbNo9HtdaUUYFwaUMGK9asuIiwSYarSPmQS5nkeWey
6qiKUZCf5fMHOg==
       "], 
      Association["Book" -> 6, "Theorem" -> 20] -> CompressedData["
1:eJzNjc0NwjAMRu04duzGEWkPSBUSpTN0E0boAszKRri9c+PAk/XkH+nzur+e
OwLA+9C/c2+tVDer82y/zv4emBDTtkXjAKqt1av72FWnSVVEhqGHeyBdl2Vb
llKOjdYaPiViRaVIrypd1pMYo9xVXLzwOamZtXhElAATMgMxXBLmTIqaJTMT
B8pIjxs3IRGmkziaaS0OI1fOQy6U8QM+IAni
       "], 
      Association["Book" -> 6, "Theorem" -> 21] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGLAA6npERQmPKQBmMYBUQdSMEAACfzQKF
       "], 
      Association["Book" -> 6, "Theorem" -> 22] -> CompressedData["
1:eJzNjUsOwjAMRPObdJyW0kBpVQJILLkOR+gFOCs3wumeHQsszdPII4/v6+u5
WmPMu+Lvp9s3u47k9ZJ+Xc2vibPWzbMaFdn3/WNZbkUkJREAImdlKeeCIjnP
OUfUzTCdlGMFkBpBgzIJCsZtQD3Fsqgw1hAMQz7kXh9574x1NgTjo8nV+GTp
GdRoWSSsPw4YCbLZ+hEYU8uOO5MxILQhergPO5oKIw==
       "], 
      Association["Book" -> 6, "Theorem" -> 23] -> CompressedData["
1:eJzNTssRAlEIg0eAx/u46tiALVnCNmCtdiS7Ox69eTDMZCBkAvf1+ViZiF4b
/T9mjAFYP9uvk+PrpjCXOZnZiVrrvQ33HvggIg4OZMWMaLopqprsupuq5deI
HAJtR52b4p6WrGwxoZ7JeVFEiCFAdnQCw8QFEqgVbuZhjOZqUkTKARa5drtd
jBYbkIpRwG+WxAcG
       "], 
      Association["Book" -> 6, "Theorem" -> 24] -> CompressedData["
1:eJzVjTEOAjEMBB3Hdhyfj3CHFIkGiZqWl/CE+wBv5Uc4R01HwxSrlb3avW7P
x5YA4DXkD1juN1Vpvf66+HshpoTuYRSgtd7PzezgHBCFqA7vATubudlcfL/r
rjRy8Yyv69Dpw6p8ijyzcWFyppW1Wu0xRISxmkWAGC4psWTNymsVKdEpzqkM
g4IYqUFGavNhWQpMx4lYyZDTG3gKCIs=
       "], 
      Association["Book" -> 6, "Theorem" -> 25] -> CompressedData["
1:eJzVjdENAjEMQ5OLnbY56G2AhPhjHEa4BZiVjUiL+OWLn3Ol18hO3ev+fOwq
Iq+BI4i83S+l2L97fxSqCpB3iKwrQAKwiFZa61vPOWmpzRoyo32dZCDRuyM6
0mrd8sFcsuFg+qeBbucsDXy+zGPLIgpxVcvRTDkCdxpddK0IVrLUKWOBzQSV
4XSWzd/inwdp
       "], 
      Association["Book" -> 6, "Theorem" -> 26] -> CompressedData["
1:eJzVjbsRwkAQQ+8neW+9cHh8w4xDaqATSnAD1EpH7JmYjIQXKFhppdv+fOwx
hPAa8hfc64TLtf66Vr46KcZk9om01vvWVM8GpxSSIuZqDg2qpnqa7LjLoSAB
M9B9GTof2FrZqQoqJw8QK6Vq7T5USvLVTIaCsMUIZsmCpXJ0eQfi+EFiSp4a
5FSWdm7LFOY2A1I0l/gGOscIZA==
       "], 
      Association["Book" -> 6, "Theorem" -> 27] -> CompressedData["
1:eJzVjcERQjEIRIGwEPK/yegY77ZkCTZgrXYk0bM3L77DMrMsy/X+uN2ZiJ5L
/oNaLY7x89avG2GWutaFqPc5Z+y7hyeqAGqN1EiwzOo+3k5rDWtkRjUvsCOa
I/Jg0eeGCz6+uqac0cxt5iNVya/FjBR0YoYVLRVbmFlWWoCtMCAmkqlFER2H
PkaQb64oaqL8Ar0pB5k=
       "], 
      Association["Book" -> 6, "Theorem" -> 28] -> CompressedData["
1:eJzVTsEJw0AMsyPZd7lQCDS5exc6ROfoCFmgs3aj+lJofn31UxmEkI3ky/a4
byoiz05/gtvVHb8O/RI4DEKqahEZR9IYQM7JU2qtAngzGjLAEIaPX1hbXVdj
bgwrN+ynHeNqC3d/ypV1wRmlFO6VGoPoVYqrIiSg1hfwyHfRKWGyA2rWe+li
tDletNNsLx4lB/o=
       "], 
      Association["Book" -> 6, "Theorem" -> 29] -> CompressedData["
1:eJzVTcERAkEIg0vAY0/H5WEB9mAllnANWKsdyd7N6M+XHwMTmMAk1/VxX1VE
noP+BTd3/Nrzi6GqkDWbSARpLCDi4BGZHcDOSATAWgxvvbGod2MkS4rE9jrQ
0jo3fRmUOKO1xj2yCtMkSnFV1AqojQNnwmbRZcbRPkB15dKFbienWbvwBaQn
Bz0=
       "], 
      Association["Book" -> 6, "Theorem" -> 30] -> CompressedData["
1:eJzVTkkOAkEIhIGCHnpMekm8+yWfMB/wrf5IZmLiyZsXC8JWbLf9cd+ZiJ6H
+Rus0F+vtO8UMwHpa2p1NwNUI4qv65zDXedsmugaqkmJviu9t3qErUFiysip
rmdrNsk2MYBIux1kl5yP8PMgU4oIkRIyWWhhWSzrrAWCQlyLFP2AAcm7ZuSK
S35occULoOAHag==
       "], 
      Association["Book" -> 6, "Theorem" -> 31] -> CompressedData["
1:eJzVj8sNAkEMQ/NzPCOxRdASJWwD1EpHOMuZGxd8iEa286K5n8/H6Wb2mvE/
WvlrIr5HEUa6uyoAiequnFeNgNRMSWaVTKVyCFzpVYpAxSwpwkeCqNqiTSlm
jd1NHcx0C31ymbe1uzYPC24XRl1uGHg0kzrKS2IL2FR/3RIbjPI3vUoFtQ==

       "], 
      Association["Book" -> 6, "Theorem" -> 32] -> CompressedData["
1:eJzVjc0NwjAMRp34s52kaRMFUZUjKzECCzArG+H0zo0LT/KTf2T7/nw9noGI
3lN/RP71Qfs6iSHEffekEYnU2o7er8MsJTMAKQ33cDC0tb01s9kp6+quU8CS
DRljVQxcTtR3Db17YJtDr0ouufoj5kghMkAsNGJQxcaJDSIsziaBa5GirCp8
AkXOZbGFbvWAbLCI8AHuyAhD
       "], 
      Association["Book" -> 6, "Theorem" -> 33] -> CompressedData["
1:eJzljdsNw1AIQ3FsE27UJbpSR8gCmbUbFfLTvy5QI/HyQTzP63UiIt6T/ln1
2x1bEftetcrOJCWO1uqe2WJHVqY8G9ud5Rs6VoPMHvJGB5p+GFNjdrWk+5EQ
ENKxMY4NfjAwN5I1UKNEEl9tQK2jSjGUQIPxAbjoBWY=
       "], 
      Association["Book" -> 7, "Theorem" -> 1] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGFGBkRHcySIARLAxmMEC5EADVgiIG5oMxAyMDI6aB
GKYPMQAAxXUCrg==
       "], 
      Association["Book" -> 7, "Theorem" -> 2] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGFmBiwiLAxAQmwRwoFwIgCphQxMB8MGYEQiZUAxmh
GMJhhFjHCBYlAzCAEV55KIMywIggGBkBJVEDRw==
       "], 
      Association["Book" -> 7, "Theorem" -> 3] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGGGBC5zOBMJhkYGBkZAB6iZGJiREMwGpBDAiJAAxg
zMDECNGJGzAOvRACANOEAr0=
       "], 
      Association["Book" -> 7, "Theorem" -> 4] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGOGBmBmEwycDAyMgA9BIjMzMjGDCDFIAYEBIBGMCY
gZkJohM3YBx6IQQA4HUCyw==
       "], 
      Association["Book" -> 7, "Theorem" -> 5] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGHGBEB2ARoDhIClMWG2BAplHMBpuBbBedPUc5AADe
BQLH
       "], 
      Association["Book" -> 7, "Theorem" -> 6] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGHmAEOntoupweAACYNwJ9
       "], 
      Association["Book" -> 7, "Theorem" -> 7] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGIGACo1GAFQAAmVACfg==
       "], 
      Association["Book" -> 7, "Theorem" -> 8] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGJhi6Lqc1AACV/AJ7
       "], 
      Association["Book" -> 7, "Theorem" -> 9] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGKAA6nRGr87GLYjVgWAIAmv0CgA==
       "], 
      Association["Book" -> 7, "Theorem" -> 10] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGKmBhAWJGIGBhAJEMDCDECCGRAAucQAIMUAzhIAGI
GTAOI4QDJUgHDEi24JDHdAIZgBFBMDICABMDAzE=
       "], 
      Association["Book" -> 7, "Theorem" -> 11] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGASpgYhpoF1AGAJfgAn4=
       "], 
      Association["Book" -> 7, "Theorem" -> 12] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGLmBmJl4UizpqOmUQAQCj6AKJ
       "], 
      Association["Book" -> 7, "Theorem" -> 13] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGLGAFIiYgYGUAkQwMIMQEIZEAK5xAAgxQDOEgAYgZ
MA4ThAMlSAcMSLbgkMd0AhmACUEwMQEAeRAD0Q==
       "], 
      Association["Book" -> 7, "Theorem" -> 14] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGMmBjZmJkZGNgBAIGBhBiBJPMjFDAzMzMAaJZmRhR
AAMUg0lkAyFmwDhQWShBOmCA2YJbHsqgDDAiCEZGAB46Az0=
       "], 
      Association["Book" -> 7, "Theorem" -> 15] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGMGBhYMT0AQsLgs0Ik8fiURZMoWEBAKp7ApA=
       "], 
      Association["Book" -> 7, "Theorem" -> 16] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGNGBiYgD6gREIGCAMBkYWFjAfDJhA8iBJJkYUABJk
YYCqw2k4WB2EZmBgJANATMArT8AJxAFGBMHICAAFjgMk
       "], 
      Association["Book" -> 7, "Theorem" -> 17] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGNGBlZmTkYGAEAgYGEGIEk0yMUMDKysoGotmYGVEA
AxSDSWTzIGbAOFBZKEE6YIDZglseyqAMMCIIRkYAIcIDQQ==
       "], 
      Association["Book" -> 7, "Theorem" -> 18] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGNuBgZGJlYAICBgYQYgKTLExQwMnJyQyimTmYUAAD
FDMCIROycUAOIzxQgCywLFiEkQzAAEZ45aEMygAjgmBkBABXpgN8
       "], 
      Association["Book" -> 7, "Theorem" -> 19] -> CompressedData["
1:eJzVUIENgCAMa7c4lC98yRN8wFv9yG2IRr3AEjpWuiYwr9uyEsAe9HOQRHUA
E1BHFJEiZqWYQ1UZlTparA74dtM0kDF5wzxwuKOh6lUkycGOdk5JXkpykyIh
G7nG4nxyXGaX+a9XPTofSO3rCyObm+0zeAAetARy
       "], 
      Association["Book" -> 7, "Theorem" -> 20] -> CompressedData["
1:eJzVjMENgEAIBBduQ64NW7KEa8Ba7UgWNfczMb6cwAAhsIxtHQZgl/5OByJJ
AyRoxggWob2a0xONxhbsdTlpmX4P7npYVXoPKh73V/MNn3I/AMzbBC8=
       "], 
      Association["Book" -> 7, "Theorem" -> 21] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGAWAHAiDJwMDBysDKyMjKyc4KBuwgSRADQiIAiMvI
ysLOysXJzo5sEgsQwwOFmZkBpBVEgwjSAQMY4ZWHMigDzAiCmRkAJvUEww==

       "], 
      Association["Book" -> 7, "Theorem" -> 22] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGAWAEAiAJYoBJJkYoYGZmZgPRbHARCGCAYjCJYhID
kgBMFkqQDhhgtuCWhzIo9D+CYGQEAAf3Ayg=
       "], 
      Association["Book" -> 7, "Theorem" -> 23] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGAWAEAjQBhCCUZgRBGGCAYCwmQfEQBgCyEQKa
       "], 
      Association["Book" -> 7, "Theorem" -> 24] -> CompressedData["
1:eJzdUdsJgDAMvMZaGywW6l//XMkRuoCzupFNteADBH+9hEtyIYGQKS1zUgBW
oT8gxggEwDv0RNx5tpaZPRGpHJmagcUqIBrb4LQex/MiC1BbC61hzB6FvgPF
X/tHcsf1MfmKotHzciVe5oVU3rUBYSQGUA==
       "], 
      Association["Book" -> 7, "Theorem" -> 25] -> CompressedData["
1:eJzdUcENwjAQc+6sJgUKUis+/bESI3QBZmUj4itBpWyAEzm2o1xyym153JcE
4Cn6C8xXYAQuJxzN+jyUwGBmScL8XDQaoKzk8UBO07ZOBozNuKPr6koGVXjD
qiPiLgleI1UIw88x6TdrMxy5bwffH1O7iMx+G0+aHq+Vq3e8AAkOBdU=
       "], 
      Association["Book" -> 7, "Theorem" -> 26] -> CompressedData["
1:eJzdUYkNwyAQM3dWoE9SKVEH6EodIQt01mxUfClRmm5Qg4xtxMGJx/x6zgnA
IvoP3IERuF1xMTvlvgR6M0sS5kPRaICyksczOU37MhkwNuOOrqsrGVThDauO
iIckeI1UIQy3Y9If1mY48tgNvj+mdhGZ/fadND1eK1fveAPt4QW7
       "], 
      Association["Book" -> 7, "Theorem" -> 27] -> CompressedData["
1:eJzdUYkNwyAMPBwrJsJqJTJBVuoILNBZu1FtU6o8G+RAZ99ZGCy29n61BODj
dBtU4KkoRItUzVlVKxEli0rTQ30NwD0ttTCv675HASgPIYJ5tsgcZJCBnofF
Jye4W94hBP+Pef5jL4aK/kccP8amCI+uQyffEq91ZXd8AasvBwk=
       "], 
      Association["Book" -> 7, "Theorem" -> 28] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 7, "Theorem" -> 29] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 7, "Theorem" -> 30] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGE+DjZuBiYuJk52YHA24mJiZGEIOJmZcdBGGAASTG
zibJycIiKopsABsDAxMzjMPMzMDKCqFBBOmAAYzwykMZ6AA1YoC+AIsxYfqY
EYTA+kEEI9AsAJaGBU8=
       "], 
      Association["Book" -> 7, "Theorem" -> 31] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGESDPO8MsEACWtgJ8
       "], 
      Association["Book" -> 7, "Theorem" -> 32] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGPBhmgQAAlbECew==
       "], 
      Association["Book" -> 7, "Theorem" -> 33] -> CompressedData["
1:eJzdUdEVwyAIPCCBJ6Y/2SArdQQX6KzdqEJqXpNs0FNP7hTF59Zez0YA3kF/
hQcW5jrNXqu7zyJCfXaS1aMNoA/2opOI2W9+AViHUEXpBnKLBXRgj9Oyi5O8
W3FCCjvSIv5yLKY6l5A4fwwzpcf3B1P0rDiI+h0fU7EGwQ==
       "], 
      Association["Book" -> 7, "Theorem" -> 34] -> CompressedData["
1:eJzdUdsNwjAM9KMOQU6ohJCqfrISI3QBZmUj7CtFrdiAS3L2XeI8lPvyfCxM
RK+kP8MsMg3Wbld3N1XliM568WytOUAx5FxPg2qt++oQYpsohcYxYu+ggG1Y
c1iZlJ0DXheRGUH0b1nmH85JKOx/xPFjRBie/D6Xs+PGSRxnvAHc8Qem
       "], 
      Association["Book" -> 7, "Theorem" -> 35] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 7, "Theorem" -> 36] -> CompressedData["
1:eJzdUdsNwjAM9KMOQU6oVIkf/liJEboAs7IR9pVWrdigl+Tsu8R5KM/5/ZqZ
iD5JZ4PIY7B2n9zdVJUjOuvNs7XmAMWQa70MqrXui0OIraIUGseIvYMCtmLJ
YWVSdg54WURmBNG3ssx/nJNQ2P+I48eIMDz5fy1nx42TOM74AsfMB5E=
       "], 
      Association["Book" -> 7, "Theorem" -> 37] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGHWBBMBkZoT7E4lEWTKFhAQCgAgKG
       "], 
      Association["Book" -> 7, "Theorem" -> 38] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGM2BkhPoQi0dZ6OsUugEAnAYCgg==
       "], 
      Association["Book" -> 7, "Theorem" -> 39] -> CompressedData["
1:eJzdUYkNwyAM9BMTKluRKnWBrJQRskBn7Ua1L48SdYMecPYdmEfM63tZmYg+
RX+IweL1dHdTVc7orOFo4QDlkEcfB9Xer5UpxA7RGk1TxghQwg5sOaxK2sUB
b4vIjCDiLKt855qEwv533D9GhOHJ71O5Om5cxHnGF434B0I=
       "], 
      Association["Book" -> 8, "Theorem" -> 1] -> CompressedData["
1:eJzdkNENgDAIRA85GtZwJUfoAs7qRgJNmib6558v6bXAHR/d+3l0AXCl/BFv
lriIbPkQqdra6BvibEZvNPc1SEDmp5Dhi1u1JCD1CYuXgYKaOS6m4eMQTOM3
mFJLYukN8sIEnQ==
       "], 
      Association["Book" -> 8, "Theorem" -> 2] -> CompressedData["
1:eJzdkd0NwyAMhI1jOYEDoSp97UNWyghZoLN2o9qmrRp1g36C43z8CMR23Pcj
EdHD5S+5lNJ7uzHz1A2elrrUua5rDcg6o+SrSM7f+woRz++iNVK1USTEAHRg
fqiohWjAJwkdiwhKVkBFEQz/Up+MKs4/c/4Y5hQZ/z40eYM7v2lS6BN4owb0

       "], Association["Book" -> 8, "Theorem" -> 3] -> CompressedData["
1:eJzdUVsOgzAMc5K1Ql1BVX9giJ9daUfgApyVGy0JA4G4wazWtd2mD/U9L5+Z
AKxG/4ncTdMrigiNCpJcc33Wcej7qoB27trwEAnhXNYCnHZTCppGxxidFGnH
pj0yUU6J87YIKcFNPMpM/9gm3fn+V1w/hpk84/s7yZrf2Ij0jC9c/AhX
       "], 
      Association["Book" -> 8, "Theorem" -> 4] -> CompressedData["
1:eJzdkesNgzAMhP1oZKOAfyH43ZU6AgswazeqfaAK1A36KbncOS9FeW77a2Mi
epf8KWvE2FSVI2Gd+tTHHtEBZZfB7aHqft2VQdo3OC1LjvMMScxaswLeUCrj
lwr0WERmhHBkO/2pNYmE8+/cP0aEUZPfZ3I1K1fCeccHW9gI9A==
       "], 
      Association["Book" -> 8, "Theorem" -> 5] -> CompressedData["
1:eJzdkd0NgCAMhPuHBFPQFVzJEVjAWd3I0geiQRfwy+USjlxTwlaPvSIAnM3+
Sik6MTMWAzkbRddFNasq5Kw0JxFmkXspAVDoh9QEEKObE0J8Ib2FVnN90y+H
5Z8fQ4Se0fhKbPJ+M7RZF0ikBto=
       "], 
      Association["Book" -> 8, "Theorem" -> 6] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGLVCVYWVmZmZUAAJGZm4hbiEuIVERcXEhIGAAYiY+
XpA8KyuyHl4GBiYuGIeHh4GDA0izsYEJIOCCAQgbLARi8CCJgEmIIgYuLgYw
hw2uDcSGkiBJMA9sPipAjRgmJkawGBOmJxlBCOxiEMEItAMANhYIKw==
       "], 
      Association["Book" -> 8, "Theorem" -> 7] -> CompressedData["
1:eJzdUckNgDAMc1K1QqHwqEB8+LASI7AAs7IRSTgEYgOs1nWcpocyLeu8EIDN
6L8YYwiBJgWFutRFSt8NQ1FAJ7eN5WN8ljQAyxXkjKrSNSUnhVw4tFsm8sNx
PjZBBB6ku8z0yZb0yM9/490YZnKPv38kG/5iI9I7dhYdCAo=
       "], 
      Association["Book" -> 8, "Theorem" -> 8] -> CompressedData["
1:eJzdkdENwyAMRG0jULg4leo/PrtSR8gCnbUb1TZt1agb5AkOn20QiNv+uO9M
RM+QE1NLKTwcLqutBrPrGOaQT7lsUa/1d8dGJPiY3mlZfG0txQF04vHUpp5E
B76Z1NlEUHIDbYpkxm+NYro8/8jxY0Q4c/L/RI6RN9ZwCn0B9uYIDQ==
       "], 
      Association["Book" -> 8, "Theorem" -> 9] -> CompressedData["
1:eJzdkYEJwzAMBOWPlOYrCKQbZKWOkAU6azeqJLchIRv0sB+9/DYIr9vruTUR
eaf8MwCGOcAw+eQ3XxYvJDbu5EOVPF4Ig/FnIjemUS3RDFsn6q5q0aSTe6e0
h4QmYWhqLHr91TwsV++fOX8M0KqH64QtV41g6Yz2AdMKBho=
       "], 
      Association["Book" -> 8, "Theorem" -> 10] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGNeDmZmYCAmYOJhTAAMWMQMiErB7IYYQHCpAFlgWL
MJIBGMAIrzyUQRlgRBCMjAAb3gNB
       "], 
      Association["Book" -> 8, "Theorem" -> 11] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGN+BmZgICZg4mFMAAxYxAyISsHMhhhAcKkAWWBYsw
kgEYwAivPJRBGWBEEIyMABFtAzY=
       "], 
      Association["Book" -> 8, "Theorem" -> 12] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGOWBmAgJmDiYUwADFjEDIhKwayGGEBwqQBZYFizCS
ARjACK88lEEZYEQQjIwABwcDKw==
       "], 
      Association["Book" -> 8, "Theorem" -> 13] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGO2AEAlYmRhTAAMVgEkUxA5IATBZKkA4YYLbglocy
KPQigmBkBADl0gMF
       "], 
      Association["Book" -> 8, "Theorem" -> 14] -> CompressedData["
1:eJzdUdsNgCAMvBZCTAl+gPGfP+dxBBdwVjeS1kc0buCFHHfX8gp1WeeFAGxK
v0edKrlYYpE8DOOYG1By5j5555z3z94EsNwmoevaHIJRg1w4tEUq0iMxPpog
AjPhXqb6ZC2as/3feH8MM1nG38eRDruxErUzdtP4B8U=
       "], 
      Association["Book" -> 8, "Theorem" -> 15] -> CompressedData["
1:eJzdUdsNgCAMvBZCTEE/iPHfT9dxBBdwVjeyrY9o3MALHHdHeYVxWeeFAGxG
/8c4Ucg1V6l9PwxVAe3ctTGEEOOztAVYLlMKmkbHlJwUcuHQHpkoj8T5KIII
3KR7memTbdKd7//G+2OYyTP+vo2s+Y2NSM/YAbFtB58=
       "], 
      Association["Book" -> 8, "Theorem" -> 16] -> CompressedData["
1:eJzlUdsNgCAMvBZCTAl+gPHflRyBBZzVjaT1EYwjeCHH3bW8wlK3tRKAXekH
WMjFEovkaZrn3ICSM4/JO+e87zsTwPKYhGFocwhGDXLj1BapSF1ifDZBBGbC
s0z1xVo0Z/u/8f4YZrKMv08jHXZjJWpnHI/5B3w=
       "], 
      Association["Book" -> 8, "Theorem" -> 17] -> CompressedData["
1:eJzlT0EOgCAMK4MQM9EDId79kk/gA77VH7kNNWr8gQ2UtgwYc12X6gBsSn+A
833uM+dSpikLIJPGIXjvQ7gXDgDxaVJC18kao5GATzRtkYp0S4xbEZhhJl7H
VB+sm+bs/lfDD0fkLKOPn+mwjpWcvLEDbMoHVQ==
       "], 
      Association["Book" -> 8, "Theorem" -> 18] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGBGBiRAEMEAwUZWBiQlYHDBBGeKCA1TAwwAjSAcQE
vPJQBmWAEUEwMgIA3/IC/w==
       "], 
      Association["Book" -> 8, "Theorem" -> 19] -> CompressedData["
1:eJzlUNsRwCAMCmSSrtQRXKCzdqNCrPbx2d9yJwJBz3Np29oQEbvpH0gmBVNm
hjYvQnyvyWB+ipSq0Sus+kDXFfGVFPfIN5ThPGZ9soflnk/4AFwEHAvJA0M=

       "], 
      Association["Book" -> 8, "Theorem" -> 20] -> CompressedData["
1:eJzlUdsRgCAMC1WQdgA//HMlR2ABZ3UjaXwcnCOYg1ySltexln0rAcDh9BPo
orOaKYE6RXMah2Ga2q4MSHxMSjDzMJMq4oNLM3KRmoR8NSFG0OR3meubvUjH
/Xv0HyMSmMn3WcEHb+wU6hknl4EF8Q==
       "], 
      Association["Book" -> 8, "Theorem" -> 21] -> CompressedData["
1:eJzlkVEOAiEMREuBbBjT7GoAf90fD+QR9gKe1RvZFjVuPIIvZJgZGhLCut1v
WyCih8m/gOvhclx6P9daaW0n7i3FGFP6HmpEvLxDKTRNuufsogDzLIb6oVm0
RAE+jesYIghpgGSBM/xL7dCT379n/zHMwTv+fVWwBXNiSSBPsdYHyw==
       "], 
      Association["Book" -> 8, "Theorem" -> 22] -> CompressedData["
1:eJzlkdENgzAMRC8GH44X6HdXYgQWYNZuVNsBVNQReEpOvrMVJcp729etAfik
PIb+6u7eE8SWbpynaVl+ZwwQPQ0J9wytJFDlIOqhxgiVqldSOoagRBilxUAy
6kOzWa7Ov3P/GJFWmfw/quWqGzMdlV+KKgX1
       "], 
      Association["Book" -> 8, "Theorem" -> 23] -> CompressedData["
1:eJzlkVEKAjEMRNO0ZelIWBdp9Hc9kkfYC3hWb2SSqrh4BB9lOjMNhdJ1u9+2
REQPl//hsC5H1Uvvna564rOWnHMp3yNKxMs7tEbTZHutIQYwz+KYH1rFSjTg
04SOIYKQBUgVBMO/1A8jxf179h/DnKLj3zclX3AnngTyBJCYB6c=
       "], 
      Association["Book" -> 8, "Theorem" -> 24] -> CompressedData["
1:eJzlUcERgDAIo1TCUadwJUfoAs7qRgJV73qOYK4NJORBr1s/9l6I6Az6EWxt
zZqZkVduhqVW1SlBxPIIH3k0a5BDBAPeD1a4KX5fJ3mESEAuBApJjP7mGKaa
V0jMH8Nc0uPvk0qc3BihILgAYJoFtg==
       "], 
      Association["Book" -> 8, "Theorem" -> 25] -> CompressedData["
1:eJzlkdENwyAMRI0BBd2HEyUV/DYrZYQs0Fm7UWyTVI06Qp/QcXdYSIh1f217
IKK3yT8xTa09am30XBZuNcUYU/oeqEQ8X6EUGgbdc3ZRgHEUQ33XLFqiAJ/G
tQ8RhDRAssDp/lQ79OT337l/DHPwjn9fFGzBnFgSyAFz6geY
       "], 
      Association["Book" -> 8, "Theorem" -> 26] -> CompressedData["
1:eJzlUcENhDAMc4Mi8gDJ5dUHn1uJEViAWdmIJgUEYoSzKsd28kjU37otawKw
O/0V5mnKJJEzZRxUu0712R8AscuUgr6vNUbUQVpD1Y3VashC3klwGwIN1dDU
GGj6ZG+Ge68QeH+MSIpMvgclf3TlayejHRTRBwU=
       "], 
      Association["Book" -> 8, "Theorem" -> 27] -> CompressedData["
1:eJzlkYEJwzAMBGXFIjIJvB3wAF2pI2SBztqNKslJacgIPcz7/2UMxo/99dwT
Eb1d/ottq601qrXyumSZppx/xwsRlzP0TvNsu0iIAZSijvmholaiA98mdBwi
KFmAiiIY/lAfRor7r1w/hjlFx/f3JF9wp54U+gHqsQbJ
       "], 
      Association["Book" -> 9, "Theorem" -> 1] -> CompressedData["
1:eJzlkdEJwzAMRM8yIhiOCBzIf6ATZYQs0Fm7USW5LQkZoQ9z1p30IePteO5H
AfAK+TMe3cHau9istVbVc3cGhF/TGqbJ7xzRgFwWC7wequYhG/lLUscQaHBD
U2My6o9GM911heT6MSIlM7k/p8TJjS2c0d5JIweO
       "], 
      Association["Book" -> 9, "Theorem" -> 2] -> CompressedData["
1:eJzlUUEKwCAMixURoWziYfd9aU/YB/bW/Wg2OlD2hAVJk7Rgxf28jtMBuI3+
hlKBrRRZl+C9D2FsLoDoa1JCjLVyJBhUc1YDtTIykYaE3IagCprmtevO1qSb
VyDmjxFxzOT7GmeHGxu5escD94oG+w==
       "], 
      Association["Book" -> 9, "Theorem" -> 3] -> CompressedData["
1:eJzlkdENwyAMRA8DyiE3H8lHfvKVjtQRskBn7Ua1TVM1ygh9QsfdGSEhtv35
2BOAl8vfsa4blvsi81RyzqX8ziZAbkdoDcNge60hBjmO6pjvWtVKNvLbhPZD
oMICtSqD7j/qw0hx/5nzx4ik6OT6mOSL7tSTUt/lNAar
       "], 
      Association["Book" -> 9, "Theorem" -> 4] -> CompressedData["
1:eJzlUdsJgDAMvKYt5iOCgv74I365jyN0AWd1I5P4QHEEj3C9u4SS0qmsSwkA
NqP/YR7RDT21TYoxpvRsNQDJZZhRVXrm7KRgrmsxqD44i4aGO3E+hsACNSxZ
riHTJ1vTnd//xvtjiIJn9H1LsGJTtnYQlh3X9gas
       "], 
      Association["Book" -> 9, "Theorem" -> 5] -> CompressedData["
1:eJzlUUEKgDAMy7oNe6gwQS9exC/5BD/gW/2Rbaei+ARDyZK0jI7N67asAcBu
9ENM6MeBupJijCk9OwUguQwzmkbPnJ0UzG0rBtWVs2houBPnOgQWqGHJcg2Z
Ptma7vz+N94fQxQ8o+9TghWbsrWDsBy2JAaF
       "], 
      Association["Book" -> 9, "Theorem" -> 6] -> CompressedData["
1:eJzlUcENgCAMLAVCHyXBxI8PH67kCCzgrG5kW8RIHMFLc9xdG1LCVo+9OgA4
lX6JeV1wKsF7H8I7LwDI3RBBSnLGaCQgypkVohtHllDxJMZtCIhBDHHkPqT6
Zm2as/tHjB+D6CzD70OcFqnStR0TX6FTBm0=
       "], 
      Association["Book" -> 9, "Theorem" -> 7] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 9, "Theorem" -> 8] -> CompressedData["
1:eJzlkdEJgDAMRNO0RXoQ/FE/9MuVHMEFnNWNTFIVxRF8lOvdpRRK53Vb1kBE
u8k/GTqexhRjTOlZj0TcX6EUahrdc3ZRgLYVQ33VLFqiAHfjWg8RhDRAssCp
/lQbevL737w/hjl4x993BFswJ5YEcgDJ6QbG
       "], 
      Association["Book" -> 9, "Theorem" -> 9] -> CompressedData["
1:eJzlkdENgCAMREuBeKQxwT9/XckRXMBZ3ci2qJE4ghfyuDsIgbBs+7oFIjoM
P9XMU00xxpTebSViuUMpNAw65+xQAeMoJvWNWbREAZ7G2TYRhDRAssDV/EVb
9OTn9+o/hjl4x99nBBswZ9cOAjkBaIUGGg==
       "], 
      Association["Book" -> 9, "Theorem" -> 10] -> CompressedData["
1:eJzlkYsNgCAMRMun4UIanMGVHMEFnNWNbIsYiSP4Qo67oyEhrPux7YGITpO/
EpeWU0o5v8umfR0BoFJ0Z3ZRgFrFUN+VRUvjaVz7EEFIA4RlDJm/1Q49+f0z
88fEGLyL31cEWzAnlgRyAVgtBhU=
       "], 
      Association["Book" -> 9, "Theorem" -> 11] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 9, "Theorem" -> 12] -> CompressedData["
1:eJzlkesNgCAMhMtDjvvtArqSI7iAs7qRfagJcQQv5KN3bQiEdT+2PYnIafiv
ljqVUusQieT5MaT0rntrDhXAkNbBRg1B4E2cMSSgqAGbDpiivmlNd37+qPFj
ck6e5e8bki34bc0RvABNygXg
       "], 
      Association["Book" -> 9, "Theorem" -> 13] -> CompressedData["
1:eJzlkd0NgCAMhEuhudwMPrmAwziCCzirG9kWNRJH8As5ekdD+Jm3fd2KiBwh
P6a1Wlt7P8IiotNtSAF8NktxAHa87mr0EASeJLU3CShuQPOGoNeXxmK63H9k
/BjVkpl+r1BiIE8bjuAJLwAFrQ==
       "], 
      Association["Book" -> 9, "Theorem" -> 14] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGMuBkYREVRRZgY2BgYoZxmJkZWFkhNIggHTCAEV55
KAMdoEYMExMjWIwJ0weMIATWDyIYgWYBAGEgA/8=
       "], 
      Association["Book" -> 9, "Theorem" -> 15] -> CompressedData["
1:eJzlj7ENAzEMAyWKpv/xRZoAqbPSj/ALZNZsFMpdikyQs0FLoi3Bz+t1XhkR
75a/ppL8aUqRiT7nHAVwL+vicRdQknZKx9F76wqbGnRYTgrEJnFjLVRdsYkl
ROdmelpmRQJjBCrcKxSOCJFwo17rbX7Bm6f4F+2lbzM/o8cFAA==
       "], 
      Association["Book" -> 9, "Theorem" -> 16] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGNuDm5kbmsgAxPFCYmRlYWSE0iCAdMIARXnkogzLA
jCCYmQFFogPT
       "], 
      Association["Book" -> 9, "Theorem" -> 17] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGOODiQuaxADE8UJiZGVhZITSIIB0wgBFeeSiDMsCM
IJiZATr7A8Y=
       "], 
      Association["Book" -> 9, "Theorem" -> 18] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGOpBE5rAxMDAxwzjMzAysrBAaRJAOGMAIrzyUgQ5Q
I4aJiREsxoTpekYQAusHEYxAswBFMwPd
       "], 
      Association["Book" -> 9, "Theorem" -> 19] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGARJgY2BgYoZxmJkZWFkhNIggHTCAEV55KAMdoEYM
ExMjWIwJ08GMIATWDyIYgWYBADDjA8Q=
       "], 
      Association["Book" -> 9, "Theorem" -> 20] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 9, "Theorem" -> 21] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAQMjI4RkHA7hAQCZkQKA
       "], 
      Association["Book" -> 9, "Theorem" -> 22] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGARQMk8AAAJWpAns=
       "], 
      Association["Book" -> 9, "Theorem" -> 23] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAQwwMzMPtBOoAACbzwKD
       "], 
      Association["Book" -> 9, "Theorem" -> 24] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAQgwMg6XsAAAlz0CfQ==
       "], 
      Association["Book" -> 9, "Theorem" -> 25] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGARgwDpewAACWcwJ8
       "], 
      Association["Book" -> 9, "Theorem" -> 26] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAQQMl7AAAJWqAns=
       "], 
      Association["Book" -> 9, "Theorem" -> 27] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 9, "Theorem" -> 28] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 9, "Theorem" -> 29] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGARwwMw+0C6gAAJl9AoA=
       "], 
      Association["Book" -> 9, "Theorem" -> 30] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAQKwDLQDqAAAl/ICfg==
       "], 
      Association["Book" -> 9, "Theorem" -> 31] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 9, "Theorem" -> 32] -> CompressedData["
1:eJztkd0JgDAMhNNAOG4Gn7qSI3QBZ3Uj86NCcAU/yvUuCSXQuY59DRE5Q346
uj2OFMBvsxQHYOG+1OhFEHgrqTUkoHgAzQeC8rdGM1O+3+kfozqypt9tRxzk
tpEIXvkBBWg=
       "], 
      Association["Book" -> 9, "Theorem" -> 33] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 9, "Theorem" -> 34] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 9, "Theorem" -> 35] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGARpgHWgHUAYAmJ0Cfw==
       "], 
      Association["Book" -> 9, "Theorem" -> 36] -> CompressedData["
1:eJztUckJwDAM86EM0pU6QhborN2okepPCNkgwghbGMngqz93dzN7SQc7ACyz
TNEA0BoI9ZCUmBXxL8mBQ+1UXyz7lMkSPT8mwqXFeqOzoGs5jYwPQ4QD9Q==

       "], 
      Association["Book" -> 10, "Theorem" -> 1] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 10, "Theorem" -> 2] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGASZgHMIBAwCWWQJ8
       "], 
      Association["Book" -> 10, "Theorem" -> 3] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGARbANNAOIB8AAJZYAnw=
       "], 
      Association["Book" -> 10, "Theorem" -> 4] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 10, "Theorem" -> 5] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAVbAzMzNCAYMDIzkAgYwwisPY2EFLEDMxsAANgbE
YWRiYmBiApIwXUA+CAAlWVggKhgZAdrxAzA=
       "], 
      Association["Book" -> 10, "Theorem" -> 6] -> CompressedData["
1:eJztkdEJAzEMQy3bVTjCfXWCrtQRboHO2o0q5ejP0RH6IALJwgnkcbyeByLi
bfnzm3uS+5wdMc0wZHcVAHmpI4xe1OYkh7VuMDOIUIkYOEuZifwmHsopUvPC
+THUaTtgW7uqoiqrmIvQUzxlUCtV960fqYEEyw==
       "], 
      Association["Book" -> 10, "Theorem" -> 7] -> CompressedData["
1:eJztkdENwyAMRO847AYzRVfqCFkgs3aj2CRfkbpBn8QJni2DxHs/PjsBfCv+
/KC5zTkFRIwxvDDrXSIZ4Zml6H3RtjKpMmS1ZcCINEbn1bT0bVTFm/G8+/oY
z9XrRG7wbJcgNcnaAvmUqhr8lWVf805/MgR/
       "], 
      Association["Book" -> 10, "Theorem" -> 8] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAS7ACAYwmgzAAEZ45WEsrIAFiNnADmFgBHEYmZgY
mJiAJEwXkA92KAMLCwtEBSMjAM7BAx8=
       "], 
      Association["Book" -> 10, "Theorem" -> 9] -> CompressedData["
1:eJztzdEJAjEMBuD0mjSJnq1JFeHgPERwhQPHcIB7cAFndSOrPruB38MfCPnJ
6f643QMAPN/x99PSIW4ARHLuh1L2dZoul93O3UWqu9XG6jjPyzwLvzft0sz6
vngbzJOz13y0atcPlsHVS5m8+GY0bgUXVc3tWYwdhC4QQSTIXUCMGgQTEkUi
JKUQq6EmTEnwK0VRWfEZnPstKnI+rF8zFQ0Q
       "], 
      Association["Book" -> 10, "Theorem" -> 10] -> CompressedData["
1:eJztzdEJAjEMBuC0TdpEewlnEUFFq+IQBafw2adbwFndyJ4+u4EfJIGQn5ym
531yAPCa299vHlEBmFXz3mxdar2ex9HMRIqZlk7LtrVHa5zmTb/sch6sD07V
khU9atHbR+KdSY/XXsNBUw8Yi0h/AiF4cN4RQSBQ7xCDOMaIRIEISciFMqJE
jJHxKwYWXqQLrFIeUDAtN/IGzSoMYg==
       "], 
      Association["Book" -> 10, "Theorem" -> 11] -> CompressedData["
1:eJztkcERwyAMBE8+kDQyjwwdpCWX4AZSazqyRPzKIxVkB25gJQ0PnufrOAXA
u+LPDyIc2GPO6YVZa72THMMzS9FVteWKMqkyutWRA0akMTp1sfRtehVv5ve7
n4/R3K1uIg+oQFgjGxnbAhSpqkFNme3ZwAvbCAVU
       "], 
      Association["Book" -> 10, "Theorem" -> 12] -> CompressedData["
1:eJztkTEOwkAMBNdxyGkV3J4CbsilzWt4Qj7AW/kRdpKKghcw0lnavZFceN5e
z00AvHP8+cXaA8tC0hMzchwjteZnR7/u3B7ZVM853Zk0GBGS0XlItYZyNlN+
RqoBv9ceh7nE6zOJEEUgqlDtVIduByqSrqEMJQxIbP0A6XYJlA==
       "], 
      Association["Book" -> 10, "Theorem" -> 13] -> CompressedData["
1:eJztkcENwkAMBNcYcnbwPU8ByxKKriNKSAPUSkf4kh8PKmAeK3m90j523V7P
jQC8h/z5yRnoq6qGe4SZ6jyLSO+RGonEtVar9fYYTnNPXe4y6DCBhJuE1J3W
MiKejssynnm1RL9bj2EuRz+ISFEIxAzmE/N02gETjayhTEVQQNn6AbnxCTw=

       "], 
      Association["Book" -> 10, "Theorem" -> 14] -> CompressedData["
1:eJztjcENAkEIRWGHAYa/TlbjTS/GmzdbsYRtwFrtSGYasAEfyQt8SLjt79fO
RPQZ+vOL1twdwIrez8dt8wgzpDNDKgZWMXNLP4YiXLpLIE8xTwFtB7fc9Eye
Y+kt7Hq/eH5hZsqqlUgIOSy0skpkXhYp6kLlFNYkKTJhVdGq7rRqSPYQK1/J
xggD
       "], 
      Association["Book" -> 10, "Theorem" -> 15] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 10, "Theorem" -> 16] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 10, "Theorem" -> 17] -> CompressedData["
1:eJzt0bERwjAMBVDJcSLZ/iI+cmlScDgcaViEmhGyALOyEXYWYAFeodP9X6hQ
2d+vnYno08bfT895PsVtizpN3o+jmeWsZkkr01EkiJRHS5ZSgKTXuiKdLZtB
SzaFHi4Fdse6ZtygrUTBEoLO9YhzTPUfGomFQMyOHQ2qQ9dJ36tITxK8eOZW
HTp2SAmYSIYB3nuNCF+7kgnv
       "], 
      Association["Book" -> 10, "Theorem" -> 18] -> CompressedData["
1:eJzt0bERgzAMhWHJGOuBbRmcQMGlCOZScNkmI7BAZs1GARbIAvkKNX/xCs3b
+7UxEX2O8/fbMKhf1xbjaG3OXdf1PVKK2CX0Io3I/ETSOM1FNUrZo/eXmFNU
LDlBcbqVkFZdSqcPlSP6e5iAZtg3jGHa/4GWWCgQs2FDDnBVJXUNkZqksWKZ
j3Sq2ATvQ7iSOBettfARX7j8CgM=
       "], 
      Association["Book" -> 10, "Theorem" -> 19] -> CompressedData["
1:eJztjcsRwyAMRBEskgCDnM94JscU4iZSghtIrekowhWkgLzDO2h3Vs/j/Too
hPCZ+vMD+731fjFVEWYA62puc2AyRhujXOeFW3OXism6KBZYU9hZNVNPO3pX
t84QDczCu79IKYVEJBJiDjeimGIl5SaAclbumVQgIOcUUUJ82Ni2GrT4ckWJ
oC+tuAfj
       "], 
      Association["Book" -> 10, "Theorem" -> 20] -> CompressedData["
1:eJztjdERAiEMRAksSYCDMOrcjJ92Yg2WcA1Yqx0ZrgIL8H28j+zO5nG8XweF
ED5Lf37h1nqfpirCDGBOc5sDkzHaGOWyLtyau1Qs5qbYYE1hZ9VMPe3oXd26
QjQwCz/9Q0opJCKREHO4EsUUKyk3AZSzcs+kAgE5p4gS4t3GvtegxZcrSgR9
AYU3B6g=
       "], 
      Association["Book" -> 10, "Theorem" -> 21] -> CompressedData["
1:eJztjbsBwjAMRGX7cpZt5DhQUFBRsg4jZAFmZSPksAAD8IpX3Olz31/PPYjI
e+rPT2xmo6vmTAIYo7u7g55tUraZsDV3qZicu8L7dvhAvT3BTGHQGaOBzHz4
AyBKDIkqoNxCWJhqUlrhnLjSGKhcMpJvpS+IF+tjVGmrX64oEeEDejwHug==

       "], 
      Association["Book" -> 10, "Theorem" -> 22] -> CompressedData["
1:eJztjMsNwjAQRP3bZOP1xqDYKHADJJQzEoGAfaEESkgD1EpHGCqgAJ5G7zAj
zXZ+PmYphHh99Oc3pumeQ2D2nh15n4pTTtmlLnab2NV4K80wnorjsYhogYGR
0hg4U9+vSipqy3I5B75SDITERMPusPfl3xglpJLGCFWJpZIalJWoEQA0gKka
kJqsaREQHXxRNTTOEq6Fqx0ba8q/fQNDDw1h
       "], 
      Association["Book" -> 10, "Theorem" -> 23] -> CompressedData["
1:eJzt0b0NwjAQhuHzX2zn7uw4cSJkUoCUjhEAISRKKkbIAszKRiRI9AzAU7zl
13z7+fmYBQC81vz9aN4M01TXpeQUEYecqB+6IXRbXFnsU6TD6RyY8NgmQiQ7
ZsvdpcTM4cOGrm34di3hzs0YLRPTwaNPy7yUyxsCjAGogIUQSkhw2iqldGWs
QwO6rrRXUikjvzx573ZQO4paas/BvwFdhwt4
       "], 
      Association["Book" -> 10, "Theorem" -> 24] -> CompressedData["
1:eJzt0bsNwkAQRdHZn/czM2uvF1tYKyRInLgEIICEEIkS3AC10hE2EjkFcIIb
vuQd5udjFgDwWvP3q20/jiGU0qaI2KdEXZ/7mHe4stg1NU2nc2TCY06ESLa0
lvN1qDccP2xs24ZvlyHeuSnJMjFNHl1a1qVc3hBgDEAFLIRQQoLTVimlK2Md
GtCh0l5JpYz88uS920NwVGupPUf/Bh/QCxs=
       "], 
      Association["Book" -> 10, "Theorem" -> 25] -> CompressedData["
1:eJztzMsNwjAQBFB/1l5n10s+chy4IAUJBeIeoAdKSAPUSkcYKqAAnkZzmMPM
2/OxaaXU61N/P1v3E1GMEbnvS+3relldGdPQpSH4InU5zeJ4Ou6QmYeQY+Dl
nOTCOacalFYavt9GKTxlCRyFly61bT231ihtNIDSTvVWGzCkg0Wo0HvfeA3o
wBtnDLgvg46JhA6KmihAgBybNxCGCxI=
       "], 
      Association["Book" -> 10, "Theorem" -> 26] -> CompressedData["
1:eJztzD0KwkAQBeCd2Tczm2xi0EIJsbExggcQtLAXA2JhKZha8KzeyHVP4AH8
isf8MLMaX9eRnHPvb/z9zgyJArPpoIiP2+k5GeZda10bynOabA77uorLXSpj
LL2qx+W4aO61ZGaajmOhUiMoPCpD32/Xkr9755jBjr0T5tyCmERIAFE48kTI
4TNiDo2GoqBK0p7EYPwBOT0Kmg==
       "], 
      Association["Book" -> 10, "Theorem" -> 27] -> CompressedData["
1:eJztjDsOwjAQRNf2etefkA0icWzFDVfgFLRAk4IqF6DmmNwII3EBet5II80r
5rg9rpsCgNen/vzA8yDLUiXHGKukEH0Y/I2oI/IuNBN7KUMKUiTNqcN7xRz7
S4mz+3KuLk/jWqcsa8E8c+5kEG7XSiloMQbAQGxDAyrUrnlNzIYYdMcUjDUG
rSVrrUZ0zgYu4IlH1t7tT7s3J0QLZQ==
       "], 
      Association["Book" -> 10, "Theorem" -> 28] -> CompressedData["
1:eJztjDEOwjAMRZ3EsRMX6gKlqUUWrsA12FgYQAy9AGflRgSJC7DzvvSl/4Z/
XJ6XxQHA61N/fmHbm1WdRaTqlCXlId2IVkSZczPSqw1F1HQqU4dLxVn6q0lJ
X86V53H3qHvTu+Fc2DodlNuzcw5aQgAI0LXhAR361Lwn5kAMfsUkIYaAMVKM
0SOmFIUPkIlH9jltTus33R4K+g==
       "], 
      Association["Book" -> 10, "Theorem" -> 29] -> CompressedData["
1:eJztjEEOwjAMBB1iJ6lRgkojFXFAQmYP/Q5nTv0Ab+VHuPAC7sxK3l1b8nV9
3tdARK9t/PmJx7IAUM2AZcdyrzXXmmTbqKqZTWpw23MHI+to2cYPSSYMaK2j
Yp5uDBPoobXij0MI5BIhEhq87KgEZk8hRmYpTLFKKuwlftkOSSQdqUi5SORy
nk9vK4ELLg==
       "], 
      Association["Book" -> 10, "Theorem" -> 30] -> CompressedData["
1:eJztjEEKAjEMRVObNJ3AVKeDgriQCQFrj+MR5gKe1RuZcePave+HD/8nZFmf
jzUAwGuzP7/RWjcbBu6m7CjP45h8aGtEpKlOcjNVFZw7GsuhsU4fEtU+WClz
H+1Y72iNFtmXkv1vCAFcRAAE2cMOKCAm72NEJEaIEonjF1/ERJgqZMxXv+bL
6fwG9UAKzA==
       "], 
      Association["Book" -> 10, "Theorem" -> 31] -> CompressedData["
1:eJztzE0KwjAQBeDEeZPMtPSHpDmAa8GFEkl3uhK69Ai9gOfweN7I2Bu493sw
PHgw+/X5WK0x5v09fz96TcssotOSVRu9aRBhEXiN9+xFxmvufVryfFZKkYpq
GPXSboAYpbRNClqGFKgMOLlu6LA93tVY1E7GWWusYQuSOhAYYDbUeDiuwBsC
aAf2vXGOO3Ysx/7wAWswC6M=
       "], 
      Association["Book" -> 10, "Theorem" -> 32] -> CompressedData["
1:eJztzE0KwjAQBeCJ8yaZaekPSXMA14ILpRJBV9ILeIRewLN6I2Nv4N7vwfDg
wezX13N1RPT+nr9fTctV1aZlNmvsblFVVBEsPeagOt7mPuRlLmfjnLiYxdEu
7QZISUvb5GhlyJHLgJPvhg7b312NQ+1M3jlyJA6sdWAIIELcBHipIBsGeAcJ
PXkvnXjRY3/4AAbnCv4=
       "], 
      Association["Book" -> 10, "Theorem" -> 33] -> CompressedData["
1:eJztzDEKwkAQBdDZP7Mzy67JEQyBHMETiHaCsFhIKklhkyKNhTZ6Ar2bNzKJ
J7D3FX/4fJi6G3LniOg9xd/PmjMQnk22YFaFFCfK4VFnr9ZXOdq9yds1ULzQ
(* ... compressed per-theorem data for Books 10–13 omitted ... *)

Module[{dataA = (#[[1]]["Book"] -> Max[#[[2]]] &) /@ res, vals, acc, xval},
 vals = CountsBy[dataA, First];
 acc = Association[
   MapIndexed[First[#2] -> #1 &,
    Accumulate[Values[CountsBy[dataA, First]]]]];
 xval = Association[(#[[1]] -> (#[[2]] - vals[#[[1]]]/2)) & /@ Normal[acc]];
 Show[{
   ListLinePlot[Values[dataA],
    Axes -> {False, True},
    Filling -> Axis,
    Frame -> True,
    FrameLabel -> {"theorems by book", "maximum shortening"},
    FrameTicks -> {{True, False},
      {{#[[2]], #[[1]], {0, 0}} & /@ Normal[xval], False}},
    ColorFunctionScaling -> False,
    ColorFunction -> Function[{x, y},
      Piecewise[{
        {bookColorIntense[6], x <= acc[6]},
        {bookColorIntense[10], x <= acc[10]},
        {bookColorIntense[13], x <= acc[13]}}]]],
   Graphics[{GrayLevel[0.5],
     Line[{{#, -5}, {#, 200}} & /@ Values[acc]]}]}]]
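For reference, here is a minimal sketch (not from the original code above) of how the same per-theorem rules could be summarized book by book. It assumes res has been loaded as in the cell above, with keys of the form Association["Book" -> b, "Theorem" -> t] and list-valued right-hand sides; the name maxShorteningByBook is purely illustrative:

(* group the per-theorem rules by book, then take the largest value
   recorded anywhere in that book; assumes res is the list of rules
   defined above, with list-valued right-hand sides *)
maxShorteningByBook =
  GroupBy[res, #[[1]]["Book"] &, Max[Max /@ #[[All, 2]]] &];

maxShorteningByBook[10]  (* largest shortening recorded in Book 10 *)

This only inspects the dataset; the plot code above already derives the quantity it needs directly, via Max[#[[2]]] for each theorem.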

although this shortening is very concentrated around “nearby theorems”:

ListLinePlot
Cell[CellGroupData[{Cell[BoxData[
 RowBox[{
  RowBox[{"CloudGet", "[", "\"\<https://wolfr.am/PJKo9Lnq\>\"", "]"}],
   ";"}]], "Input"],

Cell[BoxData[
 RowBox[{
  RowBox[{"res", "=", 
   RowBox[{"{", 
    InterpretationBox[
     DynamicModuleBox[{Typeset`open = False}, 
      TemplateBox[{"Expression", "SequenceIcon", 
        GridBox[{{
           RowBox[{
             TagBox["\"Head: \"", "IconizedLabel"], 
             "\[InvisibleSpace]", 
             TagBox["Sequence", "IconizedItem"]}]}, {
           RowBox[{
             TagBox["\"Length: \"", "IconizedLabel"], 
             "\[InvisibleSpace]", 
             TagBox["465", "IconizedItem"]}]}, {
           RowBox[{
             TagBox["\"Byte count: \"", "IconizedLabel"], 
             "\[InvisibleSpace]", 
             TagBox["5397840", "IconizedItem"]}]}}, 
         GridBoxAlignment -> {"Columns" -> {{Left}}}, 
         DefaultBaseStyle -> "Column", 
         GridBoxItemSize -> {
          "Columns" -> {{Automatic}}, "Rows" -> {{Automatic}}}], 
        Dynamic[Typeset`open]},
       "IconizedObject"]],
     Sequence[
     Association["Book" -> 1, "Theorem" -> 1] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 1, "Theorem" -> 2] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIYWBgYmRnBLBoBFihNSztGNkAOWQCuDgKK
       "], 
      Association["Book" -> 1, "Theorem" -> 3] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKAgIWNhYWRbMDAQEABO5SG2cGAC2A1HdUq6gL8BoLt
I+w9agCI5xlgPibVSuTYAwDLWQPN
       "], 
      Association["Book" -> 1, "Theorem" -> 4] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKAgJERyhgFQx8AAJpGAn0=
       "], 
      Association["Book" -> 1, "Theorem" -> 5] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJAgJ2Tg5kJCBmZGJABEyoXC4ApwKuSHUozI+sYBfgB
M7mKAf41Ar4=
       "], 
      Association["Book" -> 1, "Theorem" -> 6] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCLAgJWFgXaAHUoz4lU1CsgHyCELALB1Aow=
       "], 
      Association["Book" -> 1, "Theorem" -> 7] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIggJuVBQiZWRjhQoyMjCwsYAoqhFCMACwwmgVZFKgD
AiGAHVkxMxZTRgEDOKAZIGENxqwMMBY8SrBFAKYMAC5tAuc=
       "], 
      Association["Book" -> 1, "Theorem" -> 8] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKggJ0VBNmYmBihgImJiY0NTEGF4DJIgBWIGRiANCuy
KFAHBEIAO5JiRhZGJEvRABYLGFEEceokE+A3EGwfI3ZnURkALQKGMwM4rBmY
gJ5mAgcaKATh4Y81AjBlAC8MBBg=
       "], 
      Association["Book" -> 1, "Theorem" -> 9] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJggIOLjZOdgxEuxMjIyMEBpqBCyIphgAVGsyCLAnVA
IAQwQWlWEMGGxZRRwAAOaAZIWIMxFwOMBY8SbBEAAUxIbABHwQL7
       "], 
      Association["Book" -> 1, "Theorem" -> 10] -> CompressedData["
1:eJydUUESgCAIXJjk0C/6Uk/wA721H4WIpJN1aB1hZhEWcMvHngnAWUxAVlkS
MzmYOSVzTkWkg+gF1EvPakY9kYj6uKgMmiMmAoTevaf+w3dB06NnW7VT7xdT
vo0Dum0bJBirHYPpnmG7BivJ0lYe+59+gEfgjWi9CwMwA+M=
       "], 
      Association["Book" -> 1, "Theorem" -> 11] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIQgJWXm4cRLsTIyMjDA6agQqiKIYAFRrMgiwJ1QCAE
MMHMR9YxCtAAKJwhYQ3GfAwwFjxKsEUABDAhsQFG6wL+
       "], 
      Association["Book" -> 1, "Theorem" -> 12] -> CompressedData["
1:eJydUMERwzAIkxDpHlmpI2SBzNqNGsBpyKXNo7qz8XFCMpqX9bkQwCuuhkku
WYCkJPcsYgAC2BETj5wj3D8iW18MJnZ1GzU5wm+c1FtFmZE3s3/hXvDwq3Wb
f61f/8NI4yJWzSTiiOybfTy21PNkRLQp6BZh2p6f4YqRUeu8AY+BA1g=
       "], 
      Association["Book" -> 1, "Theorem" -> 13] -> CompressedData["
1:eJy9kNENgCAMRHstfsIOruQILuCsbmSvoBKNfhlfCD2a0oOO8zLNEJGVW08u
Ztows5wVlAg8yJUEJhVpFxHMr0FFW5HGAgbqdOtxgg5hoxqrDvErvd+79xcP
88m3LxrtCifIk8//yeVIaJfcAMoOA2g=
       "], 
      Association["Book" -> 1, "Theorem" -> 14] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJQAB8zIxMUMDMz8vAwMTACAUQOQzEQsIBlmRhZWODK
gICZiZGJAQQhAEazgnVgMWUUAAEwvBkggQjGfKDwA7PgAYstAiCACYkNAFr6
AxM=
       "], 
      Association["Book" -> 1, "Theorem" -> 15] -> CompressedData["
1:eJydUMENgCAM7NHCkx1cyRFYwFndyFJAUYMhHklTyvVabknbmkBEew53BJaK
EBCjgNk5GBxI+agodGHmHKUk9S149sin8Go/vOaQ98wTTfbSL1dq9XHrP8wJ
4rGPlSa0MumThz5Rv4ltkhmp1ov9XP0fTUDbxHXFAxB0A6o=
       "], 
      Association["Book" -> 1, "Theorem" -> 16] -> CompressedData["
1:eJydUdsNwyAMvGKcWBBF+ckAXakjZIHO2o3qAyTSJs1HT8jY3Pkhc9+ej+0G
4EXzhWWacs5mNq+zqgldIWBA2IPqUVUBixodQBViTTFJQmolXSk8bkD6J2QH
D0Kot7Ssy9y/EC7Z3o+T0PbE4pctBLIMD8XqYxGir+ysPR0zRdkmFm+t0WSE
OmHD0FTxMGD7kY/HN8qGBSc=
       "], 
      Association["Book" -> 1, "Theorem" -> 17] -> CompressedData["
1:eJytUMkNAjEMHF/JWvBgP/xpiRK2AWqlI+yEVWDJk4mUxNbYY89te9w3AvDM
6wj3pSxmdlovZsJn9yIJcgJ9IsmVRAAjU+ZIaPAYWCvHQX13jIwCEicZE8kd
WZ1IOURDin+8UdmqIvgzZvsPDD2iXIsGva/fXEhXOMOfZj3ZiBiWzeTzU4rA
muQ1nGA1VqSnrrrPcxSQPsJ32xf6SwRy
       "], 
      Association["Book" -> 1, "Theorem" -> 18] -> CompressedData["
1:eJydUcENwzAIvMOBGsmVukJX6ghZoLN2oxocKZbS5NGTjMEcB8LP9f1aCeAT
5oBH8+bu93YTMbFaTQKoAGcEt5g74IVULcAgoqmqKHQT7C9Mi3TOIRN6QI47
zkhfFf+Dy2mmfjFJ2L0w/dwCIxvhQWw8JhH7yn61D8eMKEnpG+2CLjUV6rKc
TcvtRzAv5gvPEARC
       "], 
      Association["Book" -> 1, "Theorem" -> 19] -> CompressedData["
1:eJydkNkNAyEMRD02GBYtRaSllLANpNZ0FB9SQs6PPCEzAh8Mp+NyPkBEVw/v
zDHHGHNvzE20dxUHHeAVGNJaIxoFULUgUqsAe9UqFRUJxxKQawsfhzqyYL2Y
bafUMUy+Vv4J/7yNefZa9+0vSMtpKXT8Avstp8dn8jAS8fiyNeEubJAqKGzS
FmLIFhm9lNfGS322peXwBoEJBTQ=
       "], 
      Association["Book" -> 1, "Theorem" -> 20] -> CompressedData["
1:eJydUUESwjAIhAViGzvV8Qd+ySf0A77VH7nQquOMenAnISQQyG7Oy/WyqIjc
0nzA8XTofT/tgB6tNY+ENTMv1GJm4GRApJvqPDcIz+DAFGN4IMBaqMFDq9Ke
u2+IyFZrO9BJT1YfLBD+/ep/sJ/RpJbPTd7kb052G5J9WYrAJXL7im5wTzUq
EatyqLRnsj8uJbFhUKGOZjKahGrHKEqZ29qHUEuF359fn2FSVYQaQ+5i6QZJ

       "], 
      Association["Book" -> 1, "Theorem" -> 21] -> CompressedData["
1:eJydUcERwjAMcyQrpL2DHViJEboAs7IRttsCD8oDXezYia2zkutyvy3NzB7p
vuE89z4mAZfh7l0JOumF2kgibJBmXa3N84BJUU1OPEU1iKBCLQdUzJ7ZEaJd
q0lIrthtjQFQftz6H/jzVjlyjJu6Q39K2pHqy6dKUpm+bzd4qPY6JtaXQ5W9
in1vSmG901oyWgRi65Rtb84dMH6OHHF9Bq1YDBHYEw8WBhY=
       "], 
      Association["Book" -> 1, "Theorem" -> 22] -> CompressedData["
1:eJydUdsVQiEMa5u0esXjnwO4kiO4gLO6kaHXx4/6YQ6EQgqFcLpczxc3s9uk
jziOMfabiMO2qljVTGajB5JQU2S2wH0MwICITeY+h9alRYSkyZq6ubtyvha1
qqy1V0HBjGyNAbDy684/wZ+qiptBWZw3ERNPzNc3ywQNNadv9QG9GdnL7dO0
pNNeyfnclF0Nlu6kLbSCL7Ezl80VwRVOSPY3wr0/g9anOCh/7/RtByU=
       "], 
      Association["Book" -> 1, "Theorem" -> 23] -> CompressedData["
1:eJydUclxAzEMoygAXI3XfqSDtJQS3EBqTUcGd51HZuI8Ag2hg5AgUe/3z4/7
iIivpt9xu62FzLcdQK5GoQw9oySBQpERO0Zer8TgzLxIS7s2Oj3n9CFplmaG
hwjkS9O1tlWOBuxY7uMc+xpc28ud/4T+zNo8Alapb2IWvtGvP1hw84NF9grJ
Th/kejkgT3TUrqo1rW9uQfetLRtVzeAYrqcYa46dlxg+FnPyxCCSzB84PyQO
vwQT4wEkUQik
       "], 
      Association["Book" -> 1, "Theorem" -> 24] -> CompressedData["
1:eJydUcERwjAMs2TXoRwHb56sxAgswKxshBQ4eJUHSuo4ZzWSk8vtfr0hIh4O
Gziva5H7rirujJrI+iBJVi1LxGkHjtEEATS59r7HkgsToGiAQ4USKmxqtiAB
L822dFdoC039WmPb7n+o31WX5TbtSZeRaucNda+YAjSycqby6GYx78EkfWod
k5NvTsKcyai5cyKhMSAlFaM1gUMdXUa9ZOcpjvwCfgA/ULxc+eB4AhvDBf8=

       "], 
      Association["Book" -> 1, "Theorem" -> 25] -> CompressedData["
1:eJydUdERQjEIIxDwPe88XcGVHMEFnNWNDFQ/9cO0TenBEaDX++N2h5k9m75h
3+l+LJK+NWQwGZk5BhnuLivNLhv8cCiHAyj3vY5VGekBuMLEIpqMEH2VLCHI
vsqrpYumJ7SVg/Wj3L/A3952q9romjSMUDtvqHtxCNAKxpirz74mKFMH0V1P
6IoJzEAmYF6dTkLbBinJaaUNnPLcbnDJThasUX6A+QB9kK2qRvwF+p8F9Q==

       "], 
      Association["Book" -> 1, "Theorem" -> 26] -> CompressedData["
1:eJydUNsNwyAMPJuXSYUyQ1fqCFmgs3ajcJAqSEn56EkYPw6f8XN7vzYB8KH5
iTWHEDWXkpRAAWQEOSnmDJgL3nsHdCJW86YGOxrVjNBWAsRNFHVADUT6zdPL
s3H/wfT/gx4noT0fNr9tQVhleGnWk42Ic2V38nSWRRGb5IPSdYMJWgslhO88
N+Nf2+6XQgQ8
       "], 
      Association["Book" -> 1, "Theorem" -> 27] -> CompressedData["
1:eJydUVEOwyAIfaC0xCzZfnaAXqlH6AV21t1oPuxil3V+7EURAXmAy/ZYNwHw
pPiNq9mkF3dTAg7IEQyZkxngyXLOqfkVuJdctKDseapFuGoAJA0I9YB6qbni
5G7uYbl/YNx/52MllP1h6DEFoVfkJFkzRiD6yM7oqbgrLChvpM6uM1Q43PfI
vtqX/Uc+jC9qmgQa
       "], 
      Association["Book" -> 1, "Theorem" -> 28] -> CompressedData["
1:eJydUcERwzAIkwBfcsldM0NX6ghZoLN2oyKc9vypH9XDxkggbN/P5+MkgJeW
CVozP/ZtCcE2wki7wKpdoN3RogoySydvgfBg5Ik9K4Ku2Cd+chFkB+tmuWe3
aj+t/Qs2Zy9ac4QB5ECxiEwRdb2RHeoNJSxpVwwyG4N1DTRXdEDP5xnLZW/t
17TJ8vMTX7wBXgIEJA==
       "], 
      Association["Book" -> 1, "Theorem" -> 29] -> CompressedData["
1:eJyVUEESwkAIS4BevfgCv+QT+oG+1R+VgDqd1daanYElwEL2Ni/3mQAeMke4
kuHuVnCCGyhAyOQtojhVyjn7NJJy2Ump4P7Q7H55Q8/IiIYn/2vdv3HuwdIw
Up9FX/rGCQPzjkqmdJboScGl5RP5/3vbWu9G2Ca9ArkiA3o=
       "], 
      Association["Book" -> 1, "Theorem" -> 30] -> CompressedData["
1:eJy1UNsJgEAMS9rerzu4kiPcAs7qRvaBoIjn16VwzYWUlqx93zoBHPEMQZqK
MKHpfo6Y6+kq4t4wq7IqRUjMUticqw12lj373UVc+t+58zB/tSeETBotPkuw
KM/06wZ5EccJSdkDHQ==
       "], 
      Association["Book" -> 1, "Theorem" -> 31] -> CompressedData["
1:eJyVUdtxAzAIA0s8TH/y0wG6UkfIAp01GwWc3PUr6VU2nM/CCPDX9ef7qiJy
G/cen2stflQvW2Z7527LHAsPYwEiQV17h2tf8FLl3AwQaUDvBTDTKFCl8LVo
1S4Pr4FVtUSFRBV6Ax7xZ7n/RL5lj960F5kVZt69PJHTGqzRDZoNeViSQx+H
vdtgnCH4AwFEx5+xDIk4sVNH55I+tihcyjTcBT3QHntnZXRmQkn9xVLN6K84
D0F1V8gdy+EINQ==
       "], 
      Association["Book" -> 1, "Theorem" -> 32] -> CompressedData["
1:eJyVkLtxAzEMRPFbLInjXKDMoVtyCWrAtaojgedx4MDW+PEDAliCHLzfPz/u
KiKPvb0gwrGOqpWBPJhrTo4mswMIbI2LGuAhYWZvt5sp9JSSNckdIm0to5iq
CX9/tGrOZM5qoopkUVjlPd3B8/V3/wf/zu60bztGEUj4N+Q+o/Ee6ORXNrpd
21yiMXo52oHnBbKvtn5HtsA9L+2Q3WhIO4QYpFL9NHF2wW50g64crhH6A/Lo
v8j1ngbU5QmkvQgQ
       "], 
      Association["Book" -> 1, "Theorem" -> 33] -> CompressedData["
1:eJyVUcERwjAMkxwTcr22O7ASI3QBZmUjLAe43HEUqkcSW1LsOJftdt0I4K7l
F9apuWALYaRbB9N7hvaCk6c2shLMDhlSqmw3WNG57NSUSVA5qIT8EQaRt/sf
7R6C7bNPWn24AeRAMYlIUbz2j4fJb0hhSrtikNl4aK2g5lyXmIR5iZKK1lq/
dRtX8fUTbzwATsQEIg==
       "], 
      Association["Book" -> 1, "Theorem" -> 34] -> CompressedData["
1:eJyVUVsOwjAMS+xMG/3iD/HJlTjCLsBZuRF2i8bEBw9PnRLHkt30st6ua0bE
3b+vOLbJ4AlJYOYAkMKMqgjGVF0qFgUcGCzCHcRioOiaHzztYtgubaE61KZa
gfVL3H/Az1OP0/mVgZnjyh0KlyZE6fNN99NNpJNdmGMF2euXYCtk1BpjtiSa
NqFtFcLdeVmeefAe0Nn6S+zJB+pEBOc=
       "], 
      Association["Book" -> 1, "Theorem" -> 35] -> CompressedData["
1:eJydUMENAjEMix0Tjr5gBFZihFuAWdmIJNyjAnFIpFLT1HZt9brebyvM7FHb
7xqhXDzDAvAIkiJae6Rk5nZQM70hntzcHTUxb7mV19l3jCQ1MbtSBZQ+R1M/
v6v9q7iPbnDlEKtPEBrIKxRe/eMzK7WsiU19MSaa5hxjuEVbjkQoT8tyuSzL
t7SJgvFm/ARWigQ1
       "], 
      Association["Book" -> 1, "Theorem" -> 36] -> CompressedData["
1:eJyNUdsNAjEMs/No+YDPG4CVGOEWYFY2Is6dkPgA1ZWSNk6bOL3vz8dOAC+Z
BcSlVl4NbubuEWFBEsSsCGDIqDQyih3uWZRXghVOVxetQ47fRedMF2aBocfC
4ZHI1PPua+2uI/6yXY/y1ZJLBxu6GCUbkgWjtMkqu1keEisI8W3tHEbtj9H1
tj2pQpkTPiR0AmOEDVKDvG3bp1t+z65O9R8q03XZo30DroYEyA==
       "], 
      Association["Book" -> 1, "Theorem" -> 37] -> CompressedData["
1:eJyNkM1tRCEMhP03NhghlHfey7aUEraB1JqOYoiinBK9DzECj6UxPF8f7y8m
os8td7geb4/Rsl8j11qZubL38OYm24awmLGziUi6CwdP6tQcQJVKWxMnYQbh
79A5R/fwPgsboyJGUIypc6oqYt4c9zb5rxtRokStZc4AHPpD5j7Xu6C1UOa3
a2bbPqJr1VbUBeoHhGqr/l3ZDapxevccVn8pyhlkoMlsLsTp6DiE7SiG8S/K
3NvVO+iEMpyVvgAy2Ain
       "], 
      Association["Book" -> 1, "Theorem" -> 38] -> CompressedData["
1:eJyNkFtOBSEMhnsvBU5HdJ5N3JJLOBtwre7IcozxxZj5gKYtPxT6dv94vyMA
fG5ziafX9dxy5uw558js6R7SXGjvKiGJoJMQUWiFjgMcTJiNmZjVjAyYSOH7
yJ+cZ84WMc/C1jqOvgL6WrIOZvYYV597lf8vjCijAHNmnt0snH8YY/tecA0P
d/OdEakvs8pD1Hst9gqc2wOLnTHbbdHtcrCoalYhqcYQYzMQhRdErmZiVw+z
UqiJstYU/IURbyNvN4US7OoNGb4Av5UJbQ==
       "], 
      Association["Book" -> 1, "Theorem" -> 39] -> CompressedData["
1:eJyNkMltQzEMRLkvWmxAp1zTkktwA641HYVSEOQU4z9BA4lDYCh9Pl+PJwLA
15ZrrI8eLe+9jbXGGGtkuoUJbVMJSQQNhYiaGaHjgIAwVa1SaQQJEKKC/B96
mz3NLWchfUd0B++T52Rm9XF53Iu0t657CVdXcXNVU/5ljH2udynX0jJ/XBHZ
9hFeqzZrXZTtoM4c1b8ru4HZT++eQ+ovidEbiMFEFCPACE09uOwoVME/GDHj
nqlwQlENGb4B98UIkA==
       "], 
      Association["Book" -> 1, "Theorem" -> 40] -> CompressedData["
1:eJyNkFtSBCEMRRNuHqQ7Q4szNf9uySXMBlyrOzL4KH8sqw+QIg8I3JfH2+uD
ieh9mZM8PfeRmVuOkXPmdA/pLm3ltHETYW/SWgst13knJxPAgAaoWVNCK/N1
5E/ut5E9Im+FzetxbDNomxPzAOCxn3/uOf6/MKKMEh3HGPfNLBw/ZK69F6jh
4W6+IiL1Zah8F9WCl+Pon1gAG8yWLLqSCIiqjmokJUwDWycxujKjxORuHmZV
oSYKrSn8C5gve14uSlWwuncGfQCQ2glc
       "], 
      Association["Book" -> 1, "Theorem" -> 41] -> CompressedData["
1:eJyVUNttBDEIBIaHDetd6TpISynhGkit6Sh4pShfiS5jgTAzYPDb8+P9yUT0
ud2rOEbl46jruirzzMwRM1Q2ZcKiyiEqIukuHLxo0nAzB8TMxhAniFiLf33i
XDU9fK6GHkdmVFDUwloALNY/xn0J9Scb0Q5EY1Sd0asYvlG1497L0MeabOuM
qm76djjPNlhfDH7DAhit35ktAOLW7jm0/1LAGaRGi1ldiNNtdu8uHepdxKb8
AzDnfGQa7W7G7gz6AsRHCHQ=
       "], 
      Association["Book" -> 1, "Theorem" -> 42] -> CompressedData["
1:eJyVUUtOxTAMdPwZf1qVlsJjzZU4wrsAZ+VGON2wAj0mkjXyWDOJ837//LgP
Ivqa5WHg7TXTMzOqfFFdNJJltIepiBlDwMxmRgOjKAmi6qqsagZW0iY9/GvC
7aWiA+LczxPPx3HUllTbk2wlIpH1n+s+gvhTzeziRMuy77eKSJ/PlImI5uIN
6ePpjqsDYPZxDbU4VcwaF5Ai1bwN5BIlBW2ydxDzoMEEEBudNKR3SdbWqhG9
PxgHxHX8oJe/Vq4raOb2J8QQ+gYKUwj+
       "], 
      Association["Book" -> 1, "Theorem" -> 43] -> CompressedData["
1:eJyVUMkRwkAMk2UxO2wHPGmJEtIAtdJRLIdHGIZM0GN9SD7W9+X5WALAy895
BDJiZNKIrh2UgMRFrTAl8ppIJVtYWW5Q2s+DmZI2ZQEe4foKoW5P/bXuCfCY
fdPeQ7TdUdFEpcK87dfHXE+0sKWbYifj3pkz65zuNOsSda0a6eg2xq9tq1XU
QT8Hr/f/A/0=
       "], 
      Association["Book" -> 1, "Theorem" -> 44] -> CompressedData["
1:eJyVUMltBDEMk3VbtmceqSAtpYRtILWmo1D7yWsRLA0I1GHq+Hx8fz0GEf20
eQMfERYRWuWlepkny4CGqYgZ63BmVlUaPiYlGYtMd1hTcELKyPWl/lk1s2ru
s3fsfc5aRau2niNQqvneuP/j9SiNKpggWuu+z8qc0WtKQxVccIsQvKgIj464
e8f9WdTcn05IPuETa4CLJGJIwsPHuNFo4JLClElstGjIYCab27qPNxhSYfwH
Yb4qrsup+6Iwh9AvVKcIfg==
       "], 
      Association["Book" -> 1, "Theorem" -> 45] -> CompressedData["
1:eJyVUclRBDAM8yHLTgaKoCVK2AaolY5QwoMXMKuZXPKtvD0+3h9uZp9newYc
FgkKwAJfAhFmQGZVwivcu8qjo53Ow0fEsTJgupZl/Jq+W24JtoC918Jqm9Ui
qg79ZLv/An9ab706J9kz03XHTBzUWQJKnNpjHeYKg+J1ITWLHilnXvRKNGYS
ozh8i6gkVCE/n5FWZQ5r98irLKQeXvcmdwCOqR+osuLXokE2eEpk/wJVLAeC

       "], 
      Association["Book" -> 1, "Theorem" -> 46] -> CompressedData["
1:eJyVjzFyAzEIRUEfEMLs2I3XjRtfKUfwBXLW3CiwkxlXScavQIgPQv/x/Px4
MhF9dXiL87rd73m57Ju76zKMroJ4iLARxhhuxqwcNMlN1YChKu7DaJRA8vvS
TF/T54qMsFPhy8hWaqYq1OPd7/5H/qmaUXujiG3LtlJefsjs3ArYcZh6V0Sk
5SPgugM7rC6GeaATCJQXaDeUPo/erRbVJIERk4ZSMosJIaxHtDBRVrAKvwDz
Oj3WUupXpEYY9A2bpQiD
       "], 
      Association["Book" -> 1, "Theorem" -> 47] -> CompressedData["
1:eJydUEFOBDEMS+PESaphpFmJEye+xBP2A7yVH5EWceCwSGC1bqs4VuPX+/vb
fYjIx6K/4WVW5ZyV59MxL4KjPaBDzez5dANmhKqrDwz61xMAaRC6Q+APzau8
nJFM0m/XdcVJibM8szs9H3f+E/PXKtlk6+yBgwziG5ncQ7ET2Me+Ap3CKm9C
VW/04K2JDRa2fvESrLW06x+qJmqdmCikbHiYuqm2p3ZwB30oOufxA0fWcbgs
tzkyBuQTW+EIDw==
       "], 
      Association["Book" -> 1, "Theorem" -> 48] -> CompressedData["
1:eJytUFFKBUEMa9OknXXfzkPwAl7JI7wLeFZvZGcFeQoKgvmYmZBpGvJ8e325
uZm9reOPEFkkcx6Xy2NBgBkCHhFPk61uY8DlBFGK2DIRgSrRAv2b+NlbKUYw
mal5XI+YZZpqV6lGO/wztl/VqpVpxRpjZUh+oor3aKbz0S2c1714En2gvmk9
ufjoRe40LNZ1mtK7PwSjywuktFGOBOhfsIv7TluLH3yUh70DiBMHQA==
       "], 
      Association["Book" -> 2, "Theorem" -> 1] -> CompressedData["
1:eJydkFFuhCEIhMGBEZRs/mQf+twr9Qh7gZ61N6qaJn3qJt0vEZUBdXx/fH48
VES+dvgv9/u83a6KCA+i7RREm5lS0FobEaquJSkjSAKNtDFaSFuC+N+XVkX2
6DlqDM5FJIVZXuUOj3rhuU+ZT1VStjfJrCq6by8/zLnX3PZ4JnrfGTPb8gm4
rjXAtSH6wTuQWF7gpwDop3YbW50CxVzf5FKq1k0wTosvaK4OddNfoJrjLdNl
n2JqVMg3M/MIKA==
       "], 
      Association["Book" -> 2, "Theorem" -> 2] -> CompressedData["
1:eJydUEFOBDEMS+rETTqdQdobIw7wJZ6wH+Ct/Ih0hMSJlcCHtI1jpfbb/eP9
riLyucqf8XrcbudTRHgQbXUg2syUgtZakKquKZSgO4HmbhHNpBUh+H3pPiN7
9BxzDG5z2yIpzOlzusMj//PdR9gesqQsbzLnvu/LSnn5xnGsOwvgddD76pjZ
oq+C5xM4wXoQ/YJ3IFFe4Gug+H7N7rWolJUN+pBGmapGE1TIJfECzdWhbvoD
qOZ4yaxQS20lUcgXPpMIVQ==
       "], 
      Association["Book" -> 2, "Theorem" -> 3] -> CompressedData["
1:eJydkEtOBDEMRO2UXbG7kx5pdrTYcCWOMBfgrNwIp4XEipHgLZxP2XHKb4+P
94eKyOcKf+e4389bRHgQbV1AtJkpBa21IFVdUyhBdwLN3SKaSStB8HvTOSJ7
9NzGtnEf+x5JYQ4fwx0e+a/vPmF/qpKyvMkYc85lpbx8cxxrzwK8FnpfN2a2
5Cvg5QROsA5Ev/AOJMoLfCWU3q/cWY2qsmaDvkmjDFWjCWrIVeIFzdWhbvoD
VHN7zayh1itWJQr5AgqqCDQ=
       "], 
      Association["Book" -> 2, "Theorem" -> 4] -> CompressedData["
1:eJydUEGOAjEMSxynmc4gjtznSzyBD/DW/dE6u4DEAQ7jVlGVOE7q/Xa/3tzM
fjocAJljzlyD/lBwAekB4LRtYljq1CA7xUSVyhDJ4/PQHBtFj8qqYGQgaciB
TIkQeWzdz7h8rZIKMBujSgsAE0/su96cAifXVXdZO8OhL8udfkZvnViUWfjf
tsgNvkRUlD3is6wtDHPl0hAmARsaHoTam9s+/vX6G+LMCDib4BHS+AWRrQZv

       "], Association["Book" -> 2, "Theorem" -> 5] -> CompressedData["
1:eJydkEtuwzAMREkOf4plu45WBbJpj9Qj5AI9a29Uyll0lS7yAI0wGlEffty/
v+5MRD9TXuFzvEd0j4BMK8Siyk4QkTBjNk4ycgUWMwHUXXTGU54ee+zrkq0t
676usb1t26UHZb/a9TCDt3zxuU+5/JtGlCjR7TbGkUV9DQ96j9IoEOcUnnNF
VU89N41RA1EmkCfeqiFwBxw6Q7RZoqMuqgICw5PEaWdWK5fuoWqFq7GpmPIf
YN56tclITWGiwaBfdowIww==
       "], 
      Association["Book" -> 2, "Theorem" -> 6] -> CompressedData["
1:eJydkEtyAzEIRIFuQPL8MtbGXvpKOYIvkLPmRmFmkaychV+VkLpaCNDj+fX5
VBH5PsJbjHvmGpmwQ5mokRoCM0t3VdcmLkGgZxrACKM4SQFfvrpvy9R6n5Zt
WXL9WNfLnNLmq193d0Rv77b7isu/bmaF6vZ2G2NvES1rkBqpmOc6IwvkueXp
lk38XhqjFrJEop1EByZEAAEeJvqRwlGFKkGgiCYWsqnSS1XVJL0IujrNqX9A
dZ3rm1zohBtTIT9Hnwif
       "], 
      Association["Book" -> 2, "Theorem" -> 7] -> CompressedData["
1:eJydUEFyAzEIAwtk8NrJsdcmT8oT8oG+tT8q3ulMT+khOmBjIbC4Pb8eTxWR
7x3ewzUiPIm2E4g2M6WgtRakqmsKJehOoLlbRIO0IgSvh64Z2aPnmGPwmMcR
SWFOn9MdHvn2d1/g+JclZXuTOdda20p5+cXlsu8sgOdB7/vFzDZ9BnzegTtY
CZEnvAOJ8gLfNcX3s3bVoFLWbtCHNMpUNZqgllwSL9BcHeqmf4Bqjo9qK7uL
lUQhP6JhCA8=
       "], 
      Association["Book" -> 2, "Theorem" -> 8] -> CompressedData["
1:eJydkEFuAzEIRYEPfFtNPJ7FKOteKUfIBXLW3qh4qqqbtos8y19YHwzi/fG8
P1REPpa8CDmYHbZiEzV3DYGZMUI1tElIOtDcDfBM82Uv+fPPOWZPss/rnHHd
9q1tXdq2Y9+AINvr4/7O5V+39xIXud2OY7bMRnwzxopZoA47+eW6+6ln0px1
wXoQ7aQWhjdkAglfJvoq8aMaVYFAlV0sZagaTLUx6R5FemgkwvWHSh+XMUZI
VEs3p0I+AekaCFo=
       "], 
      Association["Book" -> 2, "Theorem" -> 9] -> CompressedData["
1:eJylUMkRwkAMsyXLJNABP1qihDRArXSEzDAMD+ABStZeK/Gl03Y5bxkR1zG/
4lh7ISNdY6y9OPy6mC4/TIksElRI0ekwkvxYktzBheAMQipnIKodwijUH+O+
hb5+BWam8QDLpj3EeHg439kGm5LjQw+jWZmUr7wP7SQzzbpj5azxgtlz/nx0
s56FAIOJbEe0GtZ3OvqYnPcJy16Ly8Jau7f1TcYNy9kFaA==
       "], 
      Association["Book" -> 2, "Theorem" -> 10] -> CompressedData["
1:eJylUNERQjEIgwRo6xau5AhvAWd1I0PP8/xQPzTtQZsCDZyP6+VwM7u1+Rm5
im6uGm3li02vmfDQomeS2mBappXrbE5+rEguqBCUQYxR0wcspq4QAvGP3Heo
r69Aa2oPMGRKItqjxYkrgcVMZK7NRLespnXkFq0kMcXYmNFtvEBloyMfv8Et
YKDR4aUbNQ3rmbHEiuz9hMZepygFKp7pqaHbHbzDBW8=
       "], 
      Association["Book" -> 2, "Theorem" -> 11] -> CompressedData["
1:eJylUUtORDEMS+I4aSseaPROwJXmCHOBOSs3wn07JGABXliqm4/rvj+e94eb
2cemv+P1ZcUAzCIjq2pmJVDdHgx4eiHidhwRgWSGFQkL/jiQOTQJbHbzOM4b
3pb1IYUA58C/7H6D/vV2re3JbM4x5KircDkRdBDrzYWStsg5t6Kyi6+iXa/7
3twX5lLrpTe4ZSx1k0OL3PUb4aR5WoYrNXckYu+gMqFBaad/wTnqPNuo2Dpq
qPUTT5gHcw==
       "], 
      Association["Book" -> 2, "Theorem" -> 12] -> CompressedData["
1:eJylUdsNAjEMS2yne7ASI9wCzMpG2D0kkBD84L7TxE3cy3G7Hl1V90x/YLGN
Ku6VizFK6pYbW0OOBNLWmiZQ7fENgJJRC+bwASXm5L3pzflfup+Yn7dMPcg6
g1S4p41JkRSSpKtyPy32lLj2lpGFDdlPz7AXw1bMirRD5sxj/8aSZfCrSLif
tmId4RTt7P7O4CtobXEcND0mrAcqpwSq
       "], 
      Association["Book" -> 2, "Theorem" -> 13] -> CompressedData["
1:eJylUcENAzEIAxtIKt0SXakj3AKd9TaqyfVTVe3nrASsQIgh9/352N3MjjZX
AGemmQfluWVlRM4JBMjiTHK7TXHUyDACsMTPchEIlZIFxqjmMDDEqmIGL8r9
wv+CaKW5ZKGVQY20pgX0FkTadbiPV/ydtHohuosz5+zujDEX7aWr6zWXHqVo
nBbuClmX1B+xMreRGrZO/QOBHlMnYnhNp70AIrAFww==
       "], 
      Association["Book" -> 2, "Theorem" -> 14] -> CompressedData["
1:eJylUMsNQjEMy99Nt2AlRmABZmUjnAcICQkuWGqSNq5j5XS5ni8qIrcJf8EO
BRU1M01TNa1MMbfUECAr2SnYAu/kmLh9VQNWWHg0uh1oeLdUoyiElch/7X5i
/exyuEhM3ns8FOKFtaYGEYiqrmKfL3nEeBC7eWJewA+DUVjkPxk9OVlsDrLZ
iwq3Jy5QVeduI9wzfe+ducXTfcUbNhY2jUk6Z2ukhd4BqdwHNw==
       "], 
      Association["Book" -> 3, "Theorem" -> 1] -> CompressedData["
1:eJylUNsRgCAMa9J+eG7hSo7AAs7qRvYB6nmePwYoj5YQsrRtbRCRPcI/JIPK
PAGonXqnUCGDnWeZL+yDTB3Ugm/ImmNU+rfcB/iZHe8BoSTidTHXjMkbXaqn
szR/il4ULxAZE+XRcKqMQddhxk6bDppq3VUblr2pLdr7yQF9XQOY
       "], 
      Association["Book" -> 3, "Theorem" -> 2] -> CompressedData["
1:eJylUNERglAMa9o8BJnClRyBBZzVjUxbuFM/+CEfvaMJL2ke2+u5wczeOS5i
vplx9rGuC+w+jRFkcNBDEC8LlwI+lZx24ukC/BfWOyCJ63H/DM9ZbwmZQXKi
k1SWnsrWiYO50fG55y5CssShUScA0bNIBML7MKnVloQGHZ0K1mcaH4+F+Xd9
3pVpXX75I+wDM5MEaQ==
       "], 
      Association["Book" -> 3, "Theorem" -> 3] -> CompressedData["
1:eJy1UMENAjEMs+P0uAFYgJUY4RZgVjYicY+TTkLigYjaqErs2M1te9w3Anh2
+jWuQF6kdR2AUoI0Rg4sWjQRATAkixFfNN/tONdIxif436NUw8qdy0SwA3Y0
vU3HZa8b7h8gdjd4fNoIzuzypHh+L0imyMxEmpE1zY+6Om9v36+1sR+8AJNx
A44=
       "], 
      Association["Book" -> 3, "Theorem" -> 4] -> CompressedData["
1:eJy1UMENwkAMs+Nc2zVYiRG6ALOyEYmPVqqExAeiu+gUx7Evt/1x3wng2ekH
kau0bQNQSpDGyIFFiyYcAEOyGPFF84DjWiMZn9r/HqUaVu5cJoIdsKPpbTou
ew0YP5vYaPD8tDs4s8uT4vm9IJkiMxNpRtY0P+rqur1jI62N98ELdBwDeQ==

       "], Association["Book" -> 3, "Theorem" -> 5] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 3, "Theorem" -> 6] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 3, "Theorem" -> 7] -> CompressedData["
1:eJytUMkNwzAMo6RKcR55dYKu1BGyQGbtRiXloEAf7SsEJOuyTPOxH8/dALzk
rsBqXlUIN7NgWqOWjARTGFh1QAYE/PeaEDL7YOKuHdmx2rlcRPeDP1z0XtJR
ItL3SFoonhc9Tr4QO3Lsj7OulfOrzZpdb9/QberhetlUnur0/BjGKa0vdszu
tZlwm0VO8pSeXyARdzuFbrwBClUEVA==
       "], 
      Association["Book" -> 3, "Theorem" -> 8] -> CompressedData["
1:eJytUNsNwkAMs53rlTVYiRG6ALOyEXHuWvUDEEL4pCS6OA/nut1vGwE8bP6C
iN4vZEukiVhDrbcmgGLZaAtyHKH3XWhMFxVz//XPp9Lf8IV+WZxXkWVQmtug
VnWL0URRi1b+IBVR3DlwooT5KtMnW0OmkMkc40Bc8p0GJSdcC74Axml9aTwB
pdQDwg==
       "], 
      Association["Book" -> 3, "Theorem" -> 9] -> CompressedData["
1:eJy9UNsNgDAI5E4/XMOVHKELOKsbyQGNNTHxx0gaKMf71rZvDWZ2SH0jC+DN
KE1pGTJjMnBwkjO/zOxh3jFU+e9CzY4lYos8DuEnXBu7hy6ZgwQjsa7CIOwV
xZN/LJkLJudqfo2LjEf2MDw7ARBVAyI=
       "], 
      Association["Book" -> 3, "Theorem" -> 10] -> CompressedData["
1:eJytUMERwzAIkwSLZKWMkAU6azcKWHbau17zin0GWzIg2I7XfhDAu81DKzMA
1YUSE1lOMlWOtREgmfxftGgO87XgY/Y5ubPgPUt/iWghbSe2tFifFSsaUUTj
ccVzPFZXPZGA7SBrNqKT1diwOsaY08zdBAzqR7HTgp/IEzliA1s=
       "], 
      Association["Book" -> 3, "Theorem" -> 11] -> CompressedData["
1:eJytUEESwkAIg5Cs4/gBr37JJ/QDvrU/MlAPXvTUbJsuDIWQx/Z6bhkRe9NZ
uN8QqwCSWtdVoug05hGqpkod/QJZcxoHk07SPdGN60S5g/8NyaNkrRZi9nbl
F/2pgws+1lxUZyh1XlPUqjlB4bON7wsabqsAwf9WD5IicoY6ZOTKi6eTmTkl
3r7d+Jbs+/hkUxsB2t03n0AFSQ==
       "], 
      Association["Book" -> 3, "Theorem" -> 12] -> CompressedData["
1:eJytUNEVwkAIgxDOp1u4kiN0AWd1IxPOD3/0q7k2PXgUQu7H83FkRLxMp+GG
WAWQ7HVd1Wwqi3kaVVPUjn6BrDnGZlJJqifcuM6Ua/xvSO6StSxErO1KL/yp
zQUdaS62M+x2vqfIqjlB4bON7gs9bKuAhv4tD+qOyBmqkJErL5pOZuaUaHu7
8S1Z9/FJphoByt03ePkFMA==
       "], 
      Association["Book" -> 3, "Theorem" -> 13] -> CompressedData["
1:eJytUMERgCAMaxKOPVzJEVzAWd1I2oinH1+WIy2UljTLtq8bIuJI+NHQNaw1
jQVRzP4IRiig+kyf9cMKHhbezv5LN7l90/ETKYkkXneTi/mZMWs+KgeE7nrU
YU7FksJYyaEN4Wat4VYgfWd3TMuYMd/6cPrnPCdT2wN9
       "], 
      Association["Book" -> 3, "Theorem" -> 14] -> CompressedData["
1:eJytUMERwjAMsyXbKceLBwOwEiN0AWZlI5QQjvYBr+qhc07SxdZtfdxXN7Nn
pwNxZWWRRDAQABO8nBeQnqwwAm6Jn/lQSvGRjVahBwwMTVGUeuy6ZvyrAm8L
hOikQzTFHD7ghnfIJHInkRvzEKc2f4PpSnNaJ3k5GjNU5sL0sYh/ke6B1ntR
b2h+aoq+AJ39BTE=
       "], 
      Association["Book" -> 3, "Theorem" -> 15] -> CompressedData["
1:eJytUNsRgDAIS4BT13AlR+gCzupGQqmeH/auH80Hj5LQHHs5j0IAV4SZUDOj
QyAUz0pZt0U0OlPECDb+KVuI7Dsx2+6wh9dLRYfRFfOjZHY/uhj73dRjUhh1
0uSRa1uU8Gc0O3FYmN8INyjsAyk=
       "], 
      Association["Book" -> 3, "Theorem" -> 16] -> CompressedData["
1:eJytkLERhTAMQ2UrR5GKFViJEViAWf9GX7JTUFHhS8zFxNZTjus+rwDwc/o0
9jmZSYayvqpIIoGZ3NMX5lu3eyJXhEInpLNPGvk17vvA0tNmm6Epmiuar3kX
MYtR1lXnulQOGPF0pbehc/3UquES2rZ+rIRNQ3UWwxijtQKl9gB0h8cTpefG
wB/gTQQu
       "], 
      Association["Book" -> 3, "Theorem" -> 17] -> CompressedData["
1:eJytUEESAjEIIwGdXhyPnv2ST9gP+FZ/ZCDreNLT0k7aAg2B+/Z8bIiIV8Ox
drmCZKFwYiaCSRBY57pVx1fk78+QETaOw6v0LBEdLvc/4dSjZdky1VupwRY4
KL05ofh42AcH4eaNkxRfz0yGMbldqAou5u10XUpjHDIGmDuTBa5OYOwjSnHp
+gbVdAQS
       "], 
      Association["Book" -> 3, "Theorem" -> 18] -> CompressedData["
1:eJytUMERwzAIQxL99pl/V8oIWaCzdqMKcNNX+4rskzHGQsfjeO4HIuJVdDE2
SBSEpJIBxwQzdWc9Z/D3XxjEoMswS75aFZfb/S/Y/Ti2BjZRRlgmh+1X/RSf
DOtgM+z55C6Kb4ZN0bXVKBPTbPaUO0hplGuYay4Lt2VyRiRrOXwDqMED5g==

       "], 
      Association["Book" -> 3, "Theorem" -> 19] -> CompressedData["
1:eJytUMsVwzAMEoj03BW6UkfIAp21G1USTnrLKdgPf8A21mv/vHdExLfpbiCT
grAxlSgCCW35ZKuKvDhaIIxxw021VJZye9ZrtWU6llFfo0R2SHPlzZHi2GEP
HEZlPnlM8d/hUIy3H5Lgx9xtr4kyfTMDXHVZeKyQLlHZulo/nKgD4w==
       "], 
      Association["Book" -> 3, "Theorem" -> 20] -> CompressedData["
1:eJytT8kNAjEM9DmOIyHYD39aooRtgFrpCHsBIR7wWsenZuJMLuvtujIR3Tvt
bueTsPFCB5rhbiLiQ+aUQcIs5L8fzczhwKialgkgQcjUclXDsrfU+IsClbRr
RMLMXd8W0b2XaR2Hd7RGs4btRapQr8E36kZXaHM2RjTSXeuo/VQDjMQowXoU
KnJ9G5+VXP5lKGV15QmasdIDpmQGpA==
       "], 
      Association["Book" -> 3, "Theorem" -> 21] -> CompressedData["
1:eJytUDESwzAIA0uAr70M2bv0S3lCPtC39kcVuQxd2ik6DLaMjU7P/bXtbmbv
TtfjYUZbfPV1ZnKMkdOXhdPg7pa/h0YECVA1JgUQpmNGpID71Urj7y2gVIqS
HiiQWdVKJAgtSEgcBVHN8FBacTZpJZTPHjRTSTZfnRQs/a9B7jQM733ICbeb
KPkh/9hPCWc528MvMEo+ySTZ5hke9gFq8AaS
       "], 
      Association["Book" -> 3, "Theorem" -> 22] -> CompressedData["
1:eJy1UNsNAkEI5DHccIm5DzuwJUu4BqzVjhwuGr/0SyfswDKwS7jst+vuZnYf
+gtg7Sc/ryQigotvG9oUu3V87KoqIBPytUJIpOnKKpJL8tdj1lc1U9Rm1P+V
siS7OdBFnALzcFmcDI5Ju55FOpTyqslDJTD5HpKhu2cOd1iGs2asKrfFwrUP
7Q/TinS0wnjDJekBqFGrqmDF4g9bKQa7
       "], 
      Association["Book" -> 3, "Theorem" -> 23] -> CompressedData["
1:eJy1UNsNgDAI5GiNJPy4gis5QhdwVjeyUE3rI/1S0vIoBwed07okENFm6g/R
qKykR8REsBOyj9Ap40ZyABRrt6S/nrO/f+WzSUzXQvdhBpYFXpqVRwc6FHcE
WkeEaXDKyaij8EicExLOL3usjzLCte0OVh0Dbw==
       "], 
      Association["Book" -> 3, "Theorem" -> 24] -> CompressedData["
1:eJy1ULENwzAMI0ULGTt06dqXckIeyK35KLJQ24WBdGoImBJoi5L83vZ1I4Cj
0i0oLz7xAEEzGoMhCpLJdVnFQNIUwwBWs3+P+duw9eMXgFglpWnergzfvB3c
9vkoGJTHnfFHNVtqaXF66oqGacv+vEFtNsSvqigidAJ/wQOf
       "], 
      Association["Book" -> 3, "Theorem" -> 25] -> CompressedData["
1:eJy1UNsNwkAMy8MehJUYoQswazfCDqJUquhX69NZed3FyWN5PZeMiNV0D5o6
qExUiRvoThuJ/01JUIU08CGEDEW6W97VKnGedboty1LgEb7Y23YPEcOquT0Y
TMSlU73lPVjVXFITB6MgCrUtC4EXU9D2kD9on7Mn5PyYYHa8AW+eBWc=
       "], 
      Association["Book" -> 3, "Theorem" -> 26] -> CompressedData["
1:eJy1UEEOgzAMsxMHrrvwgH1pT+ADe+t+RJpRKiFtJ7BUp3Ib1+lzfb9WAvg0
ugm24AGCZjTSDU6HZD7pZw8TRaeaBrDmdXXI//OTI1ZHDmZf6ZT3UIZvnQ7u
8+wKBtWKYP5R282tVcEo3fPBsuVxvcN7NmiSQlmhDWSRA5U=
       "], 
      Association["Book" -> 3, "Theorem" -> 27] -> CompressedData["
1:eJy1UMsNQjEMs5v/EwNwQmIlRmABZmUjkl44wQncNHVrS7F6vT9udwJ4Tvsb
zkiPUHLFoZlMkBQcn4dWpXumV0OqIrISubk08vLriPldHVmByVRp5i6iKjtK
NhdvSC/v2GrzYmbd1bZpUpd4X3xbxzR8PCY6Yp+mqnuQElS6YQlq0U4Cuo88
u9Fmdr2xyIij/xijKqfwApWFBss=
       "], 
      Association["Book" -> 3, "Theorem" -> 28] -> CompressedData["
1:eJy1kNERwzAIQ3mgxgNkgqzUEbJAZ+1GxZzjXHrX/kUfMgYJg7f99dwxs3en
28AKhjsOyOVhEtHiv+vr7EgH4bcO+xOXV3MZInKdXijOuk/pkacoLznz5NFh
ZigyjkaSmypsXSeZyhYMCUN/4vzJpcVjEZmJDzBcA2Q=
       "], 
      Association["Book" -> 3, "Theorem" -> 29] -> CompressedData["
1:eJy1UMENAjEMc9rEORBPFmAlRrgFmJWNiNuKe8ELrDRN5aS2ctsf990APJX+
B7sw081antupAlY1ts+iEd7D6wg+Affw8Qq//tphfGVLHCCQKTuk7Mzsb3sH
BhGDX026uIiFGssxzMWERigftRxYR5agI5rZJgvJqjgVG2nsdqAo/aclaXOm
wAvohAYY
       "], 
      Association["Book" -> 3, "Theorem" -> 30] -> CompressedData["
1:eJy1UNsRgCAMS0qqc7iSI7CAs7qRyEPhw/OL3LUNNH1ct3jskQDO200EkQdY
YqRZgMTg+qkZY+WpweRlPzBMZUPhJc8uWzWoIhKPb5r2U2RgayQZlOmazKT6
CnxPwn4HdjdZXO6exAgX8f4DNA==
       "], 
      Association["Book" -> 3, "Theorem" -> 31] -> CompressedData["
1:eJy1ULsVAjEMsyPLse9WuIaVGOEWYFY2QuFR0EAFKpzETvTJ5bxdTzez+yr/
xF7FMUbNcRyYBvdh+Vk0kxFApLBTAGFgdmZ3J7Zf++PXKaAizSr5gdyge9t6
QYfuglB4LuBcnSCrqqnteiXTDXVed1aUro4QQW9rWNmxoknIXVkllstWwlVh
SQ53/UpkhFFkeMMQI4sMQ8iA7+nTHpPZB0E=
       "], 
      Association["Book" -> 3, "Theorem" -> 32] -> CompressedData["
1:eJy1UEEOwjAMi+0wbeKE4AN8iSfsA7yVH81Zq2oc4LZItRI3cd081/drRUR8
Ck6N24OZvF+xzJjjIiGkfwPoJ45dzqlk5JlWfwSPhUSQwO5yR5sdTgfPgir8
3YFdYTAsqIpkm5ZTmK03BaQ7mmCTsywnL+IrukFfetETvKUNkrQD1A==
       "], 
      Association["Book" -> 3, "Theorem" -> 33] -> CompressedData["
1:eJy9ULkRwzAMw6OcipSpUmYlj+AFPKs3CinqHLtJl0ASxaNICIfXui0rAewZ
fosnKT3u7B0N3Sbob/2cB+cuxZaY999x+TNVDEReESdVR12oLmqUQvlhdOUZ
iyrnNRkcaYwMA5y8HoT1HKSxHPZ9IHsaRdyk3ujGN189A58=
       "], 
      Association["Book" -> 3, "Theorem" -> 34] -> CompressedData["
1:eJy1UMERwjAMsyT74MWLBViJEboAs7IRllt68IBfdRfFseTYyW153BdExNN0
MEjxesH5hIqSENQ/O7YVn66OOaVHDvoDXz0zCRKYKYd72Hyre54mIyeViP2j
19hMiz6RW4/sEOxUO+a1nAtbpgXvFRpwkK7U2lpUFZR4AVXRA6o=
       "], 
      Association["Book" -> 3, "Theorem" -> 35] -> CompressedData["
1:eJy1kMENAjEQA3dt74YfHxqgJUq4Bqj1OsKBk7h7wAssxYo0jmL5utxvS0bE
Ou3fupwBZLEZJWVIH6MliKQdUHeVrwGqJTVLP6+L73RiRnQDhSnSxVzuZZu4
84PGIMYBkbvwE25s+w1eR5GMaWROmj6ewsoZV75VmYJH9Ui2wjj56QMvfgTz

       "], 
      Association["Book" -> 3, "Theorem" -> 36] -> CompressedData["
1:eJy1kMENw1AIQ7ENzRA5ZaWMkAU6azcq/pHaJFJ7ai2EkDDiwbLd1w0R8XD6
u2YCTN4YUxWj6qNTAi2oJ+gEBKCSVEzp12jf74fb7GiMhgEGnTQKvHQoz+pr
URcD33lv7vIimieZAYUCmeoXcOxS700PFXRQmcU47WbHZM8TsVAENw==
       "], 
      Association["Book" -> 3, "Theorem" -> 37] -> CompressedData["
1:eJy9UEEOgzAMs51UgMQn9qU9gQ/srfwIhyIx0LTb5rRWGketm8fyei4EsBb9
HhK9mjBPo6DxSyNYgOCkqJdD8k79x+4VLCsoR3Xo9nrpreMzMom8NfDkLp6K
76yfO8LEaB6Bw7ODmVlvNsQO7UhLhzNnmYMaYwM1bQN5
       "], 
      Association["Book" -> 4, "Theorem" -> 1] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCLoB+hr20gCyCELAJZUAnw=
       "], 
      Association["Book" -> 4, "Theorem" -> 2] -> CompressedData["
1:eJy9UEEOgCAMW1uIHP2CX/IJfsC3+iM3IEQPetMmNGN0W9my7esGMzuC/sCM
MiFblmDUmxL92FXlMZW89EuPD7jNlAgSqC4ru9nhdOQZFFBN+afHolsczHiM
G9lnJA9BTzUFjbUhWzt/MuZQD6SoVBstppLhWzoBK08Dgg==
       "], 
      Association["Book" -> 4, "Theorem" -> 3] -> CompressedData["
1:eJy9UEEOwCAIo+D8h1/aE/zA3rofraCJ0WTLLlsjHLAFSqnHXiEip6dfgLwh
i6qC6ZHYQ7AWQemHG74Cd4iYi/dsDD8TuTtclIm3AS/kEyzcojn3TvxgwUQx
kILZRhlfqC/aVwL5
       "], 
      Association["Book" -> 4, "Theorem" -> 4] -> CompressedData["
1:eJy9ULENgDAMix1VYmBm5yVO6APcykfEaSlMTAgrcpo4ldOudd8qzOwQ/QNy
SrPlbcjdCZKRHUJURjhaSf98q3dVMpRjJRFxoZ2zlRsrc/TBPpQv6MyhtQ4e
JKNSIDPAPMIkyB79L2LMWtyYQ6frevqlbidr2wPF
       "], 
      Association["Book" -> 4, "Theorem" -> 5] -> CompressedData["
1:eJy9kMsNAkEMQ5N1vrPUgERLlEAD1EpHOCDECU6Id7AyTqR4crpczxcVkdvI
nzhaCVQ3qc9LI9wMsCC7EzgEHntE78uxfh3Kv3YBSk8sgmGtzu7MZDTqWIl+
tOzlUGskM2Jl0OrEY3SGns6i01OmJ6oquEiVfxVJZjIJKMNtEmabqj1vIZbg
+w1r9/I5UsBKD64pdwD4BpY=
       "], 
      Association["Book" -> 4, "Theorem" -> 6] -> CompressedData["
1:eJy9kMEVAkEIQ2ECgV2rsCVLsAFrtSPD7smDnnzmwLwHA/lwvT9udzez54R/
aZXBfRk/m5IZAQSlS0pIGJJNdjex/5opv1YBBXlWiQdD1r33wFFok5OIwvFy
MhEDvx0bSH00lko4P1Wz1NwMnpOKMavJyF27mm01WIQr6ljkcs8ZG2FZK7He
pFm6kyFk4zud9gK8vwZY
       "], 
      Association["Book" -> 4, "Theorem" -> 7] -> CompressedData["
1:eJy9UNsRgDAIC4EP13AlR3ABZ3Ujgfj60S/PtJe2kAu5jvMyzQZgLfoNUTTA
nxWWoAnsglbkMzw7X0d6N+x5VCzBnYwgK6Q483q3cFRYB5stM5/cIlwVNqG1
NSjCNExb8ryEu5wJK+dbwKEExP5Fnl553QAxzAOc
       "], 
      Association["Book" -> 4, "Theorem" -> 8] -> CompressedData["
1:eJy9UMkNwzAME0klmaMrdYQs0FmzUSXa9a99FSFgQgch0Xqcr+eJiLiabgQB
xPF9abVZmpZBGIgRU9qQ/zb0+//oNm1rQiL70U7tk8spVp2mSkQsnhNWhSZY
60VkUMhaqHBZPsVnZufcoWzIKC9A+kjqK2Vp4w1G6APr
       "], 
      Association["Book" -> 4, "Theorem" -> 9] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCLoDNjpb+XQAIyMjGAMZIEwGwOMxQiLJdyxxYbEBgCj
JAKa
       "], 
      Association["Book" -> 4, "Theorem" -> 10] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCLoCWTFGBlZpHFKMzIywmhGZhDBwAjmMYDFmRiY6eNK
VDdBnIMWUih8JiYwwtTKCNaOaSAGkxFhJCMTiMvIwMTICvI4E8xuRjS9YIsZ
gaqZgXZDLAJyuBk5ADnDAz4=
       "], 
      Association["Book" -> 4, "Theorem" -> 11] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCLoCnQYGRlNccoyMkEcBFTEyMwEBAxMQB6QYgSJQ3hU
BUQYyAhyDhMTWkgxIvNB7sTmOJAizBBmxMJkQjWSEeRXRmawCCgs4GphAlDA
BETAQANiaJAxiDDzAgBbEgNx
       "], 
      Association["Book" -> 4, "Theorem" -> 12] -> CompressedData["
1:eJy9UMkNgDAMi50vH0bgwUIdoQswKxsRA604isQDYfWyY7VuhjylDDObtfwL
AONzkSgmOAPGYLFB+sa+TfPKApAXJ45cOVvhZLq/gMaR5yuhv8JXRb2o3iLs
YIxoWkypdFrv3QIHRwMx
       "], 
      Association["Book" -> 4, "Theorem" -> 13] -> CompressedData["
1:eJzNT8ERgDAII8HryyVcyRG6gLO6kQRa/fnyYY4rJaSEbv3YO8zs1PEjuDtB
MrJDiMoIR5Xk14bvA9MPyrGSDmKi7knlxsq8eZRQW3vyQzN6xUgwtTJqDTID
zCNsfFdTpswqHqzR56Ln6Zd9uwA8hAOp
       "], 
      Association["Book" -> 4, "Theorem" -> 14] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGweAAjIyMYAxkgTArA4zFCIsl3LHFhMQGAJkUAo4=

       "], 
      Association["Book" -> 4, "Theorem" -> 15] -> CompressedData["
1:eJy9UNsNhDAMSxOnrZAY4lZiBBa4WdkIu0jwgbgvdFbk5tE2Tj7rd1mLmW2i
vwOPlYhAzQTPyIPS6DR6jYXpbSnxW6iU1iGLAgiIagWoECdy8D3DFEe5wjMz
fuKgKtIYHjrcwi2hDYWXou5sWySEFxOO5GO/ELTQtqzPvc09FewpGAWI
       "], 
      Association["Book" -> 4, "Theorem" -> 16] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGEWBkgjiIEQiYmYCAgQnIA1KMIHEIj6qACAMZQc5h
YkILKUZkPsid2BwHUoQZwoxYmEyoRjKC/MrIDBYBhQVcLUwACpiACBhoQAwN
MgZeZm4A0RoDBA==
       "], 
      Association["Book" -> 5, "Theorem" -> 1] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGBgBtZmQEU6OAugAAmjwCfg==
       "], 
      Association["Book" -> 5, "Theorem" -> 2] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGAjAyMgyU1cMeAACY8gJ9
       "], 
      Association["Book" -> 5, "Theorem" -> 3] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGBDAxoFsN5TOCAJzPCIKMaEroAhjpax1ORzCg+J6R
EYpgYoxwBtzBAKqrApo=
       "], 
      Association["Book" -> 5, "Theorem" -> 4] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGCKBbzQQRYQIBCB9MACETIxQw0dF5QEuh9gKdSi5g
ACO88jAWdkfAwgEUXGAOODjgIcIACSqQJFAMpAIoCAD6HgM0
       "], 
      Association["Book" -> 5, "Theorem" -> 5] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGDDACbR9QBwxPAACY4gJ9
       "], 
      Association["Book" -> 5, "Theorem" -> 6] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 5, "Theorem" -> 7] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 5, "Theorem" -> 8] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGDjAxMoAgCDAykukURqhuRjCTEWweI1iAEWomVBIF
MEAxRAuacYxQxsAGDjKAuZEoFwEAzJcCtg==
       "], 
      Association["Book" -> 5, "Theorem" -> 9] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGEDAyoDmAkRFKARlggiBgAhNAAGYCKTCDCQqQJFEA
AxRDtKAZx4TFGnoDVJ8zMTGCxZjQJcB8RhQeAwD8LwLx
       "], 
      Association["Book" -> 5, "Theorem" -> 10] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGFDAyMDKCSAoMABGMUFOAFCPEUAhAkkQBDFAM0YJm
3IAHCgaAuZEolwEAxeQCsA==
       "], 
      Association["Book" -> 5, "Theorem" -> 11] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGOcDtRKAMIyNWFXARRkawCoi6EQIAoP4Chg==
       "], 
      Association["Book" -> 5, "Theorem" -> 12] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGGjBiZRJSOgqgACNMAJitAn4=
       "], 
      Association["Book" -> 5, "Theorem" -> 13] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 5, "Theorem" -> 14] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGGjAyM7AwMzICncKMKclEf/dAACMYDQ2AGkgArtoC
kQ==
       "], 
      Association["Book" -> 5, "Theorem" -> 15] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGHDCCERAzMjNiAKA4AjDT01VgN6G5AEkWLs2IJgLT
ywCVYWSEsWBsFBEoRLcdTsINgRiKrhIYJlAnMkDNAQDfGALi
       "], 
      Association["Book" -> 5, "Theorem" -> 16] -> CompressedData["
1:eJzNj8sNgDAMQ218YA5WYoQuwKxsRJImrVDviBysfJ5k52jX2QjgdvlBaTch
Ni4lgiPkp2klc5YsgLXvUHF1jaGWPSdZYhASQlH5R26m2fIZh2bX+XLiwHxU
rhjwA/bZAw8=
       "], 
      Association["Book" -> 5, "Theorem" -> 17] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGAWCEuGOQuGaIAwCYwQJ9
       "], 
      Association["Book" -> 5, "Theorem" -> 18] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGA2ABEcxYXMMIIRgZYexRAA4HRjBAEkAAAKXMAo8=

       "], 
      Association["Book" -> 5, "Theorem" -> 19] -> CompressedData["
1:eJzNj8sNgDAMQ22isAcrdYQuwKxsRL5SJcQV4UPquD68HPMckwAuH38RsRsO
Xbm6tnDsxodSNRa16VZE2Kpff2PtXBJ7gVdUCd2qOyJZz5F3DnaDz7BsQnR6
A+3xAwc=
       "], 
      Association["Book" -> 5, "Theorem" -> 20] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGB2BixS1FR2fAAQsYDQJAsu8BpmcCjQ==
       "], 
      Association["Book" -> 5, "Theorem" -> 21] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGCRhETgEBxkHnImIBAJecAn0=
       "], 
      Association["Book" -> 5, "Theorem" -> 22] -> CompressedData["
1:eJzNkFEOgCAIhmEw5zW6UkfwAp21G/XzQ1s110NPfSIiiArL2NahIrKH+g0W
ygMaqTCaFT5JYgUKaGKhocUleENqZsrjOjq6Wc9X8bGvCOU1flpTouDGEtAc
bBStcTdIZWWrIghfnIDzAMxHBKU=
       "], 
      Association["Book" -> 5, "Theorem" -> 23] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGF2ACATADRDAyAzETIxQw0dMhjIxgAkQBbWaCOAJJ
Fi4NFQQ6E0kcIgQ3iBHJUEZUETDE4TOQIqgUEy41EGVgeyEqANTtAtc=
       "], 
      Association["Book" -> 5, "Theorem" -> 24] -> CompressedData["
1:eJzNULENgDAMi2uHP3iJE/oAt/IRTdIOSMwID5bjWJWbvZ9Hh5ldQf+CB1IU
0UiXOwD5hz3atjUFaKYHyOgiFQvTRjrFTAmZYBldKZSfDmM5ofcS8WGWVA1x
jgEljwFrOS4UifHWDZerBEk=
       "], 
      Association["Book" -> 5, "Theorem" -> 25] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGEcB0DhMOcdoDFhYGRkYWIAliMjExMUIAXBbEAgtA
hRiZwAxGJLeyMEAVoXkAKoIkyETQOagqGFGYjCjGAwDJtALT
       "], 
      Association["Book" -> 6, "Theorem" -> 1] -> CompressedData["
1:eJzNTkuuAkEIBKuhYXq6x6TNy5vE8bfwQB7BC3hWbyTj5u1cuXi1qAAFVZzv
j9udiei50n/CqW/HWrfLflmGw2We20+h8XeWeScCr/7tvPJZXWUlmqber6NZ
cZEUAFCrBbubw8W9RDWsE1EJfhMwNUGDR+cY3rAiGGEmsHWpASVOVHsEpbSh
DUONktKRWTQsTIuriuYwzixZPPEfwNx3rfdM2TLis8KgFwDKCSk=
       "], 
      Association["Book" -> 6, "Theorem" -> 2] -> CompressedData["
1:eJzNjTsOwjAQRNfO/rJeQ4IV8ZFAQkJQcRmOkAtwVm6EDQUdFQWvmGJmNHOc
77c5AMCjyV+xXQ/jOF1O5/Nyv5umsnHIu5Me1sxoY//rO/+etlgAUirlkFXd
iPCNe1/VrDc0MnMzleYQt4a8a9kJHa1ahumFpuaoEmorOWJCEpFSjxAjxNAx
AxJcQyDupFNa9cxSZ1k4iHKm+KGLMefFMAho1nqOJWJ4AmA7CbY=
       "], 
      Association["Book" -> 6, "Theorem" -> 3] -> CompressedData["
1:eJzNUMkRwkAM8yE7dkiYtEBLlJAGqJWO0JIBXvx4oMfaI40ley/77bqriNzH
81+YIqLOp3Ut1swJYlOhygwI/Dpt/i6pCqCqJpJZlUa0vbAs7L0Jb6+x4NyD
QYS7Z7A9SliTaT/GCoN5IzhgSPozEGDUOFPEZX6qQpfyTPiwHpql6QdcEhs/
zLSTeeqtLg+BNQXR
       "], 
      Association["Book" -> 6, "Theorem" -> 4] -> CompressedData["
1:eJzNjTsOAjEMRPOzHSfZxChaEAUFEohPxVk4wl6As3IjvOnpKHiSRx6PLR+X
13Oxxpj3Kn/G9XZ+lFrKqdaU7mU2+7ZRmEku9Otn8WvirHW9a1N0K6bUdrV2
oQEAxCiqooBQa701dTrhnFWHqGECBsmkkQxAbwlq1YK8huo4ckz6yHtvnPNM
xqGZnAX0zSMirBUQtsHilCGhDsEPLAZOmTmb7WGGIIAu2A996Apf
       "], 
      Association["Book" -> 6, "Theorem" -> 5] -> CompressedData["
1:eJzNjTkOAjEMRbN4SWwcMkzBUiAhUdFRcBGOMBfgrNwIZ3o6Cl7xZX/b35fl
9VxiCOE95N943O5Va722pnqXOZzqxtQNnm/061/fA1OMaZq82PoWMevBbNeL
w4yItXbX7mBns8nMu+GrutoQRC0FC3ZlH7UVIEJCs4KGbQy9q6LC42OCkFOu
HBIGyREotUx+AAAeRg0jNAEhNzGvRILCIlTC8bwH8vgE8QMqcQnz
       "], 
      Association["Book" -> 6, "Theorem" -> 6] -> CompressedData["
1:eJzNjbkNAlEMRP8xtv9prT5arUAQIFIi6IQSaIBa6Qjv5mQETPB8aub8fD2e
3jn3XvF3ut5qr/UyTa3d2+KOVVvvKfG4ya+jvhsG78M8W9PtS1LqB9XdYBZh
BlDKMA4TBqvOqiLrpvRubCusZEbG6IyBaRMLIFBlKNp6tKnkkpMFxRhdCDGL
C+Ra8CRRIzOTBRIRK3m0gsK2pLjJm0WuVYrbnxaQggP8ByvOChE=
       "], 
      Association["Book" -> 6, "Theorem" -> 7] -> CompressedData["
1:eJzNTTFuAzEMsyVRsmX5jByKogWyFOjQ5Dl5Qj6Qt/ZH1WXv1qEERIgiIX7c
H7d7LaV8H/T/cPVw/1xrjMt4KWefM6J3O33pXzf9/pBqpX3PZcuUmsV7xGmZ
tWYGoPeVvBJYNuc+Z6q8NPfkeRDgGTUsz8F8QhRQRBgC8zBTNR9uWcTMhYhV
CkkZVGG8sbAIQOlJcKVwasqq4CeqSrPuauXt/CrYREjqD5fACUM=
       "], 
      Association["Book" -> 6, "Theorem" -> 8] -> CompressedData["
1:eJzNjksKAjEQRDtJdU/+kzAMggjiTtfexCN4Ac/qjexk786Fr6HoD3TV5fl6
PA0RvYf8IbnmfOu9lHve6JTXUqv3sl+XXxt9f2iNsb2PLERBqcdatyYTADE2
1aagSSkaVWRsQkqqU4AcBAEtCRrqhD3gtRdUpHHUKXgtNXLOkbUuLGSFijW8
uNWpG6shM0thgxwRRZfsJkYQfcoh0OG8A4XFwnwAo+QJgg==
       "], 
      Association["Book" -> 6, "Theorem" -> 9] -> CompressedData["
1:eJzNjcEKwjAQRHc7m23SJG21QhAr4qE3v8ZP6A/4rf6Rm4J48+TBNzBsMgNz
XR/3lYnoWe0fWW63vpRj2Z0TxXnWSwFcGMKvd/LXNCUztVaeplNU7Tze9H29
vQGTT3a09UdEzFW2UuyADt4eHmFDI5CgahWThYgQVZ1sSKShhrltSRwtzJBG
2Os+WO5MrbJ6l4U/gDkPaRwdheTr+IFBL1dJCHM=
       "], 
      Association["Book" -> 6, "Theorem" -> 10] -> CompressedData["
1:eJzNTVsKwkAMTJpHs+7W0m2xIAVR6IU8Qi/gWb2RSUH888sPZyCZZAbmuj3u
GwLAM8ZfYl2HeZ6mac5QTotdFhFKNf26Jn93w9bY43guqgejN0oJbQ5yWnbR
xoeZfSrvoa54kMwPo7RDc3xUPeJ0SZlYVUcvYm6gQTeBBW6IosRkMiT3xdkq
qnLH+AEh9v2xVgHrLLorErwAEM8IPw==
       "], 
      Association["Book" -> 6, "Theorem" -> 11] -> CompressedData["
1:eJzNjcEKwjAQRHc7m+2mSdraHoJYKIJXf8ZP8Af8Vv/IbUG8efLgGxg2mYE5
3x+3OxPRc7P/5NrXeqyHNVNaFl0rEOIQf71SvqY5u6m3yjyfkmpneNP3220O
XJbNYrv9iIi7yl5KHdDB/GGIO5qADFWvuDxEgqjq7EMiDTXMbUsS6MIMaYRN
p+h5cLXKaqEIfwBzGfI4BorZtvGJQS/3UAgj
       "], 
      Association["Book" -> 6, "Theorem" -> 12] -> CompressedData["
1:eJzNjVEKwjAQRHc7m03SJG01H0G0iOCJPIIX8KzeyE1B/PPLD9/AsMkMzOX+
uN2ZiJ7d/pSptUPbrZnSetJzA1yc469Hytc0ZzO1Vqn1mFTHgDfT1O9gwBSy
Hb7/iIi5ylZKIzAi2CMgbmgCMlStYrIQCaKq1YZEBhqYvSdxdGWGDMJBa7Tc
mbyyBleEP4C5zHlZHMUc+vieQS++jwf1
       "], 
      Association["Book" -> 6, "Theorem" -> 13] -> CompressedData["
1:eJzNTjtuQzEMk21JFG28jx/SZO6VeoRcoGftjSojc7YO5UBKIgTy8/n99Swi
8rPov6L3iJueMhEP4LrO+2x/naHvrVJkjNSHiLsqNoBxHOc+BjuBSI5Ej50c
JGxd3D35YxEZdtAYvjP4gmOj59tB8L7MVJ/XfPVoUksNz0CJnLTM2oy1taZm
SpN2DQ0okIUWCswiN5ftNgxboPbyC7uGCKo=
       "], 
      Association["Book" -> 6, "Theorem" -> 14] -> CompressedData["
1:eJzNTTkSwkAM8yKfG0LJQDpghgfxhHyAt/IjFDIUNFQUyB75kI/TfL/NTUQe
C/0trpdzP5b0abTt1gwx+q9f5Fe1imR0633s7hV4IyJWDtCiItKWjqqSTV9D
lUAiWARyxUAR7oAzUsQANd7nIy5IU3aZyU6bp/LQxjRTo8ytFtW0fWCfOR1M
3BwKHRvkCRlTBv0=
       "], 
      Association["Book" -> 6, "Theorem" -> 15] -> CompressedData["
1:eJzNTkkOwkAMyxBnmdDhBELqpeJLPKEf4K38CLcSBy6cOGCNPEmcOLmtj/va
ROS50f9iWWouOc7DpslMc/ivN8RXtYpkfFY1yr2HvuG+xUFoaGZlhm0VAGTD
3lRdtWswCe07oihymAb8KWopjP5cxAFpYJWRnNA8QaODIRPRza03OAztA+fM
68V4oikUo6m8AOuyBtY=
       "], 
      Association["Book" -> 6, "Theorem" -> 16] -> CompressedData["
1:eJzNjjEOQkEIRNllYAG/25qvlbW38Qj/Ap7VGwlqY2Nl4SOZEJgwnLfbdWtE
dC/5Yy7jGOSrsxmzWODXAbuv24gUITKb013V84s3EdVbwlkWZq41AZAqeJrK
n1uU+ouFWVm1VGrMC0NVZwb1DurIC9SZAo3rCNI2BoaLijcMCNoHh7DTKpSW
Ct83pgfKqgbU
       "], 
      Association["Book" -> 6, "Theorem" -> 17] -> CompressedData["
1:eJzNjrsNgjEMhJ34kXMeNSAkJFZihH8BZmUj7HQ0VBRc8flkO+fcj+fjKET0
Svyz2hk0Ts4AM7cuv873r1MgoERmc/ow6xBW5RT2j5AFu6BbdlRkcy95dGIq
ybHVV/rWmI01LS9Wi/w4VKtQzYdB6lI4vVSNdWmupl5EI7p86DJxuxqZIM+u
IvQGjjUGsQ==
       "], 
      Association["Book" -> 6, "Theorem" -> 18] -> CompressedData["
1:eJzNjTtuAzEMRClpOPqsREOh4y5B3OU8OYIv4LP6RtZu7y6FH8AHfoDh9Xb/
uwUReex6a/qn/HQbY5TCj9/83/GvA2MI0X01JlJKrfY958XJnEkArfmyL+A0
czNy32xmy2MX0CtR4UY4TgdagII5iYmxH9e0ta3V9SilJDGmmiWqjBg0p5lI
qipWcWhAb+hcS00HgWhl67XK5esM7WBEeALe/AkU
       "], 
      Association["Book" -> 6, "Theorem" -> 19] -> CompressedData["
1:eJzNjc0NQjEMg/vjuGnySh8jIDERI7wFmJWNSOHAjRMHvkiWlVjO5bjfjpxS
eiz5b2bar8beAc4Tf93ev/+e70hr7jZUNwNEsGgtPCyAwX26O9eGZGjnK7Q8
YaGG7YWOaIAqoDFxxACbNo9HtdaUUYFwaUMGK9asuIiwSYarSPmQS5nkeWey
6qiKUZCf5fMHOg==
       "], 
      Association["Book" -> 6, "Theorem" -> 20] -> CompressedData["
1:eJzNjc0NwjAMRu04duzGEWkPSBUSpTN0E0boAszKRri9c+PAk/XkH+nzur+e
OwLA+9C/c2+tVDer82y/zv4emBDTtkXjAKqt1av72FWnSVVEhqGHeyBdl2Vb
llKOjdYaPiViRaVIrypd1pMYo9xVXLzwOamZtXhElAATMgMxXBLmTIqaJTMT
B8pIjxs3IRGmkziaaS0OI1fOQy6U8QM+IAni
       "], 
      Association["Book" -> 6, "Theorem" -> 21] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGLAA6npERQmPKQBmMYBUQdSMEAACfzQKF
       "], 
      Association["Book" -> 6, "Theorem" -> 22] -> CompressedData["
1:eJzNjUsOwjAMRPObdJyW0kBpVQJILLkOR+gFOCs3wumeHQsszdPII4/v6+u5
WmPMu+Lvp9s3u47k9ZJ+Xc2vibPWzbMaFdn3/WNZbkUkJREAImdlKeeCIjnP
OUfUzTCdlGMFkBpBgzIJCsZtQD3Fsqgw1hAMQz7kXh9574x1NgTjo8nV+GTp
GdRoWSSsPw4YCbLZ+hEYU8uOO5MxILQhergPO5oKIw==
       "], 
      Association["Book" -> 6, "Theorem" -> 23] -> CompressedData["
1:eJzNTssRAlEIg0eAx/u46tiALVnCNmCtdiS7Ox69eTDMZCBkAvf1+ViZiF4b
/T9mjAFYP9uvk+PrpjCXOZnZiVrrvQ33HvggIg4OZMWMaLopqprsupuq5deI
HAJtR52b4p6WrGwxoZ7JeVFEiCFAdnQCw8QFEqgVbuZhjOZqUkTKARa5drtd
jBYbkIpRwG+WxAcG
       "], 
      Association["Book" -> 6, "Theorem" -> 24] -> CompressedData["
1:eJzVjTEOAjEMBB3Hdhyfj3CHFIkGiZqWl/CE+wBv5Uc4R01HwxSrlb3avW7P
x5YA4DXkD1juN1Vpvf66+HshpoTuYRSgtd7PzezgHBCFqA7vATubudlcfL/r
rjRy8Yyv69Dpw6p8ijyzcWFyppW1Wu0xRISxmkWAGC4psWTNymsVKdEpzqkM
g4IYqUFGavNhWQpMx4lYyZDTG3gKCIs=
       "], 
      Association["Book" -> 6, "Theorem" -> 25] -> CompressedData["
1:eJzVjdENAjEMQ5OLnbY56G2AhPhjHEa4BZiVjUiL+OWLn3Ol18hO3ev+fOwq
Iq+BI4i83S+l2L97fxSqCpB3iKwrQAKwiFZa61vPOWmpzRoyo32dZCDRuyM6
0mrd8sFcsuFg+qeBbucsDXy+zGPLIgpxVcvRTDkCdxpddK0IVrLUKWOBzQSV
4XSWzd/inwdp
       "], 
      Association["Book" -> 6, "Theorem" -> 26] -> CompressedData["
1:eJzVjbsRwkAQQ+8neW+9cHh8w4xDaqATSnAD1EpH7JmYjIQXKFhppdv+fOwx
hPAa8hfc64TLtf66Vr46KcZk9om01vvWVM8GpxSSIuZqDg2qpnqa7LjLoSAB
M9B9GTof2FrZqQoqJw8QK6Vq7T5USvLVTIaCsMUIZsmCpXJ0eQfi+EFiSp4a
5FSWdm7LFOY2A1I0l/gGOscIZA==
       "], 
      Association["Book" -> 6, "Theorem" -> 27] -> CompressedData["
1:eJzVjcERQjEIRIGwEPK/yegY77ZkCTZgrXYk0bM3L77DMrMsy/X+uN2ZiJ5L
/oNaLY7x89avG2GWutaFqPc5Z+y7hyeqAGqN1EiwzOo+3k5rDWtkRjUvsCOa
I/Jg0eeGCz6+uqac0cxt5iNVya/FjBR0YoYVLRVbmFlWWoCtMCAmkqlFER2H
PkaQb64oaqL8Ar0pB5k=
       "], 
      Association["Book" -> 6, "Theorem" -> 28] -> CompressedData["
1:eJzVTsEJw0AMsyPZd7lQCDS5exc6ROfoCFmgs3aj+lJofn31UxmEkI3ky/a4
byoiz05/gtvVHb8O/RI4DEKqahEZR9IYQM7JU2qtAngzGjLAEIaPX1hbXVdj
bgwrN+ynHeNqC3d/ypV1wRmlFO6VGoPoVYqrIiSg1hfwyHfRKWGyA2rWe+li
tDletNNsLx4lB/o=
       "], 
      Association["Book" -> 6, "Theorem" -> 29] -> CompressedData["
1:eJzVTcERAkEIg0vAY0/H5WEB9mAllnANWKsdyd7N6M+XHwMTmMAk1/VxX1VE
noP+BTd3/Nrzi6GqkDWbSARpLCDi4BGZHcDOSATAWgxvvbGod2MkS4rE9jrQ
0jo3fRmUOKO1xj2yCtMkSnFV1AqojQNnwmbRZcbRPkB15dKFbienWbvwBaQn
Bz0=
       "], 
      Association["Book" -> 6, "Theorem" -> 30] -> CompressedData["
1:eJzVTkkOAkEIhIGCHnpMekm8+yWfMB/wrf5IZmLiyZsXC8JWbLf9cd+ZiJ6H
+Rus0F+vtO8UMwHpa2p1NwNUI4qv65zDXedsmugaqkmJviu9t3qErUFiysip
rmdrNsk2MYBIux1kl5yP8PMgU4oIkRIyWWhhWSzrrAWCQlyLFP2AAcm7ZuSK
S35occULoOAHag==
       "], 
      Association["Book" -> 6, "Theorem" -> 31] -> CompressedData["
1:eJzVj8sNAkEMQ/NzPCOxRdASJWwD1EpHOMuZGxd8iEa286K5n8/H6Wb2mvE/
WvlrIr5HEUa6uyoAiequnFeNgNRMSWaVTKVyCFzpVYpAxSwpwkeCqNqiTSlm
jd1NHcx0C31ymbe1uzYPC24XRl1uGHg0kzrKS2IL2FR/3RIbjPI3vUoFtQ==

       "], 
      Association["Book" -> 6, "Theorem" -> 32] -> CompressedData["
1:eJzVjc0NwjAMRp34s52kaRMFUZUjKzECCzArG+H0zo0LT/KTf2T7/nw9noGI
3lN/RP71Qfs6iSHEffekEYnU2o7er8MsJTMAKQ33cDC0tb01s9kp6+quU8CS
DRljVQxcTtR3Db17YJtDr0ouufoj5kghMkAsNGJQxcaJDSIsziaBa5GirCp8
AkXOZbGFbvWAbLCI8AHuyAhD
       "], 
      Association["Book" -> 6, "Theorem" -> 33] -> CompressedData["
1:eJzljdsNw1AIQ3FsE27UJbpSR8gCmbUbFfLTvy5QI/HyQTzP63UiIt6T/ln1
2x1bEftetcrOJCWO1uqe2WJHVqY8G9ud5Rs6VoPMHvJGB5p+GFNjdrWk+5EQ
ENKxMY4NfjAwN5I1UKNEEl9tQK2jSjGUQIPxAbjoBWY=
       "], 
      Association["Book" -> 7, "Theorem" -> 1] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGFGBkRHcySIARLAxmMEC5EADVgiIG5oMxAyMDI6aB
GKYPMQAAxXUCrg==
       "], 
      Association["Book" -> 7, "Theorem" -> 2] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGFmBiwiLAxAQmwRwoFwIgCphQxMB8MGYEQiZUAxmh
GMJhhFjHCBYlAzCAEV55KIMywIggGBkBJVEDRw==
       "], 
      Association["Book" -> 7, "Theorem" -> 3] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGGGBC5zOBMJhkYGBkZAB6iZGJiREMwGpBDAiJAAxg
zMDECNGJGzAOvRACANOEAr0=
       "], 
      Association["Book" -> 7, "Theorem" -> 4] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGOGBmBmEwycDAyMgA9BIjMzMjGDCDFIAYEBIBGMCY
gZkJohM3YBx6IQQA4HUCyw==
       "], 
      Association["Book" -> 7, "Theorem" -> 5] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGHGBEB2ARoDhIClMWG2BAplHMBpuBbBedPUc5AADe
BQLH
       "], 
      Association["Book" -> 7, "Theorem" -> 6] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGHmAEOntoupweAACYNwJ9
       "], 
      Association["Book" -> 7, "Theorem" -> 7] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGIGACo1GAFQAAmVACfg==
       "], 
      Association["Book" -> 7, "Theorem" -> 8] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGJhi6Lqc1AACV/AJ7
       "], 
      Association["Book" -> 7, "Theorem" -> 9] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGKAA6nRGr87GLYjVgWAIAmv0CgA==
       "], 
      Association["Book" -> 7, "Theorem" -> 10] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGKmBhAWJGIGBhAJEMDCDECCGRAAucQAIMUAzhIAGI
GTAOI4QDJUgHDEi24JDHdAIZgBFBMDICABMDAzE=
       "], 
      Association["Book" -> 7, "Theorem" -> 11] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGASpgYhpoF1AGAJfgAn4=
       "], 
      Association["Book" -> 7, "Theorem" -> 12] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGLmBmJl4UizpqOmUQAQCj6AKJ
       "], 
      Association["Book" -> 7, "Theorem" -> 13] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGLGAFIiYgYGUAkQwMIMQEIZEAK5xAAgxQDOEgAYgZ
MA4ThAMlSAcMSLbgkMd0AhmACUEwMQEAeRAD0Q==
       "], 
      Association["Book" -> 7, "Theorem" -> 14] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGMmBjZmJkZGNgBAIGBhBiBJPMjFDAzMzMAaJZmRhR
AAMUg0lkAyFmwDhQWShBOmCA2YJbHsqgDDAiCEZGAB46Az0=
       "], 
      Association["Book" -> 7, "Theorem" -> 15] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGMGBhYMT0AQsLgs0Ik8fiURZMoWEBAKp7ApA=
       "], 
      Association["Book" -> 7, "Theorem" -> 16] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGNGBiYgD6gREIGCAMBkYWFjAfDJhA8iBJJkYUABJk
YYCqw2k4WB2EZmBgJANATMArT8AJxAFGBMHICAAFjgMk
       "], 
      Association["Book" -> 7, "Theorem" -> 17] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGNGBlZmTkYGAEAgYGEGIEk0yMUMDKysoGotmYGVEA
AxSDSWTzIGbAOFBZKEE6YIDZglseyqAMMCIIRkYAIcIDQQ==
       "], 
      Association["Book" -> 7, "Theorem" -> 18] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGNuBgZGJlYAICBgYQYgKTLExQwMnJyQyimTmYUAAD
FDMCIROycUAOIzxQgCywLFiEkQzAAEZ45aEMygAjgmBkBABXpgN8
       "], 
      Association["Book" -> 7, "Theorem" -> 19] -> CompressedData["
1:eJzVUIENgCAMa7c4lC98yRN8wFv9yG2IRr3AEjpWuiYwr9uyEsAe9HOQRHUA
E1BHFJEiZqWYQ1UZlTparA74dtM0kDF5wzxwuKOh6lUkycGOdk5JXkpykyIh
G7nG4nxyXGaX+a9XPTofSO3rCyObm+0zeAAetARy
       "], 
      Association["Book" -> 7, "Theorem" -> 20] -> CompressedData["
1:eJzVjMENgEAIBBduQ64NW7KEa8Ba7UgWNfczMb6cwAAhsIxtHQZgl/5OByJJ
AyRoxggWob2a0xONxhbsdTlpmX4P7npYVXoPKh73V/MNn3I/AMzbBC8=
       "], 
      Association["Book" -> 7, "Theorem" -> 21] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGAWAHAiDJwMDBysDKyMjKyc4KBuwgSRADQiIAiMvI
ysLOysXJzo5sEgsQwwOFmZkBpBVEgwjSAQMY4ZWHMigDzAiCmRkAJvUEww==

       "], 
      Association["Book" -> 7, "Theorem" -> 22] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGAWAEAiAJYoBJJkYoYGZmZgPRbHARCGCAYjCJYhID
kgBMFkqQDhhgtuCWhzIo9D+CYGQEAAf3Ayg=
       "], 
      Association["Book" -> 7, "Theorem" -> 23] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGAWAEAjQBhCCUZgRBGGCAYCwmQfEQBgCyEQKa
       "], 
      Association["Book" -> 7, "Theorem" -> 24] -> CompressedData["
1:eJzdUdsJgDAMvMZaGywW6l//XMkRuoCzupFNteADBH+9hEtyIYGQKS1zUgBW
oT8gxggEwDv0RNx5tpaZPRGpHJmagcUqIBrb4LQex/MiC1BbC61hzB6FvgPF
X/tHcsf1MfmKotHzciVe5oVU3rUBYSQGUA==
       "], 
      Association["Book" -> 7, "Theorem" -> 25] -> CompressedData["
1:eJzdUcENwjAQc+6sJgUKUis+/bESI3QBZmUj4itBpWyAEzm2o1xyym153JcE
4Cn6C8xXYAQuJxzN+jyUwGBmScL8XDQaoKzk8UBO07ZOBozNuKPr6koGVXjD
qiPiLgleI1UIw88x6TdrMxy5bwffH1O7iMx+G0+aHq+Vq3e8AAkOBdU=
       "], 
      Association["Book" -> 7, "Theorem" -> 26] -> CompressedData["
1:eJzdUYkNwyAQM3dWoE9SKVEH6EodIQt01mxUfClRmm5Qg4xtxMGJx/x6zgnA
IvoP3IERuF1xMTvlvgR6M0sS5kPRaICyksczOU37MhkwNuOOrqsrGVThDauO
iIckeI1UIQy3Y9If1mY48tgNvj+mdhGZ/fadND1eK1fveAPt4QW7
       "], 
      Association["Book" -> 7, "Theorem" -> 27] -> CompressedData["
1:eJzdUYkNwyAMPBwrJsJqJTJBVuoILNBZu1FtU6o8G+RAZ99ZGCy29n61BODj
dBtU4KkoRItUzVlVKxEli0rTQ30NwD0ttTCv675HASgPIYJ5tsgcZJCBnofF
Jye4W94hBP+Pef5jL4aK/kccP8amCI+uQyffEq91ZXd8AasvBwk=
       "], 
      Association["Book" -> 7, "Theorem" -> 28] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 7, "Theorem" -> 29] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 7, "Theorem" -> 30] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGE+DjZuBiYuJk52YHA24mJiZGEIOJmZcdBGGAASTG
zibJycIiKopsABsDAxMzjMPMzMDKCqFBBOmAAYzwykMZ6AA1YoC+AIsxYfqY
EYTA+kEEI9AsAJaGBU8=
       "], 
      Association["Book" -> 7, "Theorem" -> 31] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGESDPO8MsEACWtgJ8
       "], 
      Association["Book" -> 7, "Theorem" -> 32] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGPBhmgQAAlbECew==
       "], 
      Association["Book" -> 7, "Theorem" -> 33] -> CompressedData["
1:eJzdUdEVwyAIPCCBJ6Y/2SArdQQX6KzdqEJqXpNs0FNP7hTF59Zez0YA3kF/
hQcW5jrNXqu7zyJCfXaS1aMNoA/2opOI2W9+AViHUEXpBnKLBXRgj9Oyi5O8
W3FCCjvSIv5yLKY6l5A4fwwzpcf3B1P0rDiI+h0fU7EGwQ==
       "], 
      Association["Book" -> 7, "Theorem" -> 34] -> CompressedData["
1:eJzdUdsNwjAM9KMOQU6ohJCqfrISI3QBZmUj7CtFrdiAS3L2XeI8lPvyfCxM
RK+kP8MsMg3Wbld3N1XliM568WytOUAx5FxPg2qt++oQYpsohcYxYu+ggG1Y
c1iZlJ0DXheRGUH0b1nmH85JKOx/xPFjRBie/D6Xs+PGSRxnvAHc8Qem
       "], 
      Association["Book" -> 7, "Theorem" -> 35] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 7, "Theorem" -> 36] -> CompressedData["
1:eJzdUdsNwjAM9KMOQU6oVIkf/liJEboAs7IR9pVWrdigl+Tsu8R5KM/5/ZqZ
iD5JZ4PIY7B2n9zdVJUjOuvNs7XmAMWQa70MqrXui0OIraIUGseIvYMCtmLJ
YWVSdg54WURmBNG3ssx/nJNQ2P+I48eIMDz5fy1nx42TOM74AsfMB5E=
       "], 
      Association["Book" -> 7, "Theorem" -> 37] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGHWBBMBkZoT7E4lEWTKFhAQCgAgKG
       "], 
      Association["Book" -> 7, "Theorem" -> 38] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGM2BkhPoQi0dZ6OsUugEAnAYCgg==
       "], 
      Association["Book" -> 7, "Theorem" -> 39] -> CompressedData["
1:eJzdUYkNwyAM9BMTKluRKnWBrJQRskBn7Ua1L48SdYMecPYdmEfM63tZmYg+
RX+IweL1dHdTVc7orOFo4QDlkEcfB9Xer5UpxA7RGk1TxghQwg5sOaxK2sUB
b4vIjCDiLKt855qEwv533D9GhOHJ71O5Om5cxHnGF434B0I=
       "], 
      Association["Book" -> 8, "Theorem" -> 1] -> CompressedData["
1:eJzdkNENgDAIRA85GtZwJUfoAs7qRgJNmib6558v6bXAHR/d+3l0AXCl/BFv
lriIbPkQqdra6BvibEZvNPc1SEDmp5Dhi1u1JCD1CYuXgYKaOS6m4eMQTOM3
mFJLYukN8sIEnQ==
       "], 
      Association["Book" -> 8, "Theorem" -> 2] -> CompressedData["
1:eJzdkd0NwyAMhI1jOYEDoSp97UNWyghZoLN2o9qmrRp1g36C43z8CMR23Pcj
EdHD5S+5lNJ7uzHz1A2elrrUua5rDcg6o+SrSM7f+woRz++iNVK1USTEAHRg
fqiohWjAJwkdiwhKVkBFEQz/Up+MKs4/c/4Y5hQZ/z40eYM7v2lS6BN4owb0

       "], Association["Book" -> 8, "Theorem" -> 3] -> CompressedData["
1:eJzdUVsOgzAMc5K1Ql1BVX9giJ9daUfgApyVGy0JA4G4wazWtd2mD/U9L5+Z
AKxG/4ncTdMrigiNCpJcc33Wcej7qoB27trwEAnhXNYCnHZTCppGxxidFGnH
pj0yUU6J87YIKcFNPMpM/9gm3fn+V1w/hpk84/s7yZrf2Ij0jC9c/AhX
       "], 
      Association["Book" -> 8, "Theorem" -> 4] -> CompressedData["
1:eJzdkesNgzAMhP1oZKOAfyH43ZU6AgswazeqfaAK1A36KbncOS9FeW77a2Mi
epf8KWvE2FSVI2Gd+tTHHtEBZZfB7aHqft2VQdo3OC1LjvMMScxaswLeUCrj
lwr0WERmhHBkO/2pNYmE8+/cP0aEUZPfZ3I1K1fCeccHW9gI9A==
       "], 
      Association["Book" -> 8, "Theorem" -> 5] -> CompressedData["
1:eJzdkd0NgCAMhPuHBFPQFVzJEVjAWd3I0geiQRfwy+USjlxTwlaPvSIAnM3+
Sik6MTMWAzkbRddFNasq5Kw0JxFmkXspAVDoh9QEEKObE0J8Ib2FVnN90y+H
5Z8fQ4Se0fhKbPJ+M7RZF0ikBto=
       "], 
      Association["Book" -> 8, "Theorem" -> 6] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGLVCVYWVmZmZUAAJGZm4hbiEuIVERcXEhIGAAYiY+
XpA8KyuyHl4GBiYuGIeHh4GDA0izsYEJIOCCAQgbLARi8CCJgEmIIgYuLgYw
hw2uDcSGkiBJMA9sPipAjRgmJkawGBOmJxlBCOxiEMEItAMANhYIKw==
       "], 
      Association["Book" -> 8, "Theorem" -> 7] -> CompressedData["
1:eJzdUckNgDAMc1K1QqHwqEB8+LASI7AAs7IRSTgEYgOs1nWcpocyLeu8EIDN
6L8YYwiBJgWFutRFSt8NQ1FAJ7eN5WN8ljQAyxXkjKrSNSUnhVw4tFsm8sNx
PjZBBB6ku8z0yZb0yM9/490YZnKPv38kG/5iI9I7dhYdCAo=
       "], 
      Association["Book" -> 8, "Theorem" -> 8] -> CompressedData["
1:eJzdkdENwyAMRG0jULg4leo/PrtSR8gCnbUb1TZt1agb5AkOn20QiNv+uO9M
RM+QE1NLKTwcLqutBrPrGOaQT7lsUa/1d8dGJPiY3mlZfG0txQF04vHUpp5E
B76Z1NlEUHIDbYpkxm+NYro8/8jxY0Q4c/L/RI6RN9ZwCn0B9uYIDQ==
       "], 
      Association["Book" -> 8, "Theorem" -> 9] -> CompressedData["
1:eJzdkYEJwzAMBOWPlOYrCKQbZKWOkAU6azeqJLchIRv0sB+9/DYIr9vruTUR
eaf8MwCGOcAw+eQ3XxYvJDbu5EOVPF4Ig/FnIjemUS3RDFsn6q5q0aSTe6e0
h4QmYWhqLHr91TwsV++fOX8M0KqH64QtV41g6Yz2AdMKBho=
       "], 
      Association["Book" -> 8, "Theorem" -> 10] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGNeDmZmYCAmYOJhTAAMWMQMiErB7IYYQHCpAFlgWL
MJIBGMAIrzyUQRlgRBCMjAAb3gNB
       "], 
      Association["Book" -> 8, "Theorem" -> 11] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGN+BmZgICZg4mFMAAxYxAyISsHMhhhAcKkAWWBYsw
kgEYwAivPJRBGWBEEIyMABFtAzY=
       "], 
      Association["Book" -> 8, "Theorem" -> 12] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGOWBmAgJmDiYUwADFjEDIhKwayGGEBwqQBZYFizCS
ARjACK88lEEZYEQQjIwABwcDKw==
       "], 
      Association["Book" -> 8, "Theorem" -> 13] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGO2AEAlYmRhTAAMVgEkUxA5IATBZKkA4YYLbglocy
KPQigmBkBADl0gMF
       "], 
      Association["Book" -> 8, "Theorem" -> 14] -> CompressedData["
1:eJzdUdsNgCAMvBZCTAl+gPGfP+dxBBdwVjeS1kc0buCFHHfX8gp1WeeFAGxK
v0edKrlYYpE8DOOYG1By5j5555z3z94EsNwmoevaHIJRg1w4tEUq0iMxPpog
AjPhXqb6ZC2as/3feH8MM1nG38eRDruxErUzdtP4B8U=
       "], 
      Association["Book" -> 8, "Theorem" -> 15] -> CompressedData["
1:eJzdUdsNgCAMvBZCTEE/iPHfT9dxBBdwVjeyrY9o3MALHHdHeYVxWeeFAGxG
/8c4Ucg1V6l9PwxVAe3ctTGEEOOztAVYLlMKmkbHlJwUcuHQHpkoj8T5KIII
3KR7memTbdKd7//G+2OYyTP+vo2s+Y2NSM/YAbFtB58=
       "], 
      Association["Book" -> 8, "Theorem" -> 16] -> CompressedData["
1:eJzlUdsNgCAMvBZCTAl+gPHflRyBBZzVjaT1EYwjeCHH3bW8wlK3tRKAXekH
WMjFEovkaZrn3ICSM4/JO+e87zsTwPKYhGFocwhGDXLj1BapSF1ifDZBBGbC
s0z1xVo0Z/u/8f4YZrKMv08jHXZjJWpnHI/5B3w=
       "], 
      Association["Book" -> 8, "Theorem" -> 17] -> CompressedData["
1:eJzlT0EOgCAMK4MQM9EDId79kk/gA77VH7kNNWr8gQ2UtgwYc12X6gBsSn+A
833uM+dSpikLIJPGIXjvQ7gXDgDxaVJC18kao5GATzRtkYp0S4xbEZhhJl7H
VB+sm+bs/lfDD0fkLKOPn+mwjpWcvLEDbMoHVQ==
       "], 
      Association["Book" -> 8, "Theorem" -> 18] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGBGBiRAEMEAwUZWBiQlYHDBBGeKCA1TAwwAjSAcQE
vPJQBmWAEUEwMgIA3/IC/w==
       "], 
      Association["Book" -> 8, "Theorem" -> 19] -> CompressedData["
1:eJzlUNsRwCAMCmSSrtQRXKCzdqNCrPbx2d9yJwJBz3Np29oQEbvpH0gmBVNm
hjYvQnyvyWB+ipSq0Sus+kDXFfGVFPfIN5ThPGZ9soflnk/4AFwEHAvJA0M=

       "], 
      Association["Book" -> 8, "Theorem" -> 20] -> CompressedData["
1:eJzlUdsRgCAMC1WQdgA//HMlR2ABZ3UjaXwcnCOYg1ySltexln0rAcDh9BPo
orOaKYE6RXMah2Ga2q4MSHxMSjDzMJMq4oNLM3KRmoR8NSFG0OR3meubvUjH
/Xv0HyMSmMn3WcEHb+wU6hknl4EF8Q==
       "], 
      Association["Book" -> 8, "Theorem" -> 21] -> CompressedData["
1:eJzlkVEOAiEMREuBbBjT7GoAf90fD+QR9gKe1RvZFjVuPIIvZJgZGhLCut1v
WyCih8m/gOvhclx6P9daaW0n7i3FGFP6HmpEvLxDKTRNuufsogDzLIb6oVm0
RAE+jesYIghpgGSBM/xL7dCT379n/zHMwTv+fVWwBXNiSSBPsdYHyw==
       "], 
      Association["Book" -> 8, "Theorem" -> 22] -> CompressedData["
1:eJzlkdENgzAMRC8GH44X6HdXYgQWYNZuVNsBVNQReEpOvrMVJcp729etAfik
PIb+6u7eE8SWbpynaVl+ZwwQPQ0J9wytJFDlIOqhxgiVqldSOoagRBilxUAy
6kOzWa7Ov3P/GJFWmfw/quWqGzMdlV+KKgX1
       "], 
      Association["Book" -> 8, "Theorem" -> 23] -> CompressedData["
1:eJzlkVEKAjEMRNO0ZelIWBdp9Hc9kkfYC3hWb2SSqrh4BB9lOjMNhdJ1u9+2
REQPl//hsC5H1Uvvna564rOWnHMp3yNKxMs7tEbTZHutIQYwz+KYH1rFSjTg
04SOIYKQBUgVBMO/1A8jxf179h/DnKLj3zclX3AnngTyBJCYB6c=
       "], 
      Association["Book" -> 8, "Theorem" -> 24] -> CompressedData["
1:eJzlUcERgDAIo1TCUadwJUfoAs7qRgJV73qOYK4NJORBr1s/9l6I6Az6EWxt
zZqZkVduhqVW1SlBxPIIH3k0a5BDBAPeD1a4KX5fJ3mESEAuBApJjP7mGKaa
V0jMH8Nc0uPvk0qc3BihILgAYJoFtg==
       "], 
      Association["Book" -> 8, "Theorem" -> 25] -> CompressedData["
1:eJzlkdENwyAMRI0BBd2HEyUV/DYrZYQs0Fm7UWyTVI06Qp/QcXdYSIh1f217
IKK3yT8xTa09am30XBZuNcUYU/oeqEQ8X6EUGgbdc3ZRgHEUQ33XLFqiAJ/G
tQ8RhDRAssDp/lQ79OT337l/DHPwjn9fFGzBnFgSyAFz6geY
       "], 
      Association["Book" -> 8, "Theorem" -> 26] -> CompressedData["
1:eJzlUcENhDAMc4Mi8gDJ5dUHn1uJEViAWdmIJgUEYoSzKsd28kjU37otawKw
O/0V5mnKJJEzZRxUu0712R8AscuUgr6vNUbUQVpD1Y3VashC3klwGwIN1dDU
GGj6ZG+Ge68QeH+MSIpMvgclf3TlayejHRTRBwU=
       "], 
      Association["Book" -> 8, "Theorem" -> 27] -> CompressedData["
1:eJzlkYEJwzAMBGXFIjIJvB3wAF2pI2SBztqNKslJacgIPcz7/2UMxo/99dwT
Eb1d/ottq601qrXyumSZppx/xwsRlzP0TvNsu0iIAZSijvmholaiA98mdBwi
KFmAiiIY/lAfRor7r1w/hjlFx/f3JF9wp54U+gHqsQbJ
       "], 
      Association["Book" -> 9, "Theorem" -> 1] -> CompressedData["
1:eJzlkdEJwzAMRM8yIhiOCBzIf6ATZYQs0Fm7USW5LQkZoQ9z1p30IePteO5H
AfAK+TMe3cHau9istVbVc3cGhF/TGqbJ7xzRgFwWC7wequYhG/lLUscQaHBD
U2My6o9GM911heT6MSIlM7k/p8TJjS2c0d5JIweO
       "], 
      Association["Book" -> 9, "Theorem" -> 2] -> CompressedData["
1:eJzlUUEKwCAMixURoWziYfd9aU/YB/bW/Wg2OlD2hAVJk7Rgxf28jtMBuI3+
hlKBrRRZl+C9D2FsLoDoa1JCjLVyJBhUc1YDtTIykYaE3IagCprmtevO1qSb
VyDmjxFxzOT7GmeHGxu5escD94oG+w==
       "], 
      Association["Book" -> 9, "Theorem" -> 3] -> CompressedData["
1:eJzlkdENwyAMRA8DyiE3H8lHfvKVjtQRskBn7Ua1TVM1ygh9QsfdGSEhtv35
2BOAl8vfsa4blvsi81RyzqX8ziZAbkdoDcNge60hBjmO6pjvWtVKNvLbhPZD
oMICtSqD7j/qw0hx/5nzx4ik6OT6mOSL7tSTUt/lNAar
       "], 
      Association["Book" -> 9, "Theorem" -> 4] -> CompressedData["
1:eJzlUdsJgDAMvKYt5iOCgv74I365jyN0AWd1I5P4QHEEj3C9u4SS0qmsSwkA
NqP/YR7RDT21TYoxpvRsNQDJZZhRVXrm7KRgrmsxqD44i4aGO3E+hsACNSxZ
riHTJ1vTnd//xvtjiIJn9H1LsGJTtnYQlh3X9gas
       "], 
      Association["Book" -> 9, "Theorem" -> 5] -> CompressedData["
1:eJzlUUEKgDAMy7oNe6gwQS9exC/5BD/gW/2Rbaei+ARDyZK0jI7N67asAcBu
9ENM6MeBupJijCk9OwUguQwzmkbPnJ0UzG0rBtWVs2houBPnOgQWqGHJcg2Z
Ptma7vz+N94fQxQ8o+9TghWbsrWDsBy2JAaF
       "], 
      Association["Book" -> 9, "Theorem" -> 6] -> CompressedData["
1:eJzlUcENgCAMLAVCHyXBxI8PH67kCCzgrG5kW8RIHMFLc9xdG1LCVo+9OgA4
lX6JeV1wKsF7H8I7LwDI3RBBSnLGaCQgypkVohtHllDxJMZtCIhBDHHkPqT6
Zm2as/tHjB+D6CzD70OcFqnStR0TX6FTBm0=
       "], 
      Association["Book" -> 9, "Theorem" -> 7] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 9, "Theorem" -> 8] -> CompressedData["
1:eJzlkdEJgDAMRNO0RXoQ/FE/9MuVHMEFnNWNTFIVxRF8lOvdpRRK53Vb1kBE
u8k/GTqexhRjTOlZj0TcX6EUahrdc3ZRgLYVQ33VLFqiAHfjWg8RhDRAssCp
/lQbevL737w/hjl4x993BFswJ5YEcgDJ6QbG
       "], 
      Association["Book" -> 9, "Theorem" -> 9] -> CompressedData["
1:eJzlkdENgCAMREuBeKQxwT9/XckRXMBZ3ci2qJE4ghfyuDsIgbBs+7oFIjoM
P9XMU00xxpTebSViuUMpNAw65+xQAeMoJvWNWbREAZ7G2TYRhDRAssDV/EVb
9OTn9+o/hjl4x99nBBswZ9cOAjkBaIUGGg==
       "], 
      Association["Book" -> 9, "Theorem" -> 10] -> CompressedData["
1:eJzlkYsNgCAMRMun4UIanMGVHMEFnNWNbIsYiSP4Qo67oyEhrPux7YGITpO/
EpeWU0o5v8umfR0BoFJ0Z3ZRgFrFUN+VRUvjaVz7EEFIA4RlDJm/1Q49+f0z
88fEGLyL31cEWzAnlgRyAVgtBhU=
       "], 
      Association["Book" -> 9, "Theorem" -> 11] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 9, "Theorem" -> 12] -> CompressedData["
1:eJzlkesNgCAMhMtDjvvtArqSI7iAs7qRfagJcQQv5KN3bQiEdT+2PYnIafiv
ljqVUusQieT5MaT0rntrDhXAkNbBRg1B4E2cMSSgqAGbDpiivmlNd37+qPFj
ck6e5e8bki34bc0RvABNygXg
       "], 
      Association["Book" -> 9, "Theorem" -> 13] -> CompressedData["
1:eJzlkd0NgCAMhEuhudwMPrmAwziCCzirG9kWNRJH8As5ekdD+Jm3fd2KiBwh
P6a1Wlt7P8IiotNtSAF8NktxAHa87mr0EASeJLU3CShuQPOGoNeXxmK63H9k
/BjVkpl+r1BiIE8bjuAJLwAFrQ==
       "], 
      Association["Book" -> 9, "Theorem" -> 14] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGMuBkYREVRRZgY2BgYoZxmJkZWFkhNIggHTCAEV55
KAMdoEYMExMjWIwJ0weMIATWDyIYgWYBAGEgA/8=
       "], 
      Association["Book" -> 9, "Theorem" -> 15] -> CompressedData["
1:eJzlj7ENAzEMAyWKpv/xRZoAqbPSj/ALZNZsFMpdikyQs0FLoi3Bz+t1XhkR
75a/ppL8aUqRiT7nHAVwL+vicRdQknZKx9F76wqbGnRYTgrEJnFjLVRdsYkl
ROdmelpmRQJjBCrcKxSOCJFwo17rbX7Bm6f4F+2lbzM/o8cFAA==
       "], 
      Association["Book" -> 9, "Theorem" -> 16] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGNuDm5kbmsgAxPFCYmRlYWSE0iCAdMIARXnkogzLA
jCCYmQFFogPT
       "], 
      Association["Book" -> 9, "Theorem" -> 17] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGOODiQuaxADE8UJiZGVhZITSIIB0wgBFeeSiDMsCM
IJiZATr7A8Y=
       "], 
      Association["Book" -> 9, "Theorem" -> 18] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGOpBE5rAxMDAxwzjMzAysrBAaRJAOGMAIrzyUgQ5Q
I4aJiREsxoTpekYQAusHEYxAswBFMwPd
       "], 
      Association["Book" -> 9, "Theorem" -> 19] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGARJgY2BgYoZxmJkZWFkhNIggHTCAEV55KAMdoEYM
ExMjWIwJ08GMIATWDyIYgWYBADDjA8Q=
       "], 
      Association["Book" -> 9, "Theorem" -> 20] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 9, "Theorem" -> 21] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAQMjI4RkHA7hAQCZkQKA
       "], 
      Association["Book" -> 9, "Theorem" -> 22] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGARQMk8AAAJWpAns=
       "], 
      Association["Book" -> 9, "Theorem" -> 23] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAQwwMzMPtBOoAACbzwKD
       "], 
      Association["Book" -> 9, "Theorem" -> 24] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAQgwMg6XsAAAlz0CfQ==
       "], 
      Association["Book" -> 9, "Theorem" -> 25] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGARgwDpewAACWcwJ8
       "], 
      Association["Book" -> 9, "Theorem" -> 26] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAQQMl7AAAJWqAns=
       "], 
      Association["Book" -> 9, "Theorem" -> 27] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 9, "Theorem" -> 28] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 9, "Theorem" -> 29] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGARwwMw+0C6gAAJl9AoA=
       "], 
      Association["Book" -> 9, "Theorem" -> 30] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAQKwDLQDqAAAl/ICfg==
       "], 
      Association["Book" -> 9, "Theorem" -> 31] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 9, "Theorem" -> 32] -> CompressedData["
1:eJztkd0JgDAMhNNAOG4Gn7qSI3QBZ3Uj86NCcAU/yvUuCSXQuY59DRE5Q346
uj2OFMBvsxQHYOG+1OhFEHgrqTUkoHgAzQeC8rdGM1O+3+kfozqypt9tRxzk
tpEIXvkBBWg=
       "], 
      Association["Book" -> 9, "Theorem" -> 33] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 9, "Theorem" -> 34] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 9, "Theorem" -> 35] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGARpgHWgHUAYAmJ0Cfw==
       "], 
      Association["Book" -> 9, "Theorem" -> 36] -> CompressedData["
1:eJztUckJwDAM86EM0pU6QhborN2okepPCNkgwghbGMngqz93dzN7SQc7ACyz
TNEA0BoI9ZCUmBXxL8mBQ+1UXyz7lMkSPT8mwqXFeqOzoGs5jYwPQ4QD9Q==

       "], 
      Association["Book" -> 10, "Theorem" -> 1] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 10, "Theorem" -> 2] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGASZgHMIBAwCWWQJ8
       "], 
      Association["Book" -> 10, "Theorem" -> 3] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGARbANNAOIB8AAJZYAnw=
       "], 
      Association["Book" -> 10, "Theorem" -> 4] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 10, "Theorem" -> 5] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAVbAzMzNCAYMDIzkAgYwwisPY2EFLEDMxsAANgbE
YWRiYmBiApIwXUA+CAAlWVggKhgZAdrxAzA=
       "], 
      Association["Book" -> 10, "Theorem" -> 6] -> CompressedData["
1:eJztkdEJAzEMQy3bVTjCfXWCrtQRboHO2o0q5ejP0RH6IALJwgnkcbyeByLi
bfnzm3uS+5wdMc0wZHcVAHmpI4xe1OYkh7VuMDOIUIkYOEuZifwmHsopUvPC
+THUaTtgW7uqoiqrmIvQUzxlUCtV960fqYEEyw==
       "], 
      Association["Book" -> 10, "Theorem" -> 7] -> CompressedData["
1:eJztkdENwyAMRO847AYzRVfqCFkgs3aj2CRfkbpBn8QJni2DxHs/PjsBfCv+
/KC5zTkFRIwxvDDrXSIZ4Zml6H3RtjKpMmS1ZcCINEbn1bT0bVTFm/G8+/oY
z9XrRG7wbJcgNcnaAvmUqhr8lWVf805/MgR/
       "], 
      Association["Book" -> 10, "Theorem" -> 8] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAS7ACAYwmgzAAEZ45WEsrIAFiNnADmFgBHEYmZgY
mJiAJEwXkA92KAMLCwtEBSMjAM7BAx8=
       "], 
      Association["Book" -> 10, "Theorem" -> 9] -> CompressedData["
1:eJztzdEJAjEMBuD0mjSJnq1JFeHgPERwhQPHcIB7cAFndSOrPruB38MfCPnJ
6f643QMAPN/x99PSIW4ARHLuh1L2dZoul93O3UWqu9XG6jjPyzwLvzft0sz6
vngbzJOz13y0atcPlsHVS5m8+GY0bgUXVc3tWYwdhC4QQSTIXUCMGgQTEkUi
JKUQq6EmTEnwK0VRWfEZnPstKnI+rF8zFQ0Q
       "], 
      Association["Book" -> 10, "Theorem" -> 10] -> CompressedData["
1:eJztzdEJAjEMBuC0TdpEewlnEUFFq+IQBafw2adbwFndyJ4+u4EfJIGQn5ym
531yAPCa299vHlEBmFXz3mxdar2ex9HMRIqZlk7LtrVHa5zmTb/sch6sD07V
khU9atHbR+KdSY/XXsNBUw8Yi0h/AiF4cN4RQSBQ7xCDOMaIRIEISciFMqJE
jJHxKwYWXqQLrFIeUDAtN/IGzSoMYg==
       "], 
      Association["Book" -> 10, "Theorem" -> 11] -> CompressedData["
1:eJztkcERwyAMBE8+kDQyjwwdpCWX4AZSazqyRPzKIxVkB25gJQ0PnufrOAXA
u+LPDyIc2GPO6YVZa72THMMzS9FVteWKMqkyutWRA0akMTp1sfRtehVv5ve7
n4/R3K1uIg+oQFgjGxnbAhSpqkFNme3ZwAvbCAVU
       "], 
      Association["Book" -> 10, "Theorem" -> 12] -> CompressedData["
1:eJztkTEOwkAMBNdxyGkV3J4CbsilzWt4Qj7AW/kRdpKKghcw0lnavZFceN5e
z00AvHP8+cXaA8tC0hMzchwjteZnR7/u3B7ZVM853Zk0GBGS0XlItYZyNlN+
RqoBv9ceh7nE6zOJEEUgqlDtVIduByqSrqEMJQxIbP0A6XYJlA==
       "], 
      Association["Book" -> 10, "Theorem" -> 13] -> CompressedData["
1:eJztkcENwkAMBNcYcnbwPU8ByxKKriNKSAPUSkf4kh8PKmAeK3m90j523V7P
jQC8h/z5yRnoq6qGe4SZ6jyLSO+RGonEtVar9fYYTnNPXe4y6DCBhJuE1J3W
MiKejssynnm1RL9bj2EuRz+ISFEIxAzmE/N02gETjayhTEVQQNn6AbnxCTw=

       "], 
      Association["Book" -> 10, "Theorem" -> 14] -> CompressedData["
1:eJztjcENAkEIRWGHAYa/TlbjTS/GmzdbsYRtwFrtSGYasAEfyQt8SLjt79fO
RPQZ+vOL1twdwIrez8dt8wgzpDNDKgZWMXNLP4YiXLpLIE8xTwFtB7fc9Eye
Y+kt7Hq/eH5hZsqqlUgIOSy0skpkXhYp6kLlFNYkKTJhVdGq7rRqSPYQK1/J
xggD
       "], 
      Association["Book" -> 10, "Theorem" -> 15] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 10, "Theorem" -> 16] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "], 
      Association["Book" -> 10, "Theorem" -> 17] -> CompressedData["
1:eJzt0bERwjAMBVDJcSLZ/iI+cmlScDgcaViEmhGyALOyEXYWYAFeodP9X6hQ
2d+vnYno08bfT895PsVtizpN3o+jmeWsZkkr01EkiJRHS5ZSgKTXuiKdLZtB
SzaFHi4Fdse6ZtygrUTBEoLO9YhzTPUfGomFQMyOHQ2qQ9dJ36tITxK8eOZW
HTp2SAmYSIYB3nuNCF+7kgnv
       "], 
      Association["Book" -> 10, "Theorem" -> 18] -> CompressedData["
1:eJzt0bERgzAMhWHJGOuBbRmcQMGlCOZScNkmI7BAZs1GARbIAvkKNX/xCs3b
+7UxEX2O8/fbMKhf1xbjaG3OXdf1PVKK2CX0Io3I/ETSOM1FNUrZo/eXmFNU
LDlBcbqVkFZdSqcPlSP6e5iAZtg3jGHa/4GWWCgQs2FDDnBVJXUNkZqksWKZ
j3Sq2ATvQ7iSOBettfARX7j8CgM=
       "], 
      Association["Book" -> 10, "Theorem" -> 19] -> CompressedData["
1:eJztjcsRwyAMRBEskgCDnM94JscU4iZSghtIrekowhWkgLzDO2h3Vs/j/Too
hPCZ+vMD+731fjFVEWYA62puc2AyRhujXOeFW3OXism6KBZYU9hZNVNPO3pX
t84QDczCu79IKYVEJBJiDjeimGIl5SaAclbumVQgIOcUUUJ82Ni2GrT4ckWJ
oC+tuAfj
       "], 
      Association["Book" -> 10, "Theorem" -> 20] -> CompressedData["
1:eJztjdERAiEMRAksSYCDMOrcjJ92Yg2WcA1Yqx0ZrgIL8H28j+zO5nG8XweF
ED5Lf37h1nqfpirCDGBOc5sDkzHaGOWyLtyau1Qs5qbYYE1hZ9VMPe3oXd26
QjQwCz/9Q0opJCKREHO4EsUUKyk3AZSzcs+kAgE5p4gS4t3GvtegxZcrSgR9
AYU3B6g=
       "], 
      Association["Book" -> 10, "Theorem" -> 21] -> CompressedData["
1:eJztjbsBwjAMRGX7cpZt5DhQUFBRsg4jZAFmZSPksAAD8IpX3Olz31/PPYjI
e+rPT2xmo6vmTAIYo7u7g55tUraZsDV3qZicu8L7dvhAvT3BTGHQGaOBzHz4
AyBKDIkqoNxCWJhqUlrhnLjSGKhcMpJvpS+IF+tjVGmrX64oEeEDejwHug==

       "], 
      Association["Book" -> 10, "Theorem" -> 22] -> CompressedData["
1:eJztjMsNwjAQRP3bZOP1xqDYKHADJJQzEoGAfaEESkgD1EpHGCqgAJ5G7zAj
zXZ+PmYphHh99Oc3pumeQ2D2nh15n4pTTtmlLnab2NV4K80wnorjsYhogYGR
0hg4U9+vSipqy3I5B75SDITERMPusPfl3xglpJLGCFWJpZIalJWoEQA0gKka
kJqsaREQHXxRNTTOEq6Fqx0ba8q/fQNDDw1h
       "], 
      Association["Book" -> 10, "Theorem" -> 23] -> CompressedData["
1:eJzt0b0NwjAQhuHzX2zn7uw4cSJkUoCUjhEAISRKKkbIAszKRiRI9AzAU7zl
13z7+fmYBQC81vz9aN4M01TXpeQUEYecqB+6IXRbXFnsU6TD6RyY8NgmQiQ7
ZsvdpcTM4cOGrm34di3hzs0YLRPTwaNPy7yUyxsCjAGogIUQSkhw2iqldGWs
QwO6rrRXUikjvzx573ZQO4paas/BvwFdhwt4
       "], 
      Association["Book" -> 10, "Theorem" -> 24] -> CompressedData["
1:eJzt0bsNwkAQRdHZn/czM2uvF1tYKyRInLgEIICEEIkS3AC10hE2EjkFcIIb
vuQd5udjFgDwWvP3q20/jiGU0qaI2KdEXZ/7mHe4stg1NU2nc2TCY06ESLa0
lvN1qDccP2xs24ZvlyHeuSnJMjFNHl1a1qVc3hBgDEAFLIRQQoLTVimlK2Md
GtCh0l5JpYz88uS920NwVGupPUf/Bh/QCxs=
       "], 
      Association["Book" -> 10, "Theorem" -> 25] -> CompressedData["
1:eJztzMsNwjAQBFB/1l5n10s+chy4IAUJBeIeoAdKSAPUSkcYKqAAnkZzmMPM
2/OxaaXU61N/P1v3E1GMEbnvS+3relldGdPQpSH4InU5zeJ4Ou6QmYeQY+Dl
nOTCOacalFYavt9GKTxlCRyFly61bT231ihtNIDSTvVWGzCkg0Wo0HvfeA3o
wBtnDLgvg46JhA6KmihAgBybNxCGCxI=
       "], 
      Association["Book" -> 10, "Theorem" -> 26] -> CompressedData["
1:eJztzD0KwkAQBeCd2Tczm2xi0EIJsbExggcQtLAXA2JhKZha8KzeyHVP4AH8
isf8MLMaX9eRnHPvb/z9zgyJArPpoIiP2+k5GeZda10bynOabA77uorLXSpj
LL2qx+W4aO61ZGaajmOhUiMoPCpD32/Xkr9755jBjr0T5tyCmERIAFE48kTI
4TNiDo2GoqBK0p7EYPwBOT0Kmg==
       "], 
      Association["Book" -> 10, "Theorem" -> 27] -> CompressedData["
1:eJztjDsOwjAQRNf2etefkA0icWzFDVfgFLRAk4IqF6DmmNwII3EBet5II80r
5rg9rpsCgNen/vzA8yDLUiXHGKukEH0Y/I2oI/IuNBN7KUMKUiTNqcN7xRz7
S4mz+3KuLk/jWqcsa8E8c+5kEG7XSiloMQbAQGxDAyrUrnlNzIYYdMcUjDUG
rSVrrUZ0zgYu4IlH1t7tT7s3J0QLZQ==
       "], 
      Association["Book" -> 10, "Theorem" -> 28] -> CompressedData["
1:eJztjDEOwjAMRZ3EsRMX6gKlqUUWrsA12FgYQAy9AGflRgSJC7DzvvSl/4Z/
XJ6XxQHA61N/fmHbm1WdRaTqlCXlId2IVkSZczPSqw1F1HQqU4dLxVn6q0lJ
X86V53H3qHvTu+Fc2DodlNuzcw5aQgAI0LXhAR361Lwn5kAMfsUkIYaAMVKM
0SOmFIUPkIlH9jltTus33R4K+g==
       "], 
      Association["Book" -> 10, "Theorem" -> 29] -> CompressedData["
1:eJztjEEOwjAMBB1iJ6lRgkojFXFAQmYP/Q5nTv0Ab+VHuPAC7sxK3l1b8nV9
3tdARK9t/PmJx7IAUM2AZcdyrzXXmmTbqKqZTWpw23MHI+to2cYPSSYMaK2j
Yp5uDBPoobXij0MI5BIhEhq87KgEZk8hRmYpTLFKKuwlftkOSSQdqUi5SORy
nk9vK4ELLg==
       "], 
      Association["Book" -> 10, "Theorem" -> 30] -> CompressedData["
1:eJztjEEKAjEMRVObNJ3AVKeDgriQCQFrj+MR5gKe1RuZcePave+HD/8nZFmf
jzUAwGuzP7/RWjcbBu6m7CjP45h8aGtEpKlOcjNVFZw7GsuhsU4fEtU+WClz
H+1Y72iNFtmXkv1vCAFcRAAE2cMOKCAm72NEJEaIEonjF1/ERJgqZMxXv+bL
6fwG9UAKzA==
       "], 
      Association["Book" -> 10, "Theorem" -> 31] -> CompressedData["
1:eJztzE0KwjAQBeDEeZPMtPSHpDmAa8GFEkl3uhK69Ai9gOfweN7I2Bu493sw
PHgw+/X5WK0x5v09fz96TcssotOSVRu9aRBhEXiN9+xFxmvufVryfFZKkYpq
GPXSboAYpbRNClqGFKgMOLlu6LA93tVY1E7GWWusYQuSOhAYYDbUeDiuwBsC
aAf2vXGOO3Ysx/7wAWswC6M=
       "], 
      Association["Book" -> 10, "Theorem" -> 32] -> CompressedData["
1:eJztzE0KwjAQBeCJ8yaZaekPSXMA14ILpRJBV9ILeIRewLN6I2Nv4N7vwfDg
wezX13N1RPT+nr9fTctV1aZlNmvsblFVVBEsPeagOt7mPuRlLmfjnLiYxdEu
7QZISUvb5GhlyJHLgJPvhg7b312NQ+1M3jlyJA6sdWAIIELcBHipIBsGeAcJ
PXkvnXjRY3/4AAbnCv4=
       "], 
      Association["Book" -> 10, "Theorem" -> 33] -> CompressedData["
1:eJztzDEKwkAQBdDZP7Mzy67JEQyBHMETiHaCsFhIKklhkyKNhTZ6Ar2bNzKJ
J7D3FX/4fJi6G3LniOg9xd/PmjMQnk22YFaFFCfK4VFnr9ZXOdq9yds1ULzQ
BouDHdLMa7rZMcbFJbSpuKI9+Z0vy8Lmt46cA0AkJBgLAeJ0HMYr7IVgzKai
Kl+kymBWI+95Lyy8Wa4+2S0Pjw==
       "], 
      Association["Book" -> 10, "Theorem" -> 34] -> CompressedData["
1:eJztjDEKwkAQRWdn5u/sZlUQG0mxYOsJPICtQrSQVIH0oofwFp7KGznJDex9
xYf/H/zd+LiMgYg+U/z5nYE51neHCDzRlKKlJEV9dTlZe+9W5vJ0ZG4q98i5
xbXMSFzUfDNL1fpldrm1s/me5tdAITAzkZKyFxLWEF2IqIgpiQkbFDCdIUDY
ZeMGm2k5rPdfIM0NSg==
       "], 
      Association["Book" -> 10, "Theorem" -> 35] -> CompressedData["
1:eJztjTsKwlAQRefd+b3kxUJbSUIQcQGuQxCXkM5KQjoR9+iOnGQJ1p5imMsZ
5g7j4zYmIvos488PADIcJxXRTjbulXvsw2HKrm031RbyegHqHm/JpdXZV5RL
b0/33OdXqULu5e5NU/L6NFFKAIiEBBGIIcnWNhU2JTgjilSjbIHMGMxWU9id
xNF5e/oCY1gMJw==
       "], 
      Association["Book" -> 10, "Theorem" -> 36] -> CompressedData["
1:eJztjEEOwjAMBJ1dO0VKinoliBQCiMdw4wn9AG/tj3Aq8QHOzMFarVdzWd6v
JYjI2s+fX3g2MrXW7rlNtQ61zo/elFrHMc0nj44pjdfzlG/ZNo5Kb1SVHvoz
KQ+llJ0bQ6AEwExAGQCJAlAt+rqjBKOC8BHCF913WYimhk2JDwBsCAk=
       "], 
      Association["Book" -> 10, "Theorem" -> 37] -> CompressedData["
1:eJztjMsNAjEMRP2NkBzB1Su0KAm7N6qhhG2AWukIOyXsmXcY2TOj6cfnfSAA
fFP+nGJjrvuzDxtX9+LeXuksrZnV+xqnmamwcn/cbJhOVuFwRITjyLAKL+5+
iUFEBiRSBWIoRFDijVIRIeIkwhAMpkzEcgwjUMq20g/B8geK
       "], 
      Association["Book" -> 10, "Theorem" -> 38] -> CompressedData["
1:eJztjD0KwkAQhed/s84mlUKIlY1ICntBUtsEYSEXSGflWb2Rmz2CtR+84b2v
mNP6zisCwGc7f37DJOVlnj0fht6GfuevYsb7lDwdb6W6RzZjeU777uFaiWEz
Hk1baUxY2iCX8XrW+pEBiISAGISoTkFCVWRmVQZkrAaZKkgUOgtNRBc1wZJA
X8iACc0=
       "], 
      Association["Book" -> 10, "Theorem" -> 39] -> CompressedData["
1:eJztzD0KAjEQBeDJ/IbE7BHULfYEXkAsrLQJFpJG2H6RLbQXBM/njcyuN7D2
K97weDBtf829A4D3FH8/6rJ5s7WPYaLkX20WtWGVgz27vN8ipgcWb2G0U5yJ
xpudQ1jcfYlpxHKRgzRNsvmhA+cQEYCBsRZAZKd1qJdJGNCITFmVv0CVkEgN
ROjIxLRbbj4qtg5k
       "], 
      Association["Book" -> 10, "Theorem" -> 40] -> CompressedData["
1:eJztjMsJAkEQRHu6u7bno4J4EYVZPBuAEXhdGD3IXjcAxSQMw5zMyNkJwbPv
UFD1oA7T4zI5IvrM8edXCjrgiZiSpuQV+VWCt929rKx/l+HMHHseEcIe19SQ
bpHDzcxnG5ehyq0NVnff/hw5x8xESsq1kLC6rgoRFTElMWGDAqYNAoSrjNVg
My+n9fELecMMNQ==
       "], 
      Association["Book" -> 10, "Theorem" -> 41] -> CompressedData["
1:eJztkbsNwlAMRf29yXs2iD6iQMo+SAxAkQUYgCmomICB2AgnDT0150i2LF3d
xqfldlmYiN7r+PMzgGNGDCtuyHz2Ecf50YfM1/UsjOTwNk64tw1X9MrD06L5
nmPS1jLDvp1aP1Fikk0mI6nFLOA6hMzLQktSVTGVYHPdCZvEwT8kUgim
       "], 
      Association["Book" -> 10, "Theorem" -> 42] -> CompressedData["
1:eJztzLsRwjAQBND77J0kJBOBx0NGwGdogMQ0QODA44AUN0CtdARWCcS8YOd2
g9vPr3FmInov8fe76f4sw6ZrvWvjanSU860vueyu3zPnpO6Kod82j2xVCO5A
Tm4NokPRBJyOl4PVb0okAiFRgkitYGEzNsAMxMq8LKxSsUhce4yJi5mDLcD0
A4B8CVM=
       "], 
      Association["Book" -> 10, "Theorem" -> 43] -> CompressedData["
1:eJztjM0JwlAQhPdndt/G99BTIOTmQSQHC1DBDgQxkALSQGq1I5PXgle/wzAz
MHOcl/fMRPTZ5M8PvKYytn3nfRd5NOThfis599fVltyom2J6tIdntkpKbsCu
cSsIh6IknIfLyeqZEolASJQgUiNY2IzXkRmIlXlrWKXCIrH3iOBi5mBLSPIF
U9kJBw==
       "], 
      Association["Book" -> 10, "Theorem" -> 44] -> CompressedData["
1:eJztjD0KwkAUhN//Zn272oiEWFn5dwUhhQhpVPAIuYBn9UYm7wq2fjDDzBSz
G9+vEQHgM9ufX7iX56ZrrWsX/jDx4/WyKr7tp1g8sxnLrV/XwTXIaV48m1Zp
TFhqktP+fND4YgAiISAGIYoqSKiKzKzKgIyxIFOARGlpqcnooiY4KdEXIJwI
pw==
       "], 
      Association["Book" -> 10, "Theorem" -> 45] -> CompressedData["
1:eJztjLsNwkAQRPczu3f23eEMZJmEBCNRABIENIAlRAEEboBa6QhzLZDygqcZ
jTS7+fWYmYjeX/35idW0Hvow9LG9OdLhci45bU9LTKlVd8X9uumexSohuAOp
cSuIDkUOGMfj3uqVEolASJRMpFawsBkbYA5iZUaVVlgkdh6bhrMtO1tAkA/7
yAh8
       "], 
      Association["Book" -> 10, "Theorem" -> 46] -> CompressedData["
1:eJztkb0NwlAMhP3/sF9iFNHQIMQy1IgmQnRZgEGYjo0waRiAlu+su6uu8Wl5
XBcEgNfH/vyGRe9eJ5Z9drPDcx4143Y5E/bEsObHdpcVtRjYW4tJ3WOLsTff
ZKZ957h+woBAqxAUqIKQjBCEkIVriFegClUfUJQnYqFxp2/JTAf8
       "], 
      Association["Book" -> 10, "Theorem" -> 47] -> CompressedData["
1:eJztjL0NwkAUg9+P/e6SC6FCitJRJRITQINoqGhghCzArGxEcivQ8hWWbck+
Lu/XoiLy2eTPjwzjEOOQ2ydR5utl35XxvNpSGg86HrdDfy+spBQE2ibYIQcc
u4R5Ok2sRy5iBhNzgVmNUFNS1xEJUVfdGnWrqFnuI+esHRlQJtC/vAEH+A==

       "], 
      Association["Book" -> 10, "Theorem" -> 48] -> CompressedData["
1:eJztjMkNwjAURP/u5cvBOLngCAEtceeSBqg1HZGEErjyNHoaaaR5LO/nggCw
7vrzK6VoKWYleY+neq9zq9F7vyhOGX1or9swhy/WsnkMU1S3cRuvllMbGx8/
tIUY96JEIJsRDESAVZgUISixsIjxATCnIO4ZI4mJauAzfQBCLgcZ
       "], 
      Association["Book" -> 10, "Theorem" -> 49] -> CompressedData["
1:eJztzDsOwjAQBND9jWPZOMnaSkE6TsBdOEL6CHFUboSTggvQ8qTZYkba2/Z8
bExE7+P8/WyfX3vMV8Bzq6350gzuc9QCxbrc61r9FGKBASg9uJjCI/I0jehP
RJiEyIRMaSDhs2DtGGY2GDFUcA5faQwpFc4hWN+CFP4AsxwHvg==
       "], 
      Association["Book" -> 10, "Theorem" -> 50] -> CompressedData["
1:eJztzM0NwjAMBWD/PTdNCYqbQ3tlBFZhhN4RYlQ2IuTAAlz5JPvwnuXL8bgd
TESvz/r7Xb0/07IBMUdtLbYwRNSkBYq9Xdd9jcFTgQEofXAyRSTM/RL9hwiT
EJmQKU0kPALWjmFmkxFDBaP4ymfPufDibr1zKfwGcnEHSQ==
       "], 
      Association["Book" -> 10, "Theorem" -> 51] -> CompressedData["
1:eJztzMENAkEIBVBmgQF2BcwYY2J0E+1iE6uwhG3AWu3IUVvw6Dt8Dj+fy/q4
rwUAnu/4+4FlUW2Zoe4RsXHPfkTmlGx+jha3D9FjWq/mjPRTSB+kmln0F4gD
lKEwAzLEUIhQilIlZmQmVi6439FUqdZKXxXVdJQrbHl0EpLpYC/avQkb
       "], 
      Association["Book" -> 10, "Theorem" -> 52] -> CompressedData["
1:eJztzDsOwjAQBND9jWPZOGFtpUg6TsBdOEJ6xFm5EY4LTkDJk2aLGWlvx+tx
MBG9z/P3C8+YN8Bzq6352gzu16gFin291736EGKBASg9uJjCI/KyzOgfRJiE
yIRMaSLhUbB2DDObjBgqGMNXmkNKhXMI1rcghT8w+AbQ
       "], 
      Association["Book" -> 10, "Theorem" -> 53] -> CompressedData["
1:eJztzLsRwkAMBFD99nw+I8byBXZKCbRCCW6AWukIcQEVEPJmpGBXo9v5fJxM
RK/P+vuJuuxAzLH2HnsYItaqDsXR79uxxVCqwwB4Di6miIo5L5EPRJiEyIRM
aSLhEbAmhplNRgwVjOKrXUtrzkspll0R5zfxiQZb
       "], 
      Association["Book" -> 10, "Theorem" -> 54] -> CompressedData["
1:eJzty9sNwlAMA9DEzkPlplBGQEJMwCSM0AWYlY3I7Qx8ciJFlqPc9vdrVxH5
zPX3G8vqrMd1W7NyxlGLuRvrPNaqPDy9McMd9Hm8OO9jnKL/ARGq0AUUaE8X
IFpHeCc1gtqgB1G1RERoqgVAM+cXmxEFXQ==
       "], 
      Association["Book" -> 10, "Theorem" -> 55] -> CompressedData["
1:eJzti70NAmEMQxM7Px9cGqhPSAgQYh1GuAWYlY3IXcEElLziyZbs8/J6Lioi
71V/fsSerNP9UVV57VjTFGRw3I41KjYuRjq/BMs4Z+bouypFAbYpAYh3Jdwj
iO3WhkGh2ruVDnHIXVDDzAFzOj6t+QWb
       "], 
      Association["Book" -> 10, "Theorem" -> 56] -> CompressedData["
1:eJztizEOwlAMQ5PYye9XfjuxMbEg1IEdCXVjZeAIvQBn5Ua0/wyMPMmWbcmn
9f1aVUQ+u/35FcE235eW7XjbYmZFBPhcDtMjvVPLvmQNHzkEwbHwMl/P3v8Q
MaOJQWjWK9XUXQG4QxTaF4V11KxMUYaqSQ/qJscXRCYHFw==
       "], 
      Association["Book" -> 10, "Theorem" -> 57] -> CompressedData["
1:eJztyz0KwkAQBeDZN39h180Zoh7BCwgWNmIhEkQIQqoIoqClp/RGrvEKln7F
Gx6Pmfa3XR+I6PWJv5+Zt2o+NG3062y/XgL5jq7yePE2jdTS0w8xTh7VMeUz
upNutK6zj++BQgBAJCQohQAJVoZyhVUIzuwmZvJFZgxmc1LlrbDwqlm8AT2b
DLY=
       "], 
      Association["Book" -> 10, "Theorem" -> 58] -> CompressedData["
1:eJztizsKwlAURO838178gFgYMAhuwNIdWIrogyDYpbG0dpnuyJu3BktPMTBz
mP34uo1MRJ8p/vyOkhI2z7JE/75eTiJtLw/PufMyq2gz3+UBSFvcFzlkhzNi
T/XNxCwiREYmUUjFuAmhaqowUqjAzR1WIXeVkG0YX0/LcXX4AmLbCjM=
       "], 
      Association["Book" -> 10, "Theorem" -> 59] -> CompressedData["
1:eJztkbsRwlAMBHX62e9JMM49BHZBBBRA4AZogD6gLTpCOKEBQnZndHPJJVq2
22UDEb0+588P6aOf1mcfMh/XM8MTYW2c/d52TLx7DG6p0eyImKW1zNDvgtRP
hEC8C1LiCoAdVZjUykJKEhFW4YCaHBjKMdkbf2gHcA==
       "], 
      Association["Book" -> 10, "Theorem" -> 60] -> CompressedData["
1:eJztjcENwlAMQxPbSX5FYAcG6oURugCzshFBPbAAR56lyJal+H48H4eb2etz
/vyS7lBtHVHrFkqJ3RmXOlmp2FiVKmaGxnLva+v7QDCjxTiM6DLAAXAKFxA4
s4/MfdaUZQFupJRLb1TtBOA=
       "], 
      Association["Book" -> 10, "Theorem" -> 61] -> CompressedData["
1:eJzti8sNAkEMQxPHk8zOT3tdceJMKSAhUcI2QK10xLASHXDkHSw9yz7vz8eu
IvL6xJ+fcl97H1ePXls2D+Ppto2tlYPsHoll8VRZPRlH4qWN6sfVRABC1ITA
VAAKIZEsaNMjKWcF6pdcIi9ZK9znzBl4A+hqBj8=
       "], 
      Association["Book" -> 10, "Theorem" -> 62] -> CompressedData["
1:eJzti7ENwlAMRO3z2U6+f9KjVPTsQYnECFmAWdmID4gNKPOKk57u7rw/7ruK
yPMdB/9l6X29RfaqySKN2/W0bL19iUhnzeHFCjcuzktfKz5PEwEIURMCQwEo
hAQtacPTR69jpD+yZU6zNkQQHkx7Ab5LBeo=
       "], 
      Association["Book" -> 10, "Theorem" -> 63] -> CompressedData["
1:eJztkbENAlEMQxM7CbkjX+gKBkBiEiZA1EiI66ioGZON+FxzC1DyLNmF5cqH
+XmZVUTeX/vzY4737ZC72/mkGqmVQ3vldVwgMlkRnl7pvXxgjNbK1zX7JxQV
LFJxQQ8C0RuD0FbEjDB6qDknmKH2mw9aMgdV
       "], 
      Association["Book" -> 10, "Theorem" -> 64] -> CompressedData["
1:eJztkbsNAkEMRO3xZ/HeYXQiQWRUQ45IEETXALXSEWYTGiDkWRq/xE7mtD4v
KxPR6xN/fs1taxmP6xk8JYe3OLa7Dsz7LNFaXyyi7zgOHpvM9O+xVCdCTBjD
ZIRaYDiYFCwq9UgGVILymdVkgSimvb8BAo8GeA==
       "], 
      Association["Book" -> 10, "Theorem" -> 65] -> CompressedData["
1:eJztkbERwkAMBHU6Sfb79Qy5IxoicE7iBiiAKiiNjhBOaICQ3RndXHCRLvt9
2yEir8/583OWKfN5uyoiMbzNazzagTOWGFN42mh+wljZW2a375b1EwpEDyEm
WgFooIqKeVmwFJJq1A5zDoVpP/sbCOkGgw==
       "], 
      Association["Book" -> 10, "Theorem" -> 66] -> CompressedData["
1:eJzti9kNwkAQQ+fwjCeLBBs6oIzkkxIoIQ1QKx2x2Rr45Ek+JMuP4/06VEQ+
p/35PVfWunYa09hv/dLvmFQyiwATLI69Jdv+3DBfJuLmepbwIQlTKQHU4T5C
YxQEkDERs6jIJbRZpCUKi38BfN0FlQ==
       "], 
      Association["Book" -> 10, "Theorem" -> 67] -> CompressedData["
1:eJzty7ENwlAMBFD7fLbzv/NJjVJRMwg9I2QBZmUjkkiMQMcrTjqd7ra9npuK
yPuIvx+IHHNNFmlcH9dlHe3UIzJYLbxY4cbhvM+X4nkyEYAQNSGwVwAKIUFz
2t6DcFMz16/smVPTjji2YNoHfhMFZg==
       "], 
      Association["Book" -> 10, "Theorem" -> 68] -> CompressedData["
1:eJztizsKwlAUROfN/RmuZgU24hYsLbQTgoWmsk1hFQi4U3fkS1yDnaeYYTjM
bpjuQwHwnuPPL9h35yO5frGPaJ5xbRbMcvRbZo6rPmf50Iu17SaWT0EpJAGF
sg6QWryK2iqmYIiEq7t+gbtQxANm0qmonLaHDx9UClk=
       "], 
      Association["Book" -> 10, "Theorem" -> 69] -> CompressedData["
1:eJzti7ENwlAMRM9nO84PpIjSICEFUTMAE9BGAgqUNgswKxvh/B3oeMVJd093
Xt+PVQB8tvjzE+7zjexOXLyUoz93FW32U3lFtFMsfUl5iDlyb+tFIEISMBiz
QGnSpFA11TBoKMPNPawCd2XKLo2P23IdLl93QAhW
       "], 
      Association["Book" -> 10, "Theorem" -> 70] -> CompressedData["
1:eJztkbERwlAMQy3LNjj+4ehTsRAFA1BkAQbIlGyEk4YJ6HjvzjoVqnxbX48V
IvLez5/f8LwrYqA8z0tseeCMKeoUPqzSL6iFmWOUfWfsn1Ageggx0Q5AA11U
zNuGrZBUoxbMOStM6+ofrWsFwA==
       "], 
      Association["Book" -> 10, "Theorem" -> 71] -> CompressedData["
1:eJztkb0NwmAMRH3nv3yOAKUNFSulT5MFmJWNcGiYgI73JJ+uuMqP47kdEJHX
ef78iA2IKyoy19zjgzHmKDO/eA2/oe6sHGOK70r7JyoQtmeakAKAgS4U87bR
VjpoqglzXUjjvPgbaeQFKw==
       "], 
      Association["Book" -> 10, "Theorem" -> 72] -> CompressedData["
1:eJztkbsNgEAMQxM75PgcSIiGlpUY4RZgVjbigpCYgI5XxEqsuPFWjr2oiJwx
fr5CtZl0pHcLVz5Y5pDcso8Mc2HvKSV7n1g7gUQxuJWCKmqgapggaEQIpAai
Lo04OMetndoLEo8ERQ==
       "], 
      Association["Book" -> 10, "Theorem" -> 73] -> CompressedData["
1:eJztyssNwjAQBNBd7ydeJiiCC0KEGFwSJaQBaqWj2FADt7zDjDSa5/p+rUxE
nx67v6kTKurS8/51m4EFjzKhoPQZM67NqZ1VhYRlOFByOjKrK0vOPoi4Wbgb
u6Y2NsL8qxTjJeJM5gYV9RjzBiIVB3o=
       "], 
      Association["Book" -> 10, "Theorem" -> 74] -> CompressedData["
1:eJztzLERAkEMQ1GJb3tvbyAmZuiIEq4BaqUj1iRUQMYL7MCWbsfzcVjSq8ff
78CdCijmRw6YZEISfWT05rJ+IxBm7DqVznZUiG3rSC4V6cQZ/sKe+3XVqlti
RYzeIFQElQ==
       "], 
      Association["Book" -> 10, "Theorem" -> 75] -> CompressedData["
1:eJztyksKwkAQBNDq/ySKE8E4UZmFR/AqrgR3WQs5qzdy1DO48xU0VdDnebnO
BOD5Pn8/VLWkfJv6/foj4l69lPFxGqdhOWgpflzlIXftlYjQIgIIUhsMJuXW
iNVM1MCdWy8s4vzlbmrhFaFp5xq2vWxeH+YHqg==
       "], 
      Association["Book" -> 10, "Theorem" -> 76] -> CompressedData["
1:eJztzL0JAkEUBOB58/6OXfU6ECzCEswPMZQFrwExNTSyPTvybnsw8wsGBoY5
zPfzLAA+a/z9EltmfcSldB71lddSNs+h1e2b7eZTjONu6FuBCEnAYFwKlCax
ntBU3cBUzfCI8A5mStVImPtkSzvtj1+uewmX
       "], 
      Association["Book" -> 10, "Theorem" -> 77] -> CompressedData["
1:eJztjLERwkAQAyX27t/+gZiYwA25BDdArXTEPQkVOPMGuhnppNfx3g9L+ky5
OBVaQGf9kR0GmZCUvc2kLo96jECYPnRrutvRQizLrGTRIp04w3+w1/GsWc2V
qIrRFw9/BHA=
       "], 
      Association["Book" -> 10, "Theorem" -> 78] -> CompressedData["
1:eJztyrENwlAQA1Cfz/f/KfyIgirFL+ioGYURsgBCYgwahmAgNiJJxQJ0vMKS
ZR/n62U2AO81/n4rapviXjfhrZdnluz52A2dr0m3HMdWtqfBjCQgiLZWyrQM
pMJDYIr1C9yd7hqw7Hu5/Hw4fQAdqgfL
       "], 
      Association["Book" -> 10, "Theorem" -> 79] -> CompressedData["
1:eJztyrENwkAQRNGd2dm9O8uAnJGSgERAD27BkhtwA9RKRxwISiDjBSON9E/b
fdlgZo/X/P3YfJzWffnIcA0tY1RNucaiy/V2znfoZqRodAuyX1BwREBeQv3X
hIge4asdsraKHSPVS4U/AXfNBX4=
       "], 
      Association["Book" -> 10, "Theorem" -> 80] -> CompressedData["
1:eJztyrERhDAMBEDpdJLsMQwZCUNA/tV8CTTwtX5HGPdAxgY3c9Id5+97qoj8
73g9bV+3pQ4lI4Othje2cOMc/MzLxLEzEYAQNQmgVwPUhYTzJv2JQL/SBjUr
U9ZStQ+C9GTaBTe/BNw=
       "], 
      Association["Book" -> 10, "Theorem" -> 81] -> CompressedData["
1:eJztisENwzAMAyWRsuT4UbhIH0H66UoZIQtk1m5UJ52hvx6IAwHytR/briLy
PvXn5zxvy3Thec+sLXuJyp5a18w2P2ZeNxsx6FncTDisEkIKSIAqWRQOd3wZ
UwSzTQqzwnCi2wc/TwT4
       "], 
      Association["Book" -> 10, "Theorem" -> 82] -> CompressedData["
1:eJztkcENwzAUQr8Bo9StfWoGyEoZIR3As3aj2rlkgtz6kEDi/xvb0fcjRcR3
2p/7eXzKCbG8WG0/c11c0TqKW6u+Xjk2YaTA0MwcQCQSeVw0OlEST8ImxOxE
4Q0JbS0/UkIFKA==
       "], 
      Association["Book" -> 10, "Theorem" -> 83] -> CompressedData["
1:eJztysENg0AQQ9EZ2zOzi4iSFjiRQ6qhBBqgVjqCbBE55R2+ZMnLfmy7m9n5
zd8PPGKoyqCmnjGrpaS5tL4/a44TzQDBQCvgnoScHuFShWTekkGQgcGB9sre
mz9RKUaqcAEjiATP
       "], 
      Association["Book" -> 10, "Theorem" -> 84] -> CompressedData["
1:eJztkcENwzAMAylSchHbMdAB8uhKGcELZNZsVLUokA3y6h1EfvjTax77NADn
J/7cQf0SKrX0CO/Rmg+um2odY9U1VP5EMDD9NWH01OCEJ1JeigjRFYsVatCd
7fl4Awl+BFs=
       "], 
      Association["Book" -> 10, "Theorem" -> 85] -> CompressedData["
1:eJztissNAjEMRP1L4tgB1iINcOCAhDhQCiVsAYtSKh0RJGrgxNPMm8uc1udj
RQB4ffTnJ4xtjE3rJVr04y169OtSY8bcbT8PRDyLIkAZGiEzF1ROLMI5laQJ
uZlkmgjRd+piqnfw4gdJYruzvwE7oggP
       "], 
      Association["Book" -> 10, "Theorem" -> 86] -> CompressedData["
1:eJztibsNwkAQRPd7n90DvPI1QECAhAgohRJcgJFLdUccEjUQ8TTzJpjz8nou
CAD7R39+w7pta6nXaNHne/Tot6nGiLnbcfxEPIoiQAkaITNnLKwswkmzFkVu
JokGQvSdOlkpD/DsJ1Gxw8XfDUkHng==
       "], 
      Association["Book" -> 10, "Theorem" -> 87] -> CompressedData["
1:eJztybsVgkAUBND33X37E5bA3MDAhENgJZZAAWirduTDIoi4Z2aSua3v14oA
8N3ndJBt+1h69Nqnvniv85i6x1qrF7+J2IsiQAEqITNHNFYW4aBRTZFrlkCO
6U+I0pjNnlBiGUQlt3v5AeALBzE=
       "], 
      Association["Book" -> 10, "Theorem" -> 88] -> CompressedData["
1:eJzticsNQjEMBP1NHDvAs0gDHDggIQ6UQgmvABCl0hFBogZOjHZnD3tYH7cV
AeD10Z9f8bxbO2XPsb/kyHFeWs54hG/nS8SzKAJUoBMyc0VjZREuWtUUubsU
mgjRd9riZleIGjtR8c0x3rMTBr8=
       "], 
      Association["Book" -> 10, "Theorem" -> 89] -> CompressedData["
1:eJzticsNAjEMRP1NHDuwa5EGOHBAQhwohRK2AKiVjggSNXDiaebNYY7b874h
ALw++vMzHtbO2XMcrjlyXNaWMx7h+3kS8SyKABXohMxc0VhZhItWNUXuLoUm
QvSdtrrZDaLGIiq+O8Ubhg0GTg==
       "], 
      Association["Book" -> 10, "Theorem" -> 90] -> CompressedData["
1:eJztybsRAkEMA1Dbsne9P7i94HICAhKGgEoo4RqgVjrCUAQRbyQlOu3Px85E
9PrM3+94ucw+13mLbtelzIiP0Q/xiSDKqiSJujCAzA6DKpJlc2P0qkkC5EtF
ylLd79RyO6ppHef2BlvFBeU=
       "], 
      Association["Book" -> 10, "Theorem" -> 91] -> CompressedData["
1:eJzty7ENgEAMQ9HEdoKuoWADVmIEFmBWNsJXsQEVX7noSdHt53WcGRH3XH8f
Vt2SX1OLzK21jjHKJyCCGawADSSQJpzZE0lQ+eY/VZCUZXmahQfbqAOx
       "], 
      Association["Book" -> 10, "Theorem" -> 92] -> CompressedData["
1:eJzt0bEVwjAMBFCdopPtF9tKXgpqRoIRskBmZSNEwQpU/OKau+7u5/U8ISKv
T/z90i3GPI4W+6x14+yPEaUUzwbINxSkwKQpNIFGZYaz0sV8ccsdFF/dGH0I
1cdiVtdoby3RBP0=
       "], 
      Association["Book" -> 10, "Theorem" -> 93] -> CompressedData["
1:eJztidEJwkAQRGd3du7MfQSSEySiQtB0kEosIQ1Yqx15AWvwyzcw8Gbm7fXc
DMB7rz8/5d5NdVzmYeof1zhfdOvqWEs7zAwtJEDkJo6w8EPbPSSG4CWr0En/
QiqUU0UKnVJkHdfhA0+yBZw=
       "], 
      Association["Book" -> 10, "Theorem" -> 94] -> CompressedData["
1:eJzticENglAQBd++fbufgERKwC4sghMXThqCBXigAzu0I78U4ck5TDKZy/ac
NwPw/urPbyn3tj29mlvX71wfmnMYzs1xDGYkAUGsAacs6yDlHgKLe8nIzDiA
5HTPAkUsqjWN1w/QgQeO
       "], 
      Association["Book" -> 10, "Theorem" -> 95] -> CompressedData["
1:eJztkcERwkAMAyXbseMcd56QCmiDMighDVArHXFQBC92Rnrs6Kfb+XycBPD6
1J8fM65Hq2PYVjku915mVlOLEPOPCDBQIIUK99VVwyNmELnkIpQ5/aLktree
HW7WVW1tI98cLgTq
       "], 
      Association["Book" -> 10, "Theorem" -> 96] -> CompressedData["
1:eJztib0RQGAQRPf2FjcMApHgC2RynShBA2Z0ojQd+YlUIPKCndn3hmWdFwNw
3PPzNZFHir0qE/deWzRNnT/eYEYSEES7L2W6AqnMM4EhFi/g7nRXiau3cvnU
jSd2rAYl
       "], 
      Association["Book" -> 10, "Theorem" -> 97] -> CompressedData["
1:eJzt0bsNhDAQBNDZD2t7MUYkZCCRkiFRxwVXAg1crXSEjyKIeMFIM+ksx+97
EIDzH6/HfaZh74a5t5JLXtvOSx2Z6xsEM8CQiUiotiaKiFbBFdqwBRZmlRuL
pBBT2uDqRVktj34BQEMFcg==
       "], 
      Association["Book" -> 10, "Theorem" -> 98] -> CompressedData["
1:eJzt0bsRgDAMA1D5E4eEkDuYAApaCkZhhCzArGyEYQgqXqE7q1DjpZ1HIwDX
E7/vzdNexqVaGeqw5ZKqd8z+DYIZYChEJORX6EREXcwKDWyRhVnlxSIpdimt
yJqrsvpgfwMkowUU
       "], 
      Association["Book" -> 10, "Theorem" -> 99] -> CompressedData["
1:eJztiMsJgEAUA/O+uyr4AWEVvViCrViCVw/WakeuWIQXJxAmWfZz2wnA9dTP
B0xpaI9RU/Kpatom5ouIkCMCCGIeDCblbMRqJmrgwq0UFnF+cTe14DOCxt41
WLfWNyyyBUU=
       "], 
      Association["Book" -> 10, "Theorem" -> 100] -> CompressedData["
1:eJzticENg0AQA71e7x5SROgAiSYogT9JAxHwJ62moxw0wYd5WPLMsOyvxQD8
jrm5gubzaL/cVr2z6555OoMZSUAQ7biUqQbS3cPB4l4yMjNOUHVNaqCIWZRP
/fgHb6sGbA==
       "], 
      Association["Book" -> 10, "Theorem" -> 101] -> CompressedData["
1:eJzt0bsRgCAQhOG9BxwwB86YGxgY2Y0l0IC12pFoEyZ+wR9sums/j04Arie/
T9S2VGtefS+epzEwjzcIIQARTkRChKQmIjqkotDIwViYVV4sYilb2lC0NCWN
Pucb/ZwEpQ==
       "], 
      Association["Book" -> 10, "Theorem" -> 102] -> CompressedData["
1:eJztyakNgAAUA9D+thzhCAKFQODQjMIILEDCHCzHRhyGETA80aTtsKzzEgCO
O37fKIuee+ctr+sqfZZABEnAMOOudPg6SFuJwdzM0hckUXIBJ2osa2rHEytC
BUQ=
       "], 
      Association["Book" -> 10, "Theorem" -> 103] -> CompressedData["
1:eJztiMENgDAMAx3qNE2KhJBYgBHgwx6MwALMykYUluDDWTrpPB/nfgiA69HP
RwzBMXyMdVuspYigTRUgokWHLJne/pTIXIg0VbPCUowv8kiz9XCtpkavHjcI
pgTG
       "], 
      Association["Book" -> 10, "Theorem" -> 104] -> CompressedData["
1:eJzth90NQEAYBPf72btDEImQS7woQStK0IBadeREEV7MJpuZ9Tj3QwBcz/18
RfZpjrnp+i6VEhGUmQGGVEKh4lpM1ElzQqvA2tQs6AtJZwwL6GkMHjls7Q3m
8wR2
       "], 
      Association["Book" -> 10, "Theorem" -> 105] -> CompressedData["
1:eJztyrsNgFAMQ1HHcQIVr6ZkBRagp0LUiOItwKxsxGcJGk5h6Uoe6rFWA3A+
8/sM901LltK1bxrMSAKCeAdIWT4/yj0ENu5NRmbGC5LTPVsoYtZdUz9eGe0F
Yg==
       "], 
      Association["Book" -> 10, "Theorem" -> 106] -> CompressedData["
1:eJztzLsVgDAUAlAQXn5Ha2tXcoQs4KxuZFI5go234NDA0a+zE8A94/cdKcuS
ttFtQVRuWBJW0slQKZGlGJKDIYb5ElnbXmtgvnhMKDzJDgO3
       "], 
      Association["Book" -> 10, "Theorem" -> 107] -> CompressedData["
1:eJztiTEOQFAUBPftLn6EKFRKndpRHMEFxFHdyP8qN9CYYpLJzPux7QHgKvr5
kMln6vuufiIQQRIwzChJh/MgXakymMzmBSRRcov8B8tax+UG7K8Ejw==
       "], 
      Association["Book" -> 10, "Theorem" -> 108] -> CompressedData["
1:eJztkcENgDAMAx07CqBW4tUBWIkRugCzshFpP4zAh7NkW/766NfZDcA97OdL
uEWtJd5B+Ylg4JSNlkFxocEJDXy6ECFmX03BRjn3Vh62vwOD
       "], 
      Association["Book" -> 10, "Theorem" -> 109] -> CompressedData["
1:eJzt0bERgDAMA0DLsZyQM2QGVmKELMCsbIQpmIGGL3Qq1Gmf5zEhItcTv09t
o9bqWYB8Q0EKTBaFJtCozHA2upgXt9xB8QrjiBCqRzFrfe03xTADzA==
       "], 
      Association["Book" -> 10, "Theorem" -> 110] -> CompressedData["
1:eJztyUENgDAUA9D+tmPAEoIFbghACRJmAK04gnFBAhfeoWnTpR57DQBni9+3
cilj97RABEnAMKNNOnwfpK1ksDdz94IkSh7gpMmytnm9ALv/A+s=
       "], 
      Association["Book" -> 10, "Theorem" -> 111] -> CompressedData["
1:eJzticsNgCAAQ/sDPOAQruQILOCsbiScHMGLfUmTvh7jOgcB3Kv+fJy+9/Ku
CDCW0MSskCjJ82CsZnkJTkCWtKQi9mbHpeUBsrIDSQ==
       "], 
      Association["Book" -> 10, "Theorem" -> 112] -> CompressedData["
1:eJztkbsNgEAMQ53Yl/tswUosgHQFLbOyEYGGEWh4ki3brZd5rNMAnLf9fM22
t7cwP3FEyqFUh2cUWWQIFYpS5YOREdFGt+GqnrsGL+kkBCg=
       "], 
      Association["Book" -> 10, "Theorem" -> 113] -> CompressedData["
1:eJztkbERgDAMA2VLMQ5MwUoUDJAFmJWNMDSMQMPfve57rePYhgE47/n5nD3f
Zn3iiNKhssMrRTYZQo2ilHwwMiJy7ra4wqVJnRfHfwO5
       "], 
      Association["Book" -> 10, "Theorem" -> 114] -> CompressedData["
1:eJztkcENgDAMA53YDWm3YCVG6ALMykakfBiBDyfZOvnrfZ7HNADXqp/vyVdZ
nzii4lClw0tFNhlCjaK08cHIiMjRbbjS1z78Bq2IA2I=
       "], 
      Association["Book" -> 10, "Theorem" -> 115] -> CompressedData["
1:eJztxLsNgDAMBUA/+yV2goACKRIlKzFCFmBWNuIzBQVX3NaPvUNEzqffN5iZ
GOAummQB1LQi8uBk5BR5TAinE7c3wKjrPLVWJcpAVhYlLsVrBAM=
       "], 
      Association["Book" -> 11, "Theorem" -> 1] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwWAAjHh4pAEAlWgCfA==
       "], 
      Association["Book" -> 11, "Theorem" -> 2] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwWABwNhgBMcII/Z4ISK2gEoAlk8CgQ==
       "], 
      Association["Book" -> 11, "Theorem" -> 3] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwSABjAyQ6GBkhPDIBACWJAJ/
       "], 
      Association["Book" -> 11, "Theorem" -> 4] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwSAB4swM7MzMjIwMbIwMLMwsbCxsDExAHhcrK1QB
E7oOYPQxMbMwMaFEIwCtXQLu
       "], 
      Association["Book" -> 11, "Theorem" -> 5] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwWABrAzsTExAmp2BgYWZlY2FjYEJGEFcrKxQeSZ0
DUBZJmYWJiaUaAQApooC1Q==
       "], 
      Association["Book" -> 11, "Theorem" -> 6] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGweABXNycjIwMLCwMjDysjEyMDBwsjIxMbGzMQMDA
xMzBzMjAzIQEGIEARAJ1MoIRSBoAscwDHw==
       "], 
      Association["Book" -> 11, "Theorem" -> 7] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGweABwPhgBAIQixHGRUgR1A1WDACXZwKI
       "], 
      Association["Book" -> 11, "Theorem" -> 8] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwSACLMzs7AxsjAyMPKxMQMDMxM7MzsTECARAgpkR
KIECGBgYQbJAjRAmSCUAqt4C9w==
       "], 
      Association["Book" -> 11, "Theorem" -> 9] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwWACwkwsDMxsDIx8rIzMjAyCPCxMzGxsbKysrEws
rEJAcTYWJACMPkYQgFAMDExMLIyMALxeA2o=
       "], 
      Association["Book" -> 11, "Theorem" -> 10] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwaACzIwMzKwMTJysTMxMDBJ8DExMrKzsbGxsDKxs
/OwMDOysSAConJGRiREUi0xAJlAzCyMjALhRA2A=
       "], 
      Association["Book" -> 11, "Theorem" -> 11] -> CompressedData["
1:eJztyrERgDAMQ1HJlo251NzRshIjZIHMykaEhhkoeMVvpKOPsxPA9eT3LTuw
IIJ0muXWjKzMrEhWeM3h5aS0SoKb23w3Cje4wgNz
       "], 
      Association["Book" -> 11, "Theorem" -> 12] -> CompressedData["
1:eJztxLERgEAIBEDuOGB4A2NDW7KEb8Ba7cgxsgUDN9h9nseEmV1Pv89piwAc
ydwWAl1VIwod3sGXA1JLMqeTOVa53beyA4c=
       "], 
      Association["Book" -> 11, "Theorem" -> 13] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwWAD4FhhIlc3AJWSAn0=
       "], 
      Association["Book" -> 11, "Theorem" -> 14] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwaADjAzMTKysTCwMTMAI4mRhgQozoatjBioFARRB
AJ5ZAq4=
       "], 
      Association["Book" -> 11, "Theorem" -> 15] -> CompressedData["
1:eJzt0UEBgDAMBMFcrrkkMrCEhBpAK45KX1jgwTzWwB7zOifM7N75fRATcGB0
6XlUkrI1KrxaL7gzSIZxc2XQFrRUA30=
       "], 
      Association["Book" -> 11, "Theorem" -> 16] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwSAEFEQMAJUdAns=
       "], 
      Association["Book" -> 11, "Theorem" -> 17] -> CompressedData["
1:eJztxLENgDAMBEC//P6YwiiwAStlhCzArGyEqFiBgivumOeYMLPr6fdFQMjp
GdsitVCoCU0s4uVA1dp7WFY6yR1uN7dCA7A=
       "], 
      Association["Book" -> 11, "Theorem" -> 18] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwaAErIzMjIwczEyMzGxsoGhiBAkxMqDHGBMjIxMT
iAaqAAIgCwCe+wK6
       "], 
      Association["Book" -> 11, "Theorem" -> 19] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGweAFoPhhZGRE4mEHTEhsAJXhAoE=
       "], 
      Association["Book" -> 11, "Theorem" -> 20] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwSAFPLyMzIyMjCzMzOBoYmRghCJkwMTEwsQEFARi
JrAuZgCgwgLO
       "], 
      Association["Book" -> 11, "Theorem" -> 21] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwWAFLIyswAhiYWNjYGQEMhiZGRiZGEBsGACymZjZ
mJhAoiAJsCYAng4CxA==
       "], 
      Association["Book" -> 11, "Theorem" -> 22] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwaAFjEzACGJmYoJxIQgFMDGxMDExwuRAAACZiAKk

       "], 
      Association["Book" -> 11, "Theorem" -> 23] -> CompressedData["
1:eJztkUERwDAQAoGDTGzUUiTEQLXWUS8u+ugOs4/9cu17bQJ4jn4+jfsjVUQF
slUjI07PYRXUlVCL5UnxBaKhAwU=
       "], 
      Association["Book" -> 11, "Theorem" -> 24] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwSAG4kxMTKysXJycnAwcnDxcDAwcLEgAqICRkYkR
FItMQCYDAwszIyMAr2EDNw==
       "], 
      Association["Book" -> 11, "Theorem" -> 25] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwWAGXJwsTEx8vLy87Fy8rHzMjLwcrAjAxMjIwcbB
wcHCwMzCzMzMxMbBwsIIALdrA6I=
       "], 
      Association["Book" -> 11, "Theorem" -> 26] -> CompressedData["
1:eJztxDERgEAQA8AkFzJcjwEsIeENoBVHzFdYoPgt9hz3NQjgmS2/dojsJHuC
3qqjT5F220bJktMyXqrtAz8=
       "], 
      Association["Book" -> 11, "Theorem" -> 27] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGweAGzExMnOzsbGy87AycLGy8HCzMLCzMcMDOzcHO
ycrAxsHHDFQmyMzKCACt8AOH
       "], 
      Association["Book" -> 11, "Theorem" -> 28] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwSAH7OxQBhO6DDD6GJmYGVFjEQCYTwKT
       "], 
      Association["Book" -> 11, "Theorem" -> 29] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwWAHctBoYmFkRI0wII+ZlYWZmYkBLMMIEmAAAJzw
Aro=
       "], 
      Association["Book" -> 11, "Theorem" -> 30] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwVABLAyMqBEG5DGzsjAzMwElGEFckDQAlzoCmg==

       "], 
      Association["Book" -> 11, "Theorem" -> 31] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGweAH8hJi7EISDGYsjLI8HEDAxgEGTMzMLMwsLMzM
DExMLExMjCy8rCyMAMMhA/c=
       "], 
      Association["Book" -> 11, "Theorem" -> 32] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwRAABpqsnAaM3EzMHILcCMDExsbMxMLMzMTAxARi
sHKysTIBAMKeA/4=
       "], 
      Association["Book" -> 11, "Theorem" -> 33] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwVAA+mxcnozcTMwcQjw8PNxc3GDAxMbGyszCwsLE
wMzEzMTCwgrkMwEAwGMD9g==
       "], 
      Association["Book" -> 11, "Theorem" -> 34] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwZAAbNyKjCxMbCzigpyCAmIcYMDEyc7CzMLCysLA
ysTCBGSwsrMxAwCzxwOu
       "], 
      Association["Book" -> 11, "Theorem" -> 35] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwdAAnKwMrMyMrCyMCMDMyMjKxMLKyszAxMjMxMrI
xsHIzAAAnUEC4g==
       "], 
      Association["Book" -> 11, "Theorem" -> 36] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwRAB7AwsrMxc7EwQwAwigAJAwMLKwMnMw8zMxsLO
zMIIAJ/iAww=
       "], 
      Association["Book" -> 11, "Theorem" -> 37] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwVABjNxMzBxCPDw83FzcYMDExsbKzMLCwsTAzMTM
xMLCCuQzAQCrCANu
       "], 
      Association["Book" -> 11, "Theorem" -> 38] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwZABTAyMTCxIgImBgZGRiREYi4wMIJKBiYWJCQCb
ewLE
       "], 
      Association["Book" -> 11, "Theorem" -> 39] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwdABLIzyXGwIwMTIyMLMwsLMzMDExMLExMjCy8rC
CACkXgMr
       "], 
      Association["Book" -> 12, "Theorem" -> 1] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwRACeuxs3BzcnFwcHGwcHBzM7KwsbKxsbGwMfOzc
rGxcHNxM7IwArGMDlg==
       "], 
      Association["Book" -> 12, "Theorem" -> 2] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwVACzCyiQqLCIjwcXBwcHGzcnEwsDMzMTAwsjFxM
zGzszExMjACqjgNr
       "], 
      Association["Book" -> 12, "Theorem" -> 3] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwZACvBwcHOxQzMTIyMzCwszMysDMxsbExMgiwMrK
CACiaQMq
       "], 
      Association["Book" -> 12, "Theorem" -> 4] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwdAC2hraSnJKSvJKikpMQgLMzEzMrKwMzMwsrKxM
7BycXKwAw5oEdg==
       "], 
      Association["Book" -> 12, "Theorem" -> 5] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwRADqdk+Hs622rbW1ozK0iABJkYGFhYmZgZmJmYW
ViYA5WkFng==
       "], 
      Association["Book" -> 12, "Theorem" -> 6] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwVADWvKKvq56rs5OjLzCID4TIwMLMxMLAzMTMzMb
IwDKGgSx
       "], 
      Association["Book" -> 12, "Theorem" -> 7] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwZADAZ4udlp2NjaMKjIgLhMjAwsLEzMDMxMzCysT
AM28BNk=
       "], 
      Association["Book" -> 12, "Theorem" -> 8] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwdAD9sYmpiZmRozGWiAeEyMDCwsTEwMzEzMrKzMA
wbQEcw==
       "], 
      Association["Book" -> 12, "Theorem" -> 9] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwRAEsvIy8oryjPz8IA4TEwMLM5BgZmJmZWMCAKmh
A3Q=
       "], 
      Association["Book" -> 12, "Theorem" -> 10] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwVAEcTZx0dGM/AIgNhMTAyMLCyMDMxMzCzMjAMBa
BGY=
       "], 
      Association["Book" -> 12, "Theorem" -> 11] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwZAEVuWZGYx8/CAmExMDIyszIwMzEzMLMyMAuy4E
OA==
       "], 
      Association["Book" -> 12, "Theorem" -> 12] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwdAE8gryjHx8IBYTEwMjKzMDAzMTMysTMwCgVwMU

       "], 
      Association["Book" -> 12, "Theorem" -> 13] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwRAFaWmMfPwgBhMTAyMrMyMDMxMzCzMjAKnrA4I=

       "], 
      Association["Book" -> 12, "Theorem" -> 14] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwVAFeYx8/CCaiYmBkZWZgYGZiZmFmREAoWkDIw==

       "], 
      Association["Book" -> 12, "Theorem" -> 15] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwZAFjHz8IIqJiYGRlZmRgZmJmYWZEQCX/QK2
       "], 
      Association["Book" -> 12, "Theorem" -> 16] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwdAFjIwgxMDIwMTMxMTEwATkMgAAlasCjg==
       "], 
      Association["Book" -> 12, "Theorem" -> 17] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwRAGBUDMyMjIwMrCxMDAxMLGwcQCAJ4lAxI=
       "], 
      Association["Book" -> 12, "Theorem" -> 18] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwRAHjIyMDKwsTAwMTCxsHEwsAJXVAqI=
       "], 
      Association["Book" -> 13, "Theorem" -> 1] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwZAGnGw6GqwMrKyczCysrNyMzAwAnDoDFQ==
       "], 
      Association["Book" -> 13, "Theorem" -> 2] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwdAGbJycTIysLEzMTIzMbIzMDACXOgKz
       "], 
      Association["Book" -> 13, "Theorem" -> 3] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwRAHnJysDMwszMwsjCxsjMwMAJb7ArE=
       "], 
      Association["Book" -> 13, "Theorem" -> 4] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwVAHeqwMbKyczCysrLyMzAwAmRQC4w==
       "], 
      Association["Book" -> 13, "Theorem" -> 5] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwZAHrAxsLJzMLKysXIzMDACWegKx
       "], 
      Association["Book" -> 13, "Theorem" -> 6] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwTAAHCycoszMLGzCdgIAl3MDFQ==
       "], 
      Association["Book" -> 13, "Theorem" -> 7] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwXACjAwAlOQCew==
       "], 
      Association["Book" -> 13, "Theorem" -> 8] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwXAAnOz8LCwsbOwsrACWdQK7
       "], 
      Association["Book" -> 13, "Theorem" -> 9] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwbAAcuycwixsrCwsAJcgAtI=
       "], 
      Association["Book" -> 13, "Theorem" -> 10] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfAAbBx8rBxsLKwAlfYCsg==
       "], 
      Association["Book" -> 13, "Theorem" -> 11] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwTABTCzMTLncTgCWvgM/
       "], 
      Association["Book" -> 13, "Theorem" -> 12] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwXABPNxsLCzMAJWQAqI=
       "], 
      Association["Book" -> 13, "Theorem" -> 13] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwbABzBxcLGwAlT0CmQ==
       "], 
      Association["Book" -> 13, "Theorem" -> 14] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfABHCzsDACVHAKN
       "], 
      Association["Book" -> 13, "Theorem" -> 15] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwTACLKwMAJT4AoM=
       "], 
      Association["Book" -> 13, "Theorem" -> 16] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwXAC3EEAlUoC1w==
       "], 
      Association["Book" -> 13, "Theorem" -> 17] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwbACGgCVCgKi
       "], 
      Association["Book" -> 13, "Theorem" -> 18] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
       "]],
     SelectWithContents->True,
     Selectable->False], "}"}]}], ";"}]], "Input"],

ListLinePlot[Values[res][[{316, 353}]], PlotRange -> All,
 PlotStyle -> {bookColorIntense[10], GrayLevel[0.85]}, Frame -> True,
 PlotLegends -> {10.41, 10.78}]

By the way, adding a superaxiom can not only decrease the number of intermediate theorems used in a proof; it can also decrease the “depth” of the proof, i.e. the length of the longest path needed to reach an axiom (or superaxiom). Here is the average depth reduction achieved by adding each possible theorem as a superaxiom:

proofDepth
CloudGet["https://wolfr.am/PJKo9Lnq"];

(*
proofDepth[g_] :=
 Module[{pp, newpp, len},
  Max[
   Function[v,
     newpp = pp = FindPath[g, VertexList[g][[1]], v];
     len = Length[Replace[pp, {} -> {{}}][[1]]];
     While[newpp =!= {},
      pp = newpp;
      newpp = FindPath[g, VertexList[g][[1]], v, {++len}]
     ];
     Length[Replace[pp, {} -> {{}}][[1]]]
    ] /@ Rest[VertexList[g]]
  ]
 ]
*)

(*
resDepth =
 ResourceFunction["ParallelMapMonitored"][
  Function[t,
   t -> (
     If[Order[#, t] != -1, 0,
        With[{g = Subgraph[euc, VertexOutComponent[euc, #]]},
         Catch[
          (proofDepth[g] - If[! GraphQ[#], Throw[0], proofDepth[#]] &)[
           PruneSubgraph[g, Subgraph[euc, VertexOutComponent[euc, t]]]]
         ]
        ]
       ] & /@ Complement[VertexList[euc], axioms]
    )
  ],
  Complement[VertexList[euc], axioms]
 ];
*)

resDepth = (stored result elided: a list of 465 rules, one per theorem, each of the form Association["Book" -> b, "Theorem" -> t] -> CompressedData["…"], about 5.4 MB in all)
1:eJxTTMoPSmJkYGC4CCIGCWBkYmAdaDcgASYgAJID7QwyAACmRQKM
      "], 
     Association["Book" -> 5, "Theorem" -> 20] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGB2AC4sHjGrBbBpNzSAAAmi0Cfw==
      "], 
     Association["Book" -> 5, "Theorem" -> 21] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGDxhMrmECo6EIAJkKAn8=
      "], 
     Association["Book" -> 5, "Theorem" -> 22] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGATbAAkZDEQAAmqYCgg==
      "], 
     Association["Book" -> 5, "Theorem" -> 23] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCIGDWBkGmgXIAMmkGsGlYuIBACc/wKD
      "], 
     Association["Book" -> 5, "Theorem" -> 24] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGATbAysjIysAwFMMHAJ48Aoc=
      "], 
     Association["Book" -> 5, "Theorem" -> 25] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGATbAxMzMBCQH2hlkAACdfwKG
      "], 
     Association["Book" -> 6, "Theorem" -> 1] -> CompressedData["
1:eJzNkMENQjEMQ4NEm8RNmzACKzHCX4BZ2Yi0fD43Thx4B1exrUTqdbvfthMR
Pab8ExE2iaRFuFsYWTjcAZiNX9+rX1NgSrYqDvRFWqnL2AMpKwPekqUBHbo6
e0sEKoeTYU4rWtdA4Do6idKFS++mhfuw1jiXpRQGmOWDiriP/JzzXJidls8T
yZsKeQ==
      "], 
     Association["Book" -> 6, "Theorem" -> 2] -> CompressedData["
1:eJzNkNENwjAMRI1EEvsSx2qZgJUYoQswKxtxCaX88cUHT8pF9p1sydftfttO
IvIY8lesMbiQtpBYQvhaBICI/ut1+asLUCpTGQf2gi3qbOyGpukBb2Gow7rN
zJ5ShenRoclqWnMbBCV3FzVZS3Jvlor3VmvhMAorL0U/mCqPwuOcx0BmnN8T
+hcKqg==
      "], 
     Association["Book" -> 6, "Theorem" -> 3] -> CompressedData["
1:eJzNTsERAjEIxBkTYInhzrMBW7KEa8Ba7UiIZvz58uHOsEl2gc11v9/2AxE9
kv4Lnris29Z8WdxPRtYd7qoK779Oq19dIEijq2JAs14IKX/0FhNShgdMiqYO
7WNojolAZSppxmssyCBlYua4s9CZizUrXIBmcTJbVBEpLB/U2OfdHUeYqKll
zhOtgAov
      "], 
     Association["Book" -> 6, "Theorem" -> 4] -> CompressedData["
1:eJzNj90NwjAMhA2N2/gvUbIBKzFCF2BWNuIaSh+Q+sYDJ+Uk332yldv6uK8X
Inpu9mdqvfcKlQiz7o1aq16rubcevz42nzZpmpLqlchBzXpI3kIEH8FeZB6d
6sdE2FVcBrNTOavkI0GJaVS4KGKkyxJwpy4c4cKMf6sylm0sl8LyJTW8FOZi
YLH3Bcq3CoU=
      "], 
     Association["Book" -> 6, "Theorem" -> 5] -> CompressedData["
1:eJzNTdsNAjEMC1zbS9yEqMANwEqMcAswKxuR3gOJD/74wIqsyLbs2/y4zwci
enb6N0yXCYHmTfVaGzW4udeKdrZfb5WvThqGBByJNFIFUKyQFSEFL8JmcF48
YCeRLFtI9hQzhN9Kz/DaEovxEMYxfkhcNlPJBRrLhZmtci7uRT4RLRFIPWz1
1Htfo68KWw==
      "], 
     Association["Book" -> 6, "Theorem" -> 6] -> CompressedData["
1:eJzNjsENwjAMRQ1NXPvbIYoKA7ASI3QBZmUjnFRF6oEbB56ip+T/KM59fT7W
ExG9uv6O5Yag1eZ+tUYN1WvtyVJ+PYq/NmmaktmZyOMWA44N3YgoPIIuAySP
Dtilmnur2B2IQOWThOI0qpioaoR5jj00Vi7FNTM8JrOIFJPMtbIewfhaKu5a
7NKffAOKlApL
      "], 
     Association["Book" -> 6, "Theorem" -> 7] -> CompressedData["
1:eJzNjskNAjEMRY00SbwyiYACaIkSpgFqpSO+M4gbtznwlThenmPft+djOxHR
K83/6aZQ78P9aoOGrlBEjEscPan+rCw4ZnBWUFXVsZKZquxCChaxSS4Lh8us
TeKDFbyakOwUIFbh9JGfBtEsYVBrTFzQQ01IW0mqVXPHNGYO4VZ7z8RX7LkW
7hJnlzDPL99+wgpW
      "], 
     Association["Book" -> 6, "Theorem" -> 8] -> CompressedData["
1:eJzNjcsNwkAMRA2Jszv+yCyiAVqihDRArXSEdxOQOHDjwDuM5JmR57reb+uB
iB5d/hBN2qmZXaJR07CI7pz910PL12SeplnkSGTZWkRMNrCRVuow9qDyyERe
ArAJDKOzt2oV1LeTYV4jykVASUrxVJCC3Q3MarnMpfuFOYLxiUhvzG4G1+h/
n2gaCjQ=
      "], 
     Association["Book" -> 6, "Theorem" -> 9] -> CompressedData["
1:eJzNkMsNAjEMRI1E4m9i7aYCWqKEbYBa6YghLMuNEwdGykieebKlXLbbdTsR
0f1p/6gxRixQLkl4kamqmf3Xd+rX1gzmoKod0pcQwWewF1JmZ/Y2QN2062R2
SsRUjgQlplnNa0bGtTcSpZVLRGjh1sOdsQyGqTHLRyqCT8m0s7mpqzeAD2pQ
Cjg=
      "], 
     Association["Book" -> 6, "Theorem" -> 10] -> CompressedData["
1:eJzNjcsNAjEMRI1E4l8cK7sV0BIlbAPUSkcMYVlunDjwJI/kmZF92W7X7URE
96f8JesaA+RIwrRMVc3sv35Tv6ZmEEer2oG+gAWdxh5ImZnZW1Dqpl1nZ2+J
mMrhIMQ2o/nNyLj2IFFauEQ0LRy9uTOOQbAFs3xQkZF9DDubm7p6oPgAVOIK
Lg==
      "], 
     Association["Book" -> 6, "Theorem" -> 11] -> CompressedData["
1:eJzNkNENAjEMQ4NEm8RpGx1iAVZihFuAWdkIU47jjy8+sFRLsZ8SqZf1dl0P
InJ/2n/q3BYqlxS+lunumePXV+rXFqAFqYpd/hIj+gy2wsrsgLcRGvDhk9ko
M7jtCUtOs5rXINA6upjLSUtrzYv20SKUy2icuqp95Gb8lEwcEfDw6AQfPLUK
Ew==
      "], 
     Association["Book" -> 6, "Theorem" -> 12] -> CompressedData["
1:eJzNkNENAjEMQ4NE28RJG91twEqMcAswKxthynH88cUHlmop9lMi9bLdrttJ
RO5P+1PFQuWSwheZZpY5fn2kfm0BmpOqOGQvMaLPYC+0zA54G6EBGzaZnVKF
6ZGw5DSreQ2CVkcXNVlbiQgrrY9wb1xG49Rb049MlZ+SiTMc5uad4AMkLQn/

      "], Association["Book" -> 6, "Theorem" -> 13] -> CompressedData["
1:eJzNjcsNAjEMRI2E1/EnjqWtgJYoYRugVjpiCMuKEycOzGEizxvHl+123U5E
dH/av2qMiFWLSsfI7N2r/Nc3lq/UDJZoLXZIX0IEn8EOGk9m9jZVdlPX2dlb
rZm2IwHENBEOMQsJc+/ESi6MbgjGEGER6Y63ivVDWA6zTDv3DE0fgX8f+DEJ
wQ==
      "], 
     Association["Book" -> 6, "Theorem" -> 14] -> CompressedData["
1:eJzNTcENAjEMy0m0idOoQtcJWIkRbgFmZaNzenASH148cFWrsV3ntj3u2yIi
z6S/xVjXuIbwIgIg9V+vqF9ddxKYqn4CB0pJ5shzwKaCdyBRu6Onf6bMHPZS
kCan+YuLVFU0O8VMAqW1Bmc+3CmjK4ryqfaB4X0Mv8xCNBZiB9YCCaE=
      "], 
     Association["Book" -> 6, "Theorem" -> 15] -> CompressedData["
1:eJzNjdENwjAMRI1EYp9jpZANWIkRugCzshHnFCrxwxcffVFOse9i39bHfT2J
yDPluIwRlxBeRACU/u8N9afrTgFT1XewUUoqS54Nmx18AkldHEv6e8rMYe8O
0mQ1f3GRqormTDGTQGmtwZkPd7bRFUX5VPvi6n0MP8+BaJ3yAr/oCY0=
      "], 
     Association["Book" -> 6, "Theorem" -> 16] -> CompressedData["
1:eJzNjksOAkEIRDGRhio6nUnGC8yVPMJcwLN6I+lPTNy4cmEtCFS9AMf5uJ8X
EXn28se6cQvhRpAAItqvD5SvKZkFSRW+hSnV3g1jBT6c+SkmVhrRMJhFuRO+
HPQwp8HnIVWIAZViLoRGRMZEdU8bzaBWaeYf2mvbd177wkDkQrwAnbAJdA==

      "], Association["Book" -> 6, "Theorem" -> 17] -> CompressedData["
1:eJzNTsENAjEMCxJpY6eqTuoErHQj3ALMykYkbT98ePHAj0ixLduP63leNxF5
5fln8KD4QZAA6P3X+eV7O+MgXMXduYAF1bkoiS3YZNZSLFvpRMf0bJcZYZtB
ivElPIpUITUDpJoQGp0hE80saPQKrY212gdG62PwnoEOj0C8AYYuCWE=
      "], 
     Association["Book" -> 6, "Theorem" -> 18] -> CompressedData["
1:eJzNjdENwjAMRA1NXPtiE4UNWKkjdIHOykY4KUXio398cIqekruL/Vi3Zb0Q
0bPjr1UaNVSrFUC7+6/H82mSpikBVyKLFgOGXborrOAw3oHkkQEHVHNPFQd7
SaDycQLxGlFsVC2EeY47NE52N81cLDaziHiRzLWyfiv+RiO5mXq59ZEvs9sJ
nQ==
      "], 
     Association["Book" -> 6, "Theorem" -> 19] -> CompressedData["
1:eJzNkMsNAjEMRB2JxJ7Jb5cOaIkStgFqpSOc7ILEgdsemINlzxvZkm/b474F
EXmO8t/q0tdaagXYezt7e/qNQhDS/1M9lcjMXdgV4+imcQCbDsh3AVIjGmbm
SJkR9nEc+jSRH1RVUaBQzOSKmHN2TBbSbTRF1FpV7UsL2rrwMhZm5O7BF5Vy
CXw=
      "], 
     Association["Book" -> 6, "Theorem" -> 20] -> CompressedData["
1:eJzNTcENwkAMC/TSJE7KcUIMwEqM0AWYlY3IXVV48eOBpViJbTm39XFfD0T0
7PTvaO7V41Tr5Xr+dff81SnTVIAjUc3UDMAxYBuY+zaETYcOxYCdMtRdw849
pDB9K0l5DSs/iggJc4BUKIQdbiyIUBVVXUxZWhOPDzQC5gGUxcNyPCtfoeQJ
ng==
      "], 
     Association["Book" -> 6, "Theorem" -> 21] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 6, "Theorem" -> 22] -> CompressedData["
1:eJzNTckRwkAMMySO5WPJMqmAlighDVArHeF1mOHFjwd6yLakkW/7476fiOg5
6O9hcQlvbd22/uvq5aszT9Psfia6ZmqxgruZHmAeW96u5ZihFK3EO8Y5bYT0
SGUIphh76kV5lZUfRYQke40gFMKe3SwWAQiApmDpXTw+wKAUbG4R2jzyj74A
jHEJlQ==
      "], 
     Association["Book" -> 6, "Theorem" -> 23] -> CompressedData["
1:eJzNjd0NAjEMg4N0bWKnP+KYgJUY4RZgVjYiLcdJPPDGA1Zlpf6s5Lrdb9tJ
RB7D/l+11Eqi9/brzfkrWeKRMXi0Mg/hpZTGNIMd2ExAvg3IjWiYnb1lRtiR
BIzfRHFIVUWBQjGTFcndA5OFjBhNkbRWVftQZ7ucuRSMk97DnmCGCVM=
      "], 
     Association["Book" -> 6, "Theorem" -> 24] -> CompressedData["
1:eJzVUMERwjAMMzSJLTtOr3ddgJUYoQswKxvhpAVe/fFBDzmydMqdb9vjvl2I
6NnpD1DXFcA8t18Xl1MnTVNSvRIZERcz0x3YUUp/jcVhSB6e6psi1BQNI3Ok
RBTy2YQZqsv4hGKQcmlOAlo4u1dk9lbNOMqcu3Jm+QIicZRl0dQLAbOgF229
CXE=
      "], 
     Association["Book" -> 6, "Theorem" -> 25] -> CompressedData["
1:eJzVjcEVwkAIRNHIAgOb+IwN2JIlpAFrtSPZTaInb16cA2+Y+Q9uy+O+HIjo
2cY/CDZf56jnX98t36thIOBIFEkVd8cqW8XcXA+2QntiwD4Ssg2ynVKF6Ttp
jHbA8yGzkDC7kxhdhGtUzxVRCosIXDKr7PGRRYzmGHGKCRZp8+4LU7IJTg==

      "], Association["Book" -> 6, "Theorem" -> 26] -> CompressedData["
1:eJzVTcERwjAMMzSJLcdx6B0LsBIjdAFmZSOcttAXPz7oIVuWTr4tj/tyIqLn
oL/AFUDv/uva8tVJ05RUz0SViIuZ6QZsKGVs62E3JK+e6psi5ArHmtlTIgr5
XMIMNaTFxxikXLyRgC6cWzNkbm61cpQ1HqoxywGIzN37rElNYag1gi9KfAla

      "], Association["Book" -> 6, "Theorem" -> 27] -> CompressedData["
1:eJzVkMEVwyAMQ90GsC1smKErdYQskFmzUQ1J21NvvUQHg6z/xHs81u253oho
H+MaUqD39u/W8jNJy5KAO1ENqpgZDumhUsZtLs5A8syA9wioQZtO5qREoPLZ
RBhuWIsX4yBwaU6i1Dm7m2b2ZrVylBkP58zylYrEp/SOBBuV6gG+ADQPCT8=

      "], Association["Book" -> 6, "Theorem" -> 28] -> CompressedData["
1:eJzVjUEOwkAMAwMl2Th1BEJq73yJJ/QDvJUfsbtt4cSNC5ZiRfYouS2P+3IQ
kWezP9E8kZdfH7Xv1TBIxFGElTKSsQqrVNvWg63wniBitwphg7BT7gF/J43x
DtQnoqptIsQgV9Nk9jc001JKRCmaqSM/AisDZJx4Bphju/sCKeAJIA==
      "], 
     Association["Book" -> 6, "Theorem" -> 29] -> CompressedData["
1:eJzVjbsRAkEMQw2H12tZe8cQkNMSJVwD1EpHeO8DERkJCjQe6Y11mx/3+SAi
z27/oit5/vXP8r0aBgGOIkyqkMQqX6XaryXYirokDuyWkG+Q71St8PpOOlMX
IEdE1cRUI8RcLqaNLWcCLEXNDLDMmgY/cnL0wIgTJ3iLiPz7AhRJCRs=
      "], 
     Association["Book" -> 6, "Theorem" -> 30] -> CompressedData["
1:eJzVTcsNwlAMC5T0JY4jKjZgJUboAszKRuS9Fjhx44KlRPFH8XW939aDiDz6
+huQy69fzt+taRLgWK2VmsnEBt+g2q8h7IYNxYHXqlDAw0dmT5nB7a2UWazT
KpHWWh93UZdL00yyU5pp6UBzJTX4gZGByMTpTDgjqs2fAB4JFA==
      "], 
     Association["Book" -> 6, "Theorem" -> 31] -> CompressedData["
1:eJzVTsENwkAMC1JzOTtBJ9igKzFCF2DWbtSkpfDixwcrZ8V2lNy8PB/LRUTW
ov/BGL/e2L4mUxaZTb7W3J0HcEC1ut14BX13QJ6UQ5WCJ9dQJ/rbSUpV0vMQ
YGKqQUGXu6nT6wwjg8QVpoabeXzQI/JfwzkBKTxq5QbvWQj7
      "], 
     Association["Book" -> 6, "Theorem" -> 32] -> CompressedData["
1:eJzVjcENAjEMBA2XnL1rR0oLtEQJ1wC10hGOT4gXPz6MIitxVrO343E/LiLy
XOOPmL8W7l9/2rY18lqd+07S6xAnva8b14InVhuQ75GhIAKVgRdmhCE98BUK
5KsE2aiqoumlmMrQnnF0ZYSZmlnAus6pHh8sVkOQbfjA8EgrXuyaCQw=
      "], 
     Association["Book" -> 6, "Theorem" -> 33] -> CompressedData["
1:eJzlTdsNgCAQOxOBewDO4EqOwALO6kYWotH7cQGb3KPXQte2b20ioqO3PyN+
qqpoAldUQN4IwVGI7C/DZCLmLsyoh9rNdKQlwkcITUxZw1IrVJUrOCcJ/X1i
h6JWisx9hTVjnLAjCHo=
      "], 
     Association["Book" -> 7, "Theorem" -> 1] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGFGBkHHJOpjcAAJhIAn0=
      "], 
     Association["Book" -> 7, "Theorem" -> 2] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGFmBiGmgXDHYAAJloAn4=
      "], 
     Association["Book" -> 7, "Theorem" -> 3] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGGGAaaAcMdgAAlyQCfA==
      "], 
     Association["Book" -> 7, "Theorem" -> 4] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 7, "Theorem" -> 5] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGHGBEB2ARoDhIClMWG2BAplHMBpuBbBedPUc5AADe
BQLH
      "], 
     Association["Book" -> 7, "Theorem" -> 6] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGHmBkZGACIiYGIGZkBNKMQAAUBsuASSTAxIgFMCBh
lDAAcpgYUa0aagAA3vACyA==
      "], 
     Association["Book" -> 7, "Theorem" -> 7] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGIGACo1GAFQAAmVACfg==
      "], 
     Association["Book" -> 7, "Theorem" -> 8] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGJhi6Lqc1AACV/AJ7
      "], 
     Association["Book" -> 7, "Theorem" -> 9] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGJGBiYGRiYoRiBkYgYGAAIUYwGxkwoQuAFSPTyOYC
OUzIAoxDL4QA4bYCyw==
      "], 
     Association["Book" -> 7, "Theorem" -> 10] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGKmBhAWImIGBhYAQCBgYmBiCDgZEBzIUAkCyIZmFi
RAEMEMwE1YkAIGOQBRiHXggBAO5yAtg=
      "], 
     Association["Book" -> 7, "Theorem" -> 11] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 7, "Theorem" -> 12] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGLmBiIl4UizpqOmUQAQCe5gKE
      "], 
     Association["Book" -> 7, "Theorem" -> 13] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGLGABIiYgYGFgBAIGBiYGIIOBkQHMhQCQLIhmYWJE
AQwQzATViQAgY5AFGIdeCAEA6hIC1A==
      "], 
     Association["Book" -> 7, "Theorem" -> 14] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGMmBlAgJWBkYgYGBgYgAyGBgZwFwIAMqygWhWJkYU
wADBTAxMTIwoYQAyBlmAceiFEADsTALX
      "], 
     Association["Book" -> 7, "Theorem" -> 15] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGMGDGKohVlEjNwwAAAKCmAoY=
      "], 
     Association["Book" -> 7, "Theorem" -> 16] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGG2BmJlIdbZ0xYAAAnWcCgw==
      "], 
     Association["Book" -> 7, "Theorem" -> 17] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGNGACAjYGRiAAshmADAZGBjAXAoCyrCCalYkRBTBA
MBNQPyNKGICMQRZgHHohBADnBgLS
      "], 
     Association["Book" -> 7, "Theorem" -> 18] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGNmBhYmJjYAECIJOBgZkZiICAhQVMMbOxsbGCaFYo
HwYgqphZGJiYgBwkAOQwIQugyg4JAABadQNP
      "], 
     Association["Book" -> 7, "Theorem" -> 19] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGOGBiYmJgAQIGBiBiZgYiZmZGFhZmGGACC7AyowAG
CMUCppFNA3KYwAKMYMSAKjskAABE6gM6
      "], 
     Association["Book" -> 7, "Theorem" -> 20] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGOmBmYmAGAiADiFgYWJiYWIBcFhDBzAQEIJqRkRkF
gHSAlEF1IhnGwMDEhMRnYaGzbygHADl2Ay4=
      "], 
     Association["Book" -> 7, "Theorem" -> 21] -> CompressedData["
1:eJzVy9ENACEIA9DSFPdwJUdwgZv1Njrqj94IvhBCQ9rnM2YAeL2uR6gAHkGk
MrVkoY8I/bjhB1pzc6tAnlm4zQdqfQNk
      "], 
     Association["Book" -> 7, "Theorem" -> 22] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGAWAEAgYGJhCDgRHChQAmJiZWEM3KxIgCGCCYiYGJ
iRElDEDGMKKYTGevUA4A2CACxA==
      "], 
     Association["Book" -> 7, "Theorem" -> 23] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 7, "Theorem" -> 24] -> CompressedData["
1:eJzdj8EVgDAIQ1NaknYMV3KELuCsbtTCybsH3/MDCeHGMa9zFgB3yB8YYwAC
SLBWuuR7JJLp1aio3pUgbuFmWx7sYB5LyUZrn/zzhgXesgPj
      "], 
     Association["Book" -> 7, "Theorem" -> 25] -> CompressedData["
1:eJzdj9EJgEAMQ3NtmuoYruQIt4CzutFdBcF/PwRfQ1tCKHTrx94bgLPaL1gX
IAEJclcEp8iQVJNuYtUNyiMTZpnPO8S0ammXZu6Td94wAIljA4U=
      "], 
     Association["Book" -> 7, "Theorem" -> 26] -> CompressedData["
1:eJzdj+EJgFAIhO/pebZGKzXCW6BZ2ygNgv73I+jzUDkOwXXu2xwAjm7/YAES
kCB3RbBEhqSedBO7btAemTDLfJ4hyuplXKrcJ9+84QSAAAN8
      "], 
     Association["Book" -> 7, "Theorem" -> 27] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGDWBnYGBjY2BjZmZjZWdnBWJ2djY2NjDNzMTGDoIc
HOxgwAASA9FMTEAC1QwmVhCDEYwYWFgGxCuUAADCiQPI
      "], 
     Association["Book" -> 7, "Theorem" -> 28] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 7, "Theorem" -> 29] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 7, "Theorem" -> 30] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGE2BjY2BjZmZjZWUBIhYWVjY2NhDNwszExgKCMMAA
EmNh4WRgYmJnRzaAhQEoBGIwghFQ3YD4gxIAAHIpA28=
      "], 
     Association["Book" -> 7, "Theorem" -> 31] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGESDPO8MsEACWtgJ8
      "], 
     Association["Book" -> 7, "Theorem" -> 32] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGPBhmgQAAlbECew==
      "], 
     Association["Book" -> 7, "Theorem" -> 33] -> CompressedData["
1:eJzdyYENgDAMA7Aky7a+wUucsAe4lY/oJiHBCWC1UaNu49gHAZwzfiUQpYTk
1myLZO2JDIe7b8gtdoWV5SGLNA+uwfv7CRdjZQNf
      "], 
     Association["Book" -> 7, "Theorem" -> 34] -> CompressedData["
1:eJzdj7ENgFAQQuEO9N8YruQILuCsbqRfY2JvYeILoYBQMC3rvBDA1u1nVGZF
usp2kvRwQDY3j75BL6WEIuK5FnAFPAXpkxNv2AFslwNp
      "], 
     Association["Book" -> 7, "Theorem" -> 35] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 7, "Theorem" -> 36] -> CompressedData["
1:eJzdj8ENgDAUQuF/0HYMV3KELuCsblSrMfHuwcQXwgHCgaVtayOAfdjfyKyR
rtV2kvR0QBYXz77BKKWEIuI5FnAFPAXpkw9v6GOOA2A=
      "], 
     Association["Book" -> 7, "Theorem" -> 37] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGHWCmqrIhBwCaZwKA
      "], 
     Association["Book" -> 7, "Theorem" -> 38] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGKmAeaAfQCAAAl2oCfQ==
      "], 
     Association["Book" -> 7, "Theorem" -> 39] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGIWBiZeXkZGVlZWFkZGQFAUZGDlYOVnZWGGAAYmYW
FmYGFiYmJmSdLEDNYAFGMGJgYRkQD1ACAFPWA1A=
      "], 
     Association["Book" -> 8, "Theorem" -> 1] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGI2BlAQNWIGADMRgZWVAAAxAzQ2kWZI1ADhMTMp+F
YagBADvWAzY=
      "], 
     Association["Book" -> 8, "Theorem" -> 2] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGJeBkY+Pk5ORmAwIOIGBmYmVnZWdhB7HZgYABiJnZ
2VkYWIAAWR87AwMTK4jBCARAClV2SAAAnxIDpQ==
      "], 
     Association["Book" -> 8, "Theorem" -> 3] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGJ2Bj5+Hh4mFkZGTlAQJGRhYOFg5WDi4ODk4OIGAA
YmZ2diYGFhYmJmRt7AwMTOwgBlAnKHBYWAbE9ZQAAKLzA6o=
      "], 
     Association["Book" -> 8, "Theorem" -> 4] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGKeBkY2NjYmRkZAXSbIyMHKwcrOxANisYMAAxMwsL
MwMLExMTsi4WBgaIACMYMbCwDIjjKQEARbsDQg==
      "], 
     Association["Book" -> 8, "Theorem" -> 5] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGK2BjY2VmZGRkZQMCRkZ2NiAEsllZgYgVLMnCwsTA
wsKIEgYsDAxMTCAGUCdIgoVlQNxOCQAAQCsDPQ==
      "], 
     Association["Book" -> 8, "Theorem" -> 6] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGLeDjYWNkZGTlBQJGRhYOEOTi4ODkAAIGIGZmZ2dh
YAECZD3sDAxM7CAGUCcocFBlhwQAAJCEA5g=
      "], 
     Association["Book" -> 8, "Theorem" -> 7] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGL+BhY2RkZOUFAkZGFg4Q5OLg4OQAAgYgZmZnZ2Fg
AQJkLewMDEzsIAZQJyhwUGWHBAAAgwIDig==
      "], 
     Association["Book" -> 8, "Theorem" -> 8] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGMWBjZGRk5QUCRkYWLiDk5ODg4uEAAgYgi5mdnYWB
BQiQdbAzMDCxgxhAnaDAQZUdEgAAf3kDhw==
      "], 
     Association["Book" -> 8, "Theorem" -> 9] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGM2ADAg4gYGZiZWdlZ2EHsdmBgAGImdnZWRhYgABZ
AzsDAxMriMEIBEAKVXZIAABl+QNq
      "], 
     Association["Book" -> 8, "Theorem" -> 10] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGNWBnZ2UGAlYWZhTAAKFYGJiYgBwkAOQwMSHzmRmG
GgAAA+IC+Q==
      "], 
     Association["Book" -> 8, "Theorem" -> 11] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGN2BnZQYCVhZmFMAAoVgYmJiAHCQA5DAxIfOZGYYa
AAD9LgLy
      "], 
     Association["Book" -> 8, "Theorem" -> 12] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGOWBlBgJWFmYUwAChWBiYmIAcJADkMDEh85kZhhoA
APaQAus=
      "], 
     Association["Book" -> 8, "Theorem" -> 13] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGO2AEAlYmRhTAAMFMDExACRTFDAxMjCia6exaygEA
urYCpg==
      "], 
     Association["Book" -> 8, "Theorem" -> 14] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGPeAV4GVkZOEAQS4ODk4OIGAAYmYgYmBhYWJCVsvB
wMDEDmIwAgGQYmEZECdTAgBfIQNk
      "], 
     Association["Book" -> 8, "Theorem" -> 15] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGP+AXYGRk4QBBLi4OTg4gYABiZnZ2ZgYWFiYmZKXs
DAxM7CAGIxAAKRaWAXExJQAAVC0DWA==
      "], 
     Association["Book" -> 8, "Theorem" -> 16] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAOBlZGThAEEuDg5ODiBgAGJmIGJgYWFiQlbJwcDA
xA5iMAIBkGJhGRAHUwIARB4DRw==
      "], 
     Association["Book" -> 8, "Theorem" -> 17] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAmBkZOEAQS4uDk4OIGAAYmZ2dmYGFhYmJmSF7AwM
TOwQHYygwGFhGRD3UgIAN2sDOQ==
      "], 
     Association["Book" -> 8, "Theorem" -> 18] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGBGBiRAEMEMzEwASUQFbHCFKKzB96IQQAsmACnQ==

      "], Association["Book" -> 8, "Theorem" -> 19] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGBmBGAVAuCwMTE5CDrIwBKISijc7upBwAAN5lAtE=

      "], Association["Book" -> 8, "Theorem" -> 20] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGCGDhYuFkgQEGIGZmYWFiYGFiRAkDFgYGJiYQgxGM
gOoGxLGUAAD1qgLs
      "], 
     Association["Book" -> 8, "Theorem" -> 21] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGCmDhY+Hm5ODh4eLk5GTg5eJk5uFhZmBhYWJCVsTD
wMDEBWIwAgFIE8uAuJUSAABP2QNX
      "], 
     Association["Book" -> 8, "Theorem" -> 22] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGDGDhZIEBBiBmZmFhYmBhYkQJAxYGBiYmEIMRjIDq
BsSplAAA6PYC3g==
      "], 
     Association["Book" -> 8, "Theorem" -> 23] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGDmDh5uTg4eHi5ORk4OXiZObhYWZgYWFiQlbCw8DA
xAViMAIBSAvLgLiUEgAAP5cDRQ==
      "], 
     Association["Book" -> 8, "Theorem" -> 24] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGEGBhZYEABiBmZmFhYmBhYkQJAxYGBiYmEIMRjIDq
BsShlAAA3igC0g==
      "], 
     Association["Book" -> 8, "Theorem" -> 25] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGEuDg4OHh5OLiYuDj5GLm5WVmYGFhYkJWwMvAwMQD
YjACAZBiYRkQd1ICADfGAz0=
      "], 
     Association["Book" -> 8, "Theorem" -> 26] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGFODg4OAEYgZOTg5mTk4mBhYWJiZkeU4GBiZ2EIMR
CIAUC8uAOJMSAAAQHQMO
      "], 
     Association["Book" -> 8, "Theorem" -> 27] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGFuDk5AACBiBm5uBgYmBhYUQJAw4GBiZ2EIORESzB
wjIgrqQEAAAEIgMA
      "], 
     Association["Book" -> 9, "Theorem" -> 1] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGGODjAAIGTg4OZi4uFgYWIECW5WJgYOIEMRiBAEih
yg4JAAAOWwMN
      "], 
     Association["Book" -> 9, "Theorem" -> 2] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGGuAAAgZODg5mTk4WBhYgQJbkZGBg4gQxGIEASKHK
DgkAAP+OAvw=
      "], 
     Association["Book" -> 9, "Theorem" -> 3] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGHODh4Wbg4uVh5uZmYmBhYUQJA24GBiYOEIORESzB
wjIgTqQEAAAIPQMG
      "], 
     Association["Book" -> 9, "Theorem" -> 4] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGHhDgY+Dj4WHm5WViYGFhRAkDXgYGJh4Qg5ERLMHC
MiAupAQAAA5GAw4=
      "], 
     Association["Book" -> 9, "Theorem" -> 5] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGIOBj4OPhYeblZWJgYWFECQNeBgYmHhCDkREswcIy
IA6kBAAAAGYC/g==
      "], 
     Association["Book" -> 9, "Theorem" -> 6] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGJODi4WHm4GBiYGFhRAkDDgYGJg4Qg5ERLMHCMiDO
owQAAOGpAtk=
      "], 
     Association["Book" -> 9, "Theorem" -> 7] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 9, "Theorem" -> 8] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGJuDiZObjZ2ZgYWFiQhbmZ2Bg4gUxGIEASLGwDIjr
KAEA60YC5g==
      "], 
     Association["Book" -> 9, "Theorem" -> 9] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGKOBh5uJiYmBhYUQJAy4GBiZ2EIORESzBwjIgjqME
AADTBALI
      "], 
     Association["Book" -> 9, "Theorem" -> 10] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGKmDmZWdiYGFhRAkDdgYGJnYQg5ERLMHCMiBuowQA
AMaSArk=
      "], 
     Association["Book" -> 9, "Theorem" -> 11] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 9, "Theorem" -> 12] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGLhBgZmBhYWJCEWJgYOIDMRiBAEixsAyIyygBAM8v
AsU=
      "], 
     Association["Book" -> 9, "Theorem" -> 13] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGMGBmYGFhYkIOBCEGBiY+EIMRCIAUC8vAuIwCAADE
JAK4
      "], 
     Association["Book" -> 9, "Theorem" -> 14] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGNGBiYmdH5rOAhEAMRjBiYGEZEGdRAgCwogKe
      "], 
     Association["Book" -> 9, "Theorem" -> 15] -> CompressedData["
1:eJzlkIENwzAIBEllAjzEO3SljJAFOms3CthKInWFnjDPA5Ylv4/PfixE9K30
1zAvv3/wuit3as2IkLibWZ3JKsMn2fSh65j6zHOrV1TnvogRF/1yyNdSSEVi
I1UKYQRYOCzMWERyJKzKog9ZY+uAtXIwlJzbDAij
      "], 
     Association["Book" -> 9, "Theorem" -> 16] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGNmBjYUHmAjlMTMh8FoahBgCr8AKY
      "], 
     Association["Book" -> 9, "Theorem" -> 17] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGOGBhQeExMDAx4ZQdEgAApwQCkg==
      "], 
     Association["Book" -> 9, "Theorem" -> 18] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGOuBE5rAwMDAxgRiMYMTAwjIgbqIEAACpQQKV
      "], 
     Association["Book" -> 9, "Theorem" -> 19] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGARJgYWBgYgIxGMGIgYVlgB1EOgAAofECjA==
      "], 
     Association["Book" -> 9, "Theorem" -> 20] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 9, "Theorem" -> 21] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAQMjI4RkHA7hAQCZkQKA
      "], 
     Association["Book" -> 9, "Theorem" -> 22] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGARQMk8AAAJWpAns=
      "], 
     Association["Book" -> 9, "Theorem" -> 23] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAQwwMTENtBOoAACZgAKA
      "], 
     Association["Book" -> 9, "Theorem" -> 24] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAQgwMg6XsAAAlz0CfQ==
      "], 
     Association["Book" -> 9, "Theorem" -> 25] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGARgwDpewAACWcwJ8
      "], 
     Association["Book" -> 9, "Theorem" -> 26] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAQQMl7AAAJWqAns=
      "], 
     Association["Book" -> 9, "Theorem" -> 27] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 9, "Theorem" -> 28] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 9, "Theorem" -> 29] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGARwwMQ20C6gAAJf0An4=
      "], 
     Association["Book" -> 9, "Theorem" -> 30] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAQIwD7QDqAAAly4CfQ==
      "], 
     Association["Book" -> 9, "Theorem" -> 31] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 9, "Theorem" -> 32] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGASpg4gORjEAApFhYBtg1pAMAqUoClg==
      "], 
     Association["Book" -> 9, "Theorem" -> 33] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 9, "Theorem" -> 34] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 9, "Theorem" -> 35] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGARpgRmIzMQ2YM8gFAJnpAoE=
      "], 
     Association["Book" -> 9, "Theorem" -> 36] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGATbACARAioVloB1CMgAAnVgChg==
      "], 
     Association["Book" -> 10, "Theorem" -> 1] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 10, "Theorem" -> 2] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGASZgHMIBAwCWWQJ8
      "], 
     Association["Book" -> 10, "Theorem" -> 3] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGARbANNAOIB8AAJZYAnw=
      "], 
     Association["Book" -> 10, "Theorem" -> 4] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 10, "Theorem" -> 5] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAQ7ANtAOIA8AAJksAoA=
      "], 
     Association["Book" -> 10, "Theorem" -> 6] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAXbAzsLAwDrQjiADAACgSgKK
      "], 
     Association["Book" -> 10, "Theorem" -> 7] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAQ7AwsDAOtBuIAMAAJtCAoM=
      "], 
     Association["Book" -> 10, "Theorem" -> 8] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 10, "Theorem" -> 9] -> CompressedData["
1:eJztjcENAjEMBH0Ske1dx748EG9aooRrgFrpiASQeNEB81jJs1r5etxvxyYi
jxV/fnI+bVuJmAEITqrG2PfMVOXMpXLKulS5LwO+fS4UlZhHJbNeuFfGx3CV
6WtCzGeqKtpaQEwltBH0pogwUzPrbk3HUMYXi4AzUKeewc4gOp5sEQo6
      "], 
     Association["Book" -> 10, "Theorem" -> 10] -> CompressedData["
1:eJztjssNAlEIRZlkXvhcHszbuLclS5gGrNWOBDVxZQeexU04QOB63m/nRkSP
jj+/2bctiUTM4Cgy1zqOiGBGZasomZdM1TaGt5/RQ5ZhVWQg8oVqhn8Muhna
K7C6xczEY7iRMDkPGHSwuYuwiEyVwWsx/Iu4m8It9xmOWT/a1Cdc2goj
      "], 
     Association["Book" -> 10, "Theorem" -> 11] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 10, "Theorem" -> 12] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAT7AM9AOIBkAAJ06AoY=
      "], 
     Association["Book" -> 10, "Theorem" -> 13] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 10, "Theorem" -> 14] -> CompressedData["
1:eJztjesNAkEIhDGRGxjYvcQObMkSrgFrtSPZXR8/bcAvYQLDBK7H/XacROQx
5M8vto0MrvKF6ug4DC5sOk6+pUJBD5+ZV8qMbh+nljXNVX0BIFDNFIVcoK1l
KrKnGcysuwH7jsgvPrV3nls0j+rq7hOxRAjb
      "], 
     Association["Book" -> 10, "Theorem" -> 15] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 10, "Theorem" -> 16] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 10, "Theorem" -> 17] -> CompressedData["
1:eJztjdsNAkEIRdnEXeZygdiCLVnCNmCtdiTDRP20AU8Iz5NwOx/3cxOR50x/
fnJ1D5Ag3rjPnuwlaRXRV668pASyN2y1JAMCH6Putsx6YmZiemSIQkJ3d8eu
UZ+oU9GabKiOLxgjIzNx6amo8gLS+wjR
      "], 
     Association["Book" -> 10, "Theorem" -> 18] -> CompressedData["
1:eJztjdsNAkEIRdnEXeZygR5syRK2ga3VjmSYqJ824AnheRLu5/U4NxF5zvTn
N+4BEsQb99mTvSStIvrKlZeUQPaGrZZkQOBj1N2WWT/MTEyPDFFI6O7u2DXq
E3UqWpMN1fEFY2RkJm49FVVeyCsIwQ==
      "], 
     Association["Book" -> 10, "Theorem" -> 19] -> CompressedData["
1:eJztjcsNAzEIRIkUGwZjLGsrSEspYRvYWrej4I+UawrIOyAz84Rf5/U+H0R0
j/HnB46mCyxyHq8Z7ELS7JaxtOwKx3S2JaKQnWCUsQ2O+CJSUs5eSUCNk5kh
cXUrhUOpHJsps3yBSG/euz7nQZQa4wOv0Qi0
      "], 
     Association["Book" -> 10, "Theorem" -> 20] -> CompressedData["
1:eJztjcsNAzEIRIkUGwZjLEvbQFpKCdvA1rodBX+kXFNA3gGZmSf8Oq/3+SCi
e4w/v9B0gUXO4zWDXUia3TKWll3hmM62RBSyE4wytsERP0RKytkrCahxMjMk
rm6lcCiVYzNlli8Q6c171+c8iFJjfACjbAii
      "], 
     Association["Book" -> 10, "Theorem" -> 21] -> CompressedData["
1:eJztjesNgDAIhDGx5VFLdQRXcoQu4KxuJLQ+/jqAH8klx11grftWBwA4XH4+
IR3uxOhq1qZDoQV3oZVUWD1/WkTCdG3YQ3PO3B+AYNQMxLBgyHnigFmnlNAq
JgFFEOmFiUrRUmRsBzm5nJV3CIE=
      "], 
     Association["Book" -> 10, "Theorem" -> 22] -> CompressedData["
1:eJztjO0JgjEQgytY7jN922sXcCVHeBdwVjfyioJ/HcAHLhxJyO183M9LKeW5
5c9vRMzVmjs2ZjN1zbmwGnAALNvpEaljfEoNhhUNC29YsoyZTmDsEII+xrDc
F6FCtaoWogKqaiqV1D0DZoZwpd7J/Au7K/KOK1p+5iqQF2ieClw=
      "], 
     Association["Book" -> 10, "Theorem" -> 23] -> CompressedData["
1:eJztjLsNAkEMRBcJy5/12LvcBaS0RAnXALXSEV4ISCmAJ2uCNyPfjsf9OLXW
niv+/Mh1zAgzLMxm5Swwwr1OdZmxbZVx+YwQa7QHJvyNapTbyuyYq4RizMys
98zcmMisEbfOJGZKbO4iLCKuQpzJ3b+IOyIgeUaH9aoU+gJJdAoi
      "], 
     Association["Book" -> 10, "Theorem" -> 24] -> CompressedData["
1:eJztjMsNAjEMRIOE5U88drK7DdASJWwD1EpHOHDgSgE8WXN4M/LtfNzPS2vt
ueLPr4wZYYaF2aycBUa416kuM/a9MrbPCLFGR2DC36hGub3MgblKKMbMzPrO
zI2JzBpx60xipsTmLsIi4irEmdz9i7gjApJXdFivSqEvOcwKCg==
      "], 
     Association["Book" -> 10, "Theorem" -> 25] -> CompressedData["
1:eJztjN0NAkEIhNdEwj97CxVcS5ZwDVirHcn64qsFOCFfyMDMeT0f122M8dr4
62dVhKqIi6tWsypLsr0elu1kZnMdDe+ncPXK8Op9iyXcOhaevvZxR9Za1uXM
OBDAbACORCAVBrQwIiQiYwKcE9W+4g/jmPeuElUTdn4DLR4KAQ==
      "], 
     Association["Book" -> 10, "Theorem" -> 26] -> CompressedData["
1:eJztjMsNAjEMRINE5PFnrIQlKzjSEiVsA9RKR2S1F44UwDvY1huNH9vruZ1K
Ke99/PkdMxy4D4C3Me5cM9MyVXfTloXkpc+TnB6JsSRXHjUNw3Q0EL6HDLTe
+3X+FtECkWCBFmq1MJUaGu5VVSla5674QoCWvXU/U81dPYz2Afe/CWA=
      "], 
     Association["Book" -> 10, "Theorem" -> 27] -> CompressedData["
1:eJztjTEOwlAMQz8SX2nyEzulrcTKlThCL8BZuREpCzN73xAlsWU/9tdzv7TW
3sc4+YN7OEDFGFkzk6nc5gMz1icQVCKhQJk2DmSsTHw9s9pCq4CVYB5iXcEl
rKJFpEnvZq1ru0mPcO9i7tOkBVRFq2L4jxLrrKYrHLWHD+oHP4EKRg==
      "], 
     Association["Book" -> 10, "Theorem" -> 28] -> CompressedData["
1:eJztjUESAkEIA8cqp1gYSNjV8e6XfMJ+wLf6I1kvnr3bhxSQFLnvz8d+aq29
DvnzC+ERVIyRpZlM5VwPzFiXQLlEQoEK3TiQMZn4ZFa1Kw3gJJiHWVvwElaf
RaRJ72ata9ukR7h3Mfdl0QKqolUx/EuZtVbTGY6awwf1DTDxCjE=
      "], 
     Association["Book" -> 10, "Theorem" -> 29] -> CompressedData["
1:eJztjLERAkEMA80MHvtsWf9D8DktUcI3QK10hI8LiMnZQMHK1v18Ps6LiLxm
/PmJY99JqmZnNswgi4wxTeXynGhX3X9yEREMog24dMyXjB42MzFVQNTkZloF
qIFwN3fncLNts8SXAQTQu9e+7aIYFW/gBQly
      "], 
     Association["Book" -> 10, "Theorem" -> 30] -> CompressedData["
1:eJztjLENAlEMQ4NE5FziWCc2YCVGuAWYlY3I5wpqel7h4jnx/Xg+jouZvVb8
+Y19l+RekzWoUmopsUzX6bXwqab/5ElmKsUx1KlzvVTOLgCDO2kOu8G7SQfF
CESEtgAkFL9sZCZn9zq3U7Sy8w3QtglW
      "], 
     Association["Book" -> 10, "Theorem" -> 31] -> CompressedData["
1:eJztjDEOQjEMQ4tEiJrGTtWRjStxhH8BzsqNcGFgZucpciLHye143I9Ta+25
5c+PXFmMmNIpatZaquglBwXp2OMnFOrk5Hqzna4QCQ5qqUJVpd6aeXMzoGlY
bshMczAvFxfsW+gjv0Qm+tD9mQNaAIF4ARq0CfI=
      "], 
     Association["Book" -> 10, "Theorem" -> 32] -> CompressedData["
1:eJztjMENQjEMQ4tEftQ0dqpuwEqM8BdgVjbChQNn7jxFTuQ4uZ2P+3lprT23
/PkVFiOmdIqatZYqeslBQTr2+AmFOjm53mynK0SCg1qqUFWpr2be3AxoGpYb
MtMczONwwb6FPvJLZKIP3V85oAUQiBcKjgnY
      "], 
     Association["Book" -> 10, "Theorem" -> 33] -> CompressedData["
1:eJztjEsOwjAMRI2EXcefpkosUthxJY7QC3BWboSzYt09bzHyaJ78PN6v4wIA
nxl/TtOHSH30qEmGe7gL1qFhbiPCbPSIntJdYnMfNYUJs+9l3ruHzfHGKW9W
8ikiAhKVAqjQFnSzmlWMiJhZlBcyI7Uf2bWUdV2vIhqa9Na+MjQKMQ==
      "], 
     Association["Book" -> 10, "Theorem" -> 34] -> CompressedData["
1:eJztzLENQkEMA9BDIrnLJXEC5EsIKlZihL8As7IRHxpqel5hybLk2/q4r7vW
2vMdf79biBLXyrQ8WwAGDElcygNxLou81nKiqZiVoZGFjzEAKaQGyt5jjHJP
t+2TiBoxizTSduzkZrHVaczcexftwu6s9sVmQ8Qc+zknVHQeEC8N1AnN
      "], 
     Association["Book" -> 10, "Theorem" -> 35] -> CompressedData["
1:eJztzTEOwkAMBMBDwo59Xm9OBHFp+RJPyAd4Kz/ikoaeOlOs5LVkP7f3a7uU
Uj57nP4gAi4dw4I5855pDkZnm/e+jeiL1GBd0YLoeXAjfeVo2op9SXskGnCc
lCKq7kWi3CZJYB5jhaqaWQ2bNFMDPwqEO5nXOl6FRzTyC+x2CZw=
      "], 
     Association["Book" -> 10, "Theorem" -> 36] -> CompressedData["
1:eJztjNENw0AIQ4lUZMxx1SkbdKWMkAU6azcKXNXmP995EjYGya/9ve2LiHxK
bq6wNpItYZvWvNcl96kl7IMc8/LNpHsO/4xfimxMEwOii5kE1MMVGgxSAeQL
aqawk9y9P7P0UcnpZQdL/AfT
      "], 
     Association["Book" -> 10, "Theorem" -> 37] -> CompressedData["
1:eJztjNENw0AIQ4lUZMxx0a2QlTpCFsis3ahwUZv/fOdJ2Bgkb/vx3hcR+ZQ8
3KKRbAnbtOa9LrlPLeE6yDEvZybdc/hn/FJkYZoYEF3MJKAertBgkAogX1Az
hV3k7n3N0lclp5d9AUGeB8I=
      "], 
     Association["Book" -> 10, "Theorem" -> 38] -> CompressedData["
1:eJztjEsOwjAMRINE5PFnHKWlgS1X4gi9AGflRqSsWLPmLcajN5Lv+/Oxn0op
ryP+/AbAcR2DW2ZaptptmmVdSS59VnJ6JLY1OYgPGobpaCD8GBlYeu+X+U9E
C0SCBVpCq4Wp1NBwr6pK0TpvxRcCtOyt+Zlq7uphtDev8wjp
      "], 
     Association["Book" -> 10, "Theorem" -> 39] -> CompressedData["
1:eJztzE0OwkAIBeAxEcrwMzQz1LbuvJJH6AU8qzeSrty79lsQXnjhcbyex6WU
8j7H349GeBruZmHG4HcJNV1HqO4jYjD7zststnoWTkS21SWXzRY9jzfK8qw1
3wFAAcRaC0jpE5iqZ2RFRCJioQlVUfQrs9TaWrsyS0gavX8A9vYJ0Q==
      "], 
     Association["Book" -> 10, "Theorem" -> 40] -> CompressedData["
1:eJztzMEJQkEMBNAVTHazSSZRI3wED7ZkCb8Ba7Ujv168e/YdBoaBua2P+7pr
rT3f8feryrRcLAADhiQu5YFYyiKvdT7RVMzK0MjCxxiAFFIDZe8xRrmn2/ZG
RI2YRRppO3Zys9jqNGbuvYt2YXdW+2KzIWKO/ZwTKjoPiBfaFwl3
      "], 
     Association["Book" -> 10, "Theorem" -> 41] -> CompressedData["
1:eJztzEEOQjEIBFBMpC0w0GoT3XwXXskj/At4Vm8kdeXatS+EZCCZ+/587Aci
eq319zOkie4+3ZvAsMXoMTcMxe0ym2poYFjg2j9EPCwir3ru6xkCx4BlGTMT
l6JKLCTMrTFW5FpzqlktrGD5ksHllJVHiJpJNI94A6VYCNQ=
      "], 
     Association["Book" -> 10, "Theorem" -> 42] -> CompressedData["
1:eJztjDEOwkAQAw+J09m7602OJCBKvsQT8gHeyo84qKipmcK2bMm3/XHfD6WU
51v+/M62XnXJTMskN0Dzskg69RGl0SOxLamz8IFhGJ0Mgr9HBebe+zq+WmNB
a6EClmC1MLYaDPdKUo11eMUXDZiyT92PornTw2QvjvAIsQ==
      "], 
     Association["Book" -> 10, "Theorem" -> 43] -> CompressedData["
1:eJztjMsNwkAQQxeJ1Xg+nmSzJIgjLVFCGqBWOmLDiTtX3sG2bMn3/fnYT6WU
1yF/fmC98ZqZlqm2AZx7J7m0EcnRI7H25EZ80DCMjgbCj5GBubV2GVciWiAS
LNASWi1MpYaGe1VVitbhFV8IMGWbmp+p5q4eRnsDhG4Inw==
      "], 
     Association["Book" -> 10, "Theorem" -> 44] -> CompressedData["
1:eJztjLsNwzAQQxUgwvE+PEGWod4rZQQvkFmzUeRU6dPmFSRBAjzO5+O8lVJe
l/z5hcmZmZapNgFuY5Dc+ork6pHYR3ISHzQMq6OB8GtkYOu97+tJRAtEggVa
QquFqdTQcK+qStG6vOILAVr21vxONXf1MNobdzMIhw==
      "], 
     Association["Book" -> 10, "Theorem" -> 45] -> CompressedData["
1:eJztjDsOAjEQQ4NEFHtmPErYT8+VOMJegLNyI7JU9LS8wrZsyffj+TgupZTX
KX9+QntmWia5AerLIuk2ZpRmj8S2pHbhA8MwOxkEP0cF+hhjnUetsaC1UAGL
WC2MrQbDvZJUY51e8UUDeo4+/CqaOz1M9gZo2ghx
      "], 
     Association["Book" -> 10, "Theorem" -> 46] -> CompressedData["
1:eJztjDsOwkAMRI2EN/7OJspSpeJKHCEX4KzcCC8VPS2vGM/TSL6fz8d5IaLX
jD+/sXVgB0TD15HoOEZ0P8ZtFwuYlWAd9kHU4HWjAnOEWuaao/4wM3FrZsRB
yizCPpWXpbq4SWML1i9KUrdMv4aau7ok8AZvTwhm
      "], 
     Association["Book" -> 10, "Theorem" -> 47] -> CompressedData["
1:eJztjLsNAkEQQxeJ1Xg+nrth4XJaooRrgFrpiD0iclJeYFu25Pv+fOyn1trr
kD8/kpmWqbYBXMcgeakZydkjcRvJjfigYZgdDYQfIwNrVV3njYg2iAQbtIV2
C1PpoeHeVZWifXrHFwIsWUv5mWru6mG0N1XQCE0=
      "], 
     Association["Book" -> 10, "Theorem" -> 48] -> CompressedData["
1:eJztzLsVwkAMRFFxDkIafRevG6AlSnAD1EpH2BBQACk3eMEEc9se9+1ERM8j
f79ap6/TrKGVlY2uoVpVZg7DyLVHfSAcQFbsux3pQPaS2F+YhUQkghi0gOEs
wuGsynhjNmPB10V15HXOPDci3SKt/AVPiAhK
      "], 
     Association["Book" -> 10, "Theorem" -> 49] -> CompressedData["
1:eJztjDsOAkEMQ4PEaOL8Fle7LVfiCHsBzsqNyFDR0/IiWbEc534+H+dFRF5L
/vzMzmOHE6iNrCpSUQtHgbwVa/sAN/SYt+oK+6IrjH4CqNgYkWKQnCMs0BZh
NtDMdr3qF1M1nEG7JjQdHp72BmMTCGE=
      "], 
     Association["Book" -> 10, "Theorem" -> 50] -> CompressedData["
1:eJztjDsOAkEMQ4PEaOL8FlfQciWOsBfYs+6NyFDR0/IiWbEc57kfr/0iIueS
P7/D+wNOoDayqkhFLRwF8las7QPc0GPeqivsi64w+gegYmNEikFyjrBAW4TZ
QDPb9apfTNVwBu2a0HR4eNobVmcISg==
      "], 
     Association["Book" -> 10, "Theorem" -> 51] -> CompressedData["
1:eJztjMENw1AIQ6lUBNh8UEboSh0hC3TWbJTQSxbose9g2Zbs1/557w8ROUb+
/IDuCFYVOEqumoAuXKGL1V8i7oZjK2ZCXBdmJqaaEDdJU4Khhkx3c/cVrrZt
xrzxTAQT/VyVXExixQlAMwhT
      "], 
     Association["Book" -> 10, "Theorem" -> 52] -> CompressedData["
1:eJztjMsNAkEMQ4PEaOL8Fp/2TEuUsA1srdsRGU4UwJEXyYrlOM/jfB03EbmW
/PkFO5xAbWRVkYpaOArko1jbB7ihx7xVV9gXXWH0B0DFxogUg+QcYYG2CLOB
ZrbrVb+YquEM2j2h6fDwtDdAFAgh
      "], 
     Association["Book" -> 10, "Theorem" -> 53] -> CompressedData["
1:eJztjN0NwkAMg4PU08X5az0CKzFCF2BWNiLXpw7AI18kK5bjPM/363yIyGfJ
n58AJ1A7WVWkohaOAnkUa7+AG3rMW3WFfdEVxnoAFRsjUgySc4QF2iLMBprZ
rle9MVXDGbQtoenw8LQvM64ICg==
      "], 
     Association["Book" -> 10, "Theorem" -> 54] -> CompressedData["
1:eJztjNENw0AIQ4mUOzAGRboNslJGyAKdtRsV+pMF+llLfhgkfN6v695E5N34
6zdiAmA2j0Yt4QhklsGvFvvi7iDQsWa9JOtf1QRmkVJh2QjGgBVVp6p6eTqm
2qNhdnis8J3WlaxWfAD28Qdw
      "], 
     Association["Book" -> 10, "Theorem" -> 55] -> CompressedData["
1:eJztjNsNAlEIRDGRwAxzNxorsCVL2Aas1Y6E9cMG/PQkHCbhcd+fj/1kZq/R
nx9RaDTmEaGN2CARag+riAI5xgzx2bz1eTfLCC3LtBWeooeLqvKIQJcTHvml
87W58EzMy+PZG+hNB1o=
      "], 
     Association["Book" -> 10, "Theorem" -> 56] -> CompressedData["
1:eJzti8ENAjEQA4NEtPbuOlEuXAG0RAnXALXSETleFMCTediWLd+P5+O4lFJe
p/z5FYC2OSVtY0WpNUfDPpt24QPTsTo5hDhHJbYxxm29zVhglipgSVZPp9Vk
RlSSMtblFV8Y0NvoPa6iRzDS5W8VSQfV
      "], 
     Association["Book" -> 10, "Theorem" -> 57] -> CompressedData["
1:eJzty80NwjAMBWAjYde/SdUktBxZiRG6ALOyES6XLsCR7/DkJ9uP/fXcLwDw
PuLvZ2x4+NqHH9Gbar3rmCPW2uOLOTYZOWwx/FjeOI9nl3xGREAiEUCDZcJw
r1nViYiZ1XgidzI/ZTeRUspV1bql1pYPXs0Iwg==
      "], 
     Association["Book" -> 10, "Theorem" -> 58] -> CompressedData["
1:eJzty7sNAkEMBFAjYe/6b2BXuuQCWqKEa4Ba6Yg7Ehog5AUjjUZz356P7QQA
ryP+fmd6RizTMtY5byhaMiq1asRH75E8orRi2DFGH+7ltn8REZCIGVDh2tDN
cq9iRNRaY21M7qT2RWad2TzOIpLKKpfIN0YBCG4=
      "], 
     Association["Book" -> 10, "Theorem" -> 59] -> CompressedData["
1:eJztjEsKAjEQRFuwkvQnlYwDIgyz8EoeYS7gWb2RiSsP4NJH0/Cq6bofz8dx
EpHXXH9+CHvjukW32K9rMaMxujNu7YNqpZMjtUubR2rU6OHjFYAgJTOBigKl
IKYi5zHZPSdYQL8YUnUZledQc1eWSr4BHxMH3w==
      "], 
     Association["Book" -> 10, "Theorem" -> 60] -> CompressedData["
1:eJzti8sNwzAMQ1VbMS1RsVCgC3SljpAFMms2qt1b7z32geCBn+dxvo6biFzL
/vySTJKZALlzOAZGBpPjg8PBWS23VRJ4zJzzWSsEpQBSuwBba96KAqGqZgad
8q7bN2H3CNN9rq2xe/gbyk8Gvg==
      "], 
     Association["Book" -> 10, "Theorem" -> 61] -> CompressedData["
1:eJzti8sNAjEQQweJ0Xg+TjawFEBLlLANUCsdkXCiAI68gy092ffj+ThOIvJa
8een7CTbFeDqQMO4NW6MD6jAdFw+V7Cwjd73eVR1gVlR4HJxnVs3La9MdXea
q1YqvjCg9zFGnumR6ZnBeAPzKgeZ
      "], 
     Association["Book" -> 10, "Theorem" -> 62] -> CompressedData["
1:eJzti8sNAjEQQweJUTwfJ5rVijstbQnbwNZKRyScKIAj72BLT/bzvI7zJiKv
FX9+C8n+ALja0VF7Z9E/IB3TcflYwcRWY+zzp2qC1pICk810bq1pWkaombGZ
aobiiwaMUVVxp3mERTj9DesnB4o=
      "], 
     Association["Book" -> 10, "Theorem" -> 63] -> CompressedData["
1:eJzti8sNAjEQQwcJJxlPJh9tVoIjLVHCNkCtdETCiQL2yJNl6cny43g9j4uI
vFf9OZmx51J9jC3RSbbq97bzS1LSyOqdLGu8KXNu2eYNgCAEUqCiQEqwpYhx
JprFAHPoD1Ncuxe7mnIzLanX/gHuxQeG
      "], 
     Association["Book" -> 10, "Theorem" -> 64] -> CompressedData["
1:eJzti8sRwkAMQ5cZtLHlT5JhOXKgJUpIA9RKR3g5UQBH3kHWG43vx/NxnFpr
rxl/fs3w1W7jehF6kiXrNvhBlGl1vSLnmMqILUZ9AWjonWzwpoAIbCqWpboY
pYMO/aIkdI+wsyvN1CQy3+QvB18=
      "], 
     Association["Book" -> 10, "Theorem" -> 65] -> CompressedData["
1:eJztjMENAjEMBI3EJrG98eVAunvxoCVKuAaolY5weFEAT0aWpVnLez+ej+Mk
Iq+5/vwcDuNtuzazsODw4L58UO3hEZnaZZnHUHYOej4BEJRiJlBRoDVwKmrN
qe61wAj9IqXrmpVnqrlrtB7xBuoJB3U=
      "], 
     Association["Book" -> 10, "Theorem" -> 66] -> CompressedData["
1:eJzti9ENwkAMQw+JXG0nOeh1gq7ECF2AWdmIlC8G4JMn2bJkez+ej+PSWnud
9uf33IHMHEMYFcIz9cFTENxL8LP0QM5t8/qYsS2k1CpMGt1AiAZ0krHQutjx
RVUj17nqGrV05k2hN73vBxc=
      "], 
     Association["Book" -> 10, "Theorem" -> 67] -> CompressedData["
1:eJzty8sNAkEMA9AgEcXOZ6JhK6AlStgGqJWOmOFEA9x4Bx8s+34+H+dFRF47
/n4AqGU4BvoY1eUfSMfqavexoxKzu8e6qFJgliWg3KhrS9NkRijJMqpmKL4Y
0D3njGsREYzw8jfBYQcv
      "], 
     Association["Book" -> 10, "Theorem" -> 68] -> CompressedData["
1:eJzti8sNwkAMRI2EHX/Gmyi7C+FIS5SQBqiVjthwoQJuvMNonkZz35+P/URE
ryP+/ILaWnWfb96XzCtaflDNzfooW3Yc40UbsMDGg5mJRcyIg9aJE5iHOkRE
VT10EkACX4aHWSnl7B4tLKIu6xsCYwgD
      "], 
     Association["Book" -> 10, "Theorem" -> 69] -> CompressedData["
1:eJzty8sNwkAMBFAjYXv934BTQFqihDRArXREwoUKuPEOI41Gs+3Px34BgNcZ
fz/R6x3VUnuW1ez8GCNTOqdVtp9jjY6Y4ccBEQGJRAANbozhXkdVJyJmFmOh
CDL/Ivch4pFXVU0T0yXrDeP4B6Y=
      "], 
     Association["Book" -> 10, "Theorem" -> 70] -> CompressedData["
1:eJztjMsNwkAMRB2J2V1/dhJAIue0lBLSALXSEV5OaSA3nixLb2zNdrz3YxKR
z1h/ruH1bGY0xuKMdf6h2ulkpvaYx5EaPZbw/AcgKMVMoKJAa4ihqDWnutcC
C+iJlK73rLyFmruydfILxDIHJg==
      "], 
     Association["Book" -> 10, "Theorem" -> 71] -> CompressedData["
1:eJztytENwjAMBFAjcUl85zitWgZgJUboAszKRqSIFfjj6XzSSb4fz8dxMbPX
WX8/sjVyJ5euEbfx4S6Jk5KDG5nOiCX2+Q7AUAppcHOgNcQ5UetMlWoBA/7F
eXN0X3vqGk7Js/XMN7RLBwI=
      "], 
     Association["Book" -> 10, "Theorem" -> 72] -> CompressedData["
1:eJztzNENwjAMBFBzTmPnYiVB6gKs1BG6QGdlIww/TMAfTyd/nE5+nNdx3kTk
+T5/v2LG3cg1Jtf8cGc0zskYHFzGnkXacw1AoOomqFKx1YquaoZSMsWtKEg0
fiGC7c7eyzALb77l1xePgAZn
      "], 
     Association["Book" -> 10, "Theorem" -> 73] -> CompressedData["
1:eJztjLENgEAMA4MEJPZjpB+BlRiBBZiVjUhomICOKxydYnk7zv0YzOyq+PmM
DnS0BrTMggSEolXkn48pyySNPq+yCJNPkjC5loX06ngawz1eEEH1HB3LmGN5
bpDmBpg=
      "], 
     Association["Book" -> 10, "Theorem" -> 74] -> CompressedData["
1:eJztjdsJgFAMQyt4bRKsoBu4kiPcBZzVjWw/xAn881AKeUD2fh59MLOr3s93
kBslUvkLgMxjOazwUdnN2OTTEgZYeIsINo95lrw6nkpwxwsBxZoDY6kcSQM3
gOAGcQ==
      "], 
     Association["Book" -> 10, "Theorem" -> 75] -> CompressedData["
1:eJztyssNwmAMA+AgEaXJn9iF8hBHVmKELsCsbETKhQm48R2sWM59fT7WnYi8
tvj7IQ6wzvOM5cPjwgB4JXi4bWOguNSpX81MTDVC1OVoWpWpFpnT5A3u5qSP
/OqxKwp7DPRdOehv23YHxA==
      "], 
     Association["Book" -> 10, "Theorem" -> 76] -> CompressedData["
1:eJztyssNwkAMBFAjYcef8W60WHzEiZYoIQ1QKx2RcEkF3HiH0Yw0j+X1XA5E
9N7i75e85sxLr/xSzZtt/ZqFfvc6awEzbH0yM7GIGXHQmDiBvk6HiKiqh06S
KYGdAGHWWju6R4VFnMb4AMOHB38=
      "], 
     Association["Book" -> 10, "Theorem" -> 77] -> CompressedData["
1:eJztjdEJgEAMQyt4Ng1WECdwJUe4BZzVjUx/dAL/fJRA0kD2fh59MLOr5OdT
goygtAAidFGJ2B6not5Gn5Y0wNJbZkbznGfSq+NyhDteAmCuGhjLaUQBbnES
Bk4=
      "], 
     Association["Book" -> 10, "Theorem" -> 78] -> CompressedData["
1:eJztyrENAkEMRFEjYZ+9M7ZXBJAcAS1RwjVArXTEQUIFZLzgSyPNbXvct4OI
PN/5+6050bzkR3h3rNXoXoke1/JzcjL2n6qKmkWIQk6LJtn7HDQzdx/wxTIN
/DISEVV1xEAjgNn1Aq/+B0M=
      "], 
     Association["Book" -> 10, "Theorem" -> 79] -> CompressedData["
1:eJztyrENQkEMg+EgEcVO4tM7QKJmJUZ4CzArG3GIhgXo+ArrL3zbH/f9YGbP
9/z92HnoKnx0IjGUEGoF1NjmnJd1i6AhomWgiZ6dDG92lZNU0H0lvgQwx9xO
dRSzipWpfAGEdAau
      "], 
     Association["Book" -> 10, "Theorem" -> 80] -> CompressedData["
1:eJztissNwlAQAxeJ1dr7yctDSJxpKSWkgdRKR2w40QA3RvYcLD/3Y9svIvI6
9efXLPUo/4B0OJbqIk5VYp1j3PulSoFZloByo/aXpsmMUJJlVI2evzBgHXPO
uBYZ0fHyN3+DBqE=
      "], 
     Association["Book" -> 10, "Theorem" -> 81] -> CompressedData["
1:eJztzMsRg0AMA1AzE2HLa2OyVJCWKIEGqJWO+FxSQW55B40OGn22fd0GETnu
+Pu5mqs/GEUyMzjRi84KZi3ZrhGgoqoRAkon2GCKcJiBD9xV+TWazfleer6u
u2we6VM7AYiXBro=
      "], 
     Association["Book" -> 10, "Theorem" -> 82] -> CompressedData["
1:eJztytENgzAQA9BDqpOc75JwAvW/KzECCzArGzX0iwX465NlyZI/+7Htk4ic
V/09b1n5U5Szkb0G2WoPvpXus6/jA0CQEilQUaAU+DWR80g2ywl06M0YVaM2
e7lyMW0lenwBdLAGhQ==
      "], 
     Association["Book" -> 10, "Theorem" -> 83] -> CompressedData["
1:eJztyrENAgEMQ9EgEcV2ErgcLMBKjHALMCsbcYiGCah4xZcL37bHfTuY2fOd
vx9ofJQgnFpopHBBF5aZue6XCBoiqg20la4Sw4uV6SQ76L5PfAlgOc+seWwq
kym1XmnUBnQ=
      "], 
     Association["Book" -> 10, "Theorem" -> 84] -> CompressedData["
1:eJztjNEJwzAQQ69Q2dadfe5Bf/vRlTpCFsis2aiXkBnylSchEAh9l/W3PERk
2+PmCuYBOdzCXU3nVI9PsI/+6u9cABCUoiqgEGgNY6+oNV3NaoF2UFMnyD9G
Xj6dakZvw+MPeDsGig==
      "], 
     Association["Book" -> 10, "Theorem" -> 85] -> CompressedData["
1:eJztyrsNhEAMBFAjnfHfi0XAprRECTRArXTEEl0HF92TZjTB7Od1nBMA3G/9
/UTfet9ElvSsWrKy3pmSbVjHAZGAEFWBCJTQ2ARJ1eeZmNmFkVoj8y8eiVCt
T1iYmYeGPJLKBvk=
      "], 
     Association["Book" -> 10, "Theorem" -> 86] -> CompressedData["
1:eJztycsNwkAQA9BBYuL5b0Y5JFdaooQ0QK10xOZEB5x4ki1Lfpyv53kjovdV
f7+xH8euulZU91pdfc3SGtM2f2YQmM0IIAO7uDLMYlkgIqHCGAMeXzKTadb3
9HT3SEv9AIjyBuE=
      "], 
     Association["Book" -> 10, "Theorem" -> 87] -> CompressedData["
1:eJztycsNwkAQA9BBYuL5b0a57JWWKCENUCsdsTnRASeeZMuSH+fred6I6H3V
34/MOVX3iureq6uvWVpjOdbNDAKzGQFkYBdXhllsG0QkVBhjwONLVjLN+p6e
7h5pqR9//wbL
      "], 
     Association["Book" -> 10, "Theorem" -> 88] -> CompressedData["
1:eJztycsNwkAQA9BBYuL5b0Y5wDUtpYQ0QK10xOZEB5x4ki1L3s/Xcd6I6H3V
3688H6prRXWv1dXXLK0xbfNlBoHZjAAysIsrwyyWBSISKowx4PElM5lmfU9P
d4+01A918gay
      "], 
     Association["Book" -> 10, "Theorem" -> 89] -> CompressedData["
1:eJztybsNw0AMA1AZsEz9z0KK1FkpI3iBzJqNfK6yQSo/gAQBvo7P+1iI6HvV
7W+eqntFde/V1dcsrTE95skMArMZAWRgF1eGWWwbRCRUGGPA40dmMs16TU93
j7TUE2xiBpo=
      "], 
     Association["Book" -> 10, "Theorem" -> 90] -> CompressedData["
1:eJztycsNg0AQA9BByuD5LyMqSEspgQZSazpiOdEBpzzJliW/j+/nWIjod9Xf
c1S3iureqquvWVpj2ufHDAKzGQFkYBdXhlmsK0QkVBhjwOMmM5lm/UpPd4+0
1BNjTwaD
      "], 
     Association["Book" -> 10, "Theorem" -> 91] -> CompressedData["
1:eJztycENgDAMA8Ag0caOkx8LsBIjsACzshGFDxvw4iRblrzux7ZPZnbe9ftQ
KTJuCsaYrFBVaVzuMAJZ9oyWykYk0727e4z0YHe8GrAwl4xZBEWJ5AU/kwXi

      "], 
     Association["Book" -> 10, "Theorem" -> 92] -> CompressedData["
1:eJztjLENgDAMBI1EZF/iJAImYCVGYAFmZSOckgWouOKK/9fv53Wck4jcQz9f
4hR6dxptiI0SEE1KKgruoiYrKZcM6u5mRuRKslpN7cXCsm51LjnGjEMeRYEG
EA==
      "], 
     Association["Book" -> 10, "Theorem" -> 93] -> CompressedData["
1:eJzti8sJgEAMBSMYssnmp9iALVnCNmCtdmQ82YEn5zDwGN4+zmNMAHA9+vmU
lIzY0rPUs5YtuVgFIgJCFAFkWAnNVJFEtTUunJk4gru+VKzp5rN3V61HD74B
YqoGog==
      "], 
     Association["Book" -> 10, "Theorem" -> 94] -> CompressedData["
1:eJztyMsNg0AMhGFHwsaP8YI2G4S40VJK2AZSazoCTukgJ77DL83s/fPuDyL6
Xrn9l7XMXLNh2rwt+gJm2PkzM7GIGXFQHTmB6ZwOEVFVDx0lUwI/AoRZKWVw
jxYW8az1AFdSBnk=
      "], 
     Association["Book" -> 10, "Theorem" -> 95] -> CompressedData["
1:eJztytENgDAIBFBMrPQoUKsTuJIjdAFndSPplxv45UvuEsgd/Tr7RET3qN/H
YKaoiNqBhhI83iJCwosbMWjjpKpIbK6lMAKPy5jzCzm31Vu1WSBjY5EHPxAG
Ag==
      "], 
     Association["Book" -> 10, "Theorem" -> 96] -> CompressedData["
1:eJztyLENg1AMhGFHio397p6NKNJEFKzECG8BZs1GQJUNUuUrfuluG8c+HiLy
ufP3a1nIehPV1vRX58y4XlUVNYsQhSyTdrKu2Whm7t7gk/Vu4JeRiMjMJxoK
AcyVJ0g6BkI=
      "], 
     Association["Book" -> 10, "Theorem" -> 97] -> CompressedData["
1:eJztycERgzAQQ9FlJjvatSXbARqgJUqgAWqlI8gpHeSUd/gHaTvO/ZjM7Prk
7+eWpkW9aZaK5nfv6zMCMLiT5rAKj1LSQTICEcEMxxio/EpS5Gj9papSnyuV
N0YXBkE=
      "], 
     Association["Book" -> 10, "Theorem" -> 98] -> CompressedData["
1:eJztycERg0AMQ1FnJh7Zu9LikDRAS5RAA6k1HQVOdMCJd/gHadm+6/Yws9+R
2/WG3pqGZqlpflV99g2AwZ00h3V4tJYOkhGICGY4qtB5SlJkjemprtb3K5V/
P8sGMA==
      "], 
     Association["Book" -> 10, "Theorem" -> 99] -> CompressedData["
1:eJztybsRgEAIBFCckeHg+JyfxNCWLOEasFY7EiNLMPEFO+yy9/PoAwBcT/w+
0DymLWoLcYvF1pyICAhRBJBhJjRTRRLVUjg5M3EEV33lM6ubj149b9MafANC
pAZK
      "], 
     Association["Book" -> 10, "Theorem" -> 100] -> CompressedData["
1:eJztyLsNhEAMhGEjYePHeA8tCwEZLVECDVArHcFFV8IlfMEvzWzHuR8dEV3f
vP4hZ+Tqy6INGGHPw8zEImbEQXXgBD7PdIiIqnroIJkS+BEgzEopvXu0sIip
1hs11QYd
      "], 
     Association["Book" -> 10, "Theorem" -> 101] -> CompressedData["
1:eJztycENgDAMA8AgEdlJmxIQC7ASI7AAs7IR5cUIfDhZftjbce7HICLXU79P
xNxiibDIOXPtAwCBqrsopEDpbgr3SoJkNSoyUeqLPa2xTmOU8NIvC7sBLKsF
7Q==
      "], 
     Association["Book" -> 10, "Theorem" -> 102] -> CompressedData["
1:eJztx7ENgEAMBEEjYWP/3fsRARkBLVECDVArHQERJZAwwUq77se2dyJyPvl9
g2hlSZ8rR8b9qipqFiEKmQatZLu30MzcvcAHq9XAl5GIyMweBQ0BjC0vJgwF
4g==
      "], 
     Association["Book" -> 10, "Theorem" -> 103] -> CompressedData["
1:eJztx8ENgDAMQ9EgETltnFSCCViJEboAs7IRhQsjcOEdvuytH3ufROS88/sI
3cNrxLoudVwAAlVSFLJAI0gFk2YwsywGtAbnqzzNzNmd1ccqUS4j6gXn
      "], 
     Association["Book" -> 10, "Theorem" -> 104] -> CompressedData["
1:eJztybsNgEAMA9AgEfmSy0cCFmAlRrgFmJWNCBUj0PAKy5b3cR5jIqLrid9X
skdqeK6+1QJAYFYlFlrA7mYMNWtNSohAMqXbq86a4TFHj+puPeUGI9IF8Q==

      "], 
     Association["Book" -> 10, "Theorem" -> 105] -> CompressedData["
1:eJztx7sNgDAMRVEjYePPS0AhomclRsgCzMpGhIoRaDjFle7ezqMNRHQ9+X3G
66YVmGF9mJlYxIw4qEycgNTXISKq6qGTpCSBlwBhlnMe3aOGRaxLuQEXSwXB

      "], 
     Association["Book" -> 10, "Theorem" -> 106] -> CompressedData["
1:eJztissJgEAQQ0dwzQdHsARbsoRtwFrtyJ2TJXjxHUJeyNGvs08RcVf8fIdE
FR7ddhjLlkFGomWmGnJdbdQHw0yALyKdu625zPIY+AAF5gU4
      "], 
     Association["Book" -> 10, "Theorem" -> 107] -> CompressedData["
1:eJztx7ENgDAMRFEjYcfO2Q6ioGclRsgCzMpGhIoRaHjF193ez6NPRHQ9+X0o
dQtf3MZkZmIRM2LQWjjc27jVRURVK7RIhMBf4g6zzJxR0WDA0vIGDE8FlQ==

      "], 
     Association["Book" -> 10, "Theorem" -> 108] -> CompressedData["
1:eJzth8ENgDAMA4OEm8ZNKypYgJUYoQswKxuRTsEH62z5znFfYxGRZ86fL2N0
3/yIB0CQEikwMSBn+FSoBlqKJtBhxoCcDanWa+Pqxr1Yy33rL/ssBUM=
      "], 
     Association["Book" -> 10, "Theorem" -> 109] -> CompressedData["
1:eJztytEJgDAMBNAIhtylqdA6QVdyhC7grG5k/HQCf3xw4Tgy5nnMRUSu5/w+
xZKYRdXEyAgxSKd6cdIiAgBzNypqheGlsfW9rsXzmdwyN/7wBUc=
      "], 
     Association["Book" -> 10, "Theorem" -> 110] -> CompressedData["
1:eJztx8ERgDAIRFGcEQJZwEw6sCVLSAPWakfGkyV48R3+zu7jPMZCRNeT37fC
m9tcZiYWMSMG9cLhvs1bXURUtUKLRAj8Je4wy8wVFQkDWvYb/ioFaw==
      "], 
     Association["Book" -> 10, "Theorem" -> 111] -> CompressedData["
1:eJztxcENgDAMQ9HQpjFOokqMwEqM0AWYlY0AcWICLjx92evYtzGJyHHP72O9
97iuVghKAaTOAjQzt6JgqipJ6FN7Sy6Z1LDmtAhPPwHe7QSG
      "], 
     Association["Book" -> 10, "Theorem" -> 112] -> CompressedData["
1:eJztxcENgCAQRNE1cRzYARINMV5tyRJowFrtSDxagRff4f+9nUcbzOx68vva
uuVegEbA3SBbgBgxgZ5AUpJToZ/hLae5Vo05Bk9eipJu9Y4FLg==
      "], 
     Association["Book" -> 10, "Theorem" -> 113] -> CompressedData["
1:eJztxbENgDAQBMFH4jj/n40EskRMS5TgBqiVjjAhFZAwwe7ezqMNZnY9+X1u
Kz0AjUCEQbYC7pjAyCApKajUz/RW8lKrxuIpss+zsm7uhwUW
      "], 
     Association["Book" -> 10, "Theorem" -> 114] -> CompressedData["
1:eJztxckNgDAQQ9FBwjizRQKlAVqihDRArXREOFIBF5707b2fR59E5Hrm970c
ARQCZgKXDVDFAlqApLsbvYxnectYW/M5tVhorRZ2A+eYBP0=
      "], 
     Association["Book" -> 10, "Theorem" -> 115] -> CompressedData["
1:eJztysENgDAMQ9EgUZw4SauOwEqM0AWYlY0oJ1bgwDt8yZL3cR5jEZHrye8b
SAqx1RQ1aSgRYQVZwx1mlpgrCOjLVHurvXOlcX48Z27wIgUV
      "], 
     Association["Book" -> 11, "Theorem" -> 1] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwWAAjHh4pAEAlWgCfA==
      "], 
     Association["Book" -> 11, "Theorem" -> 2] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGweADlMQLAJUeAns=
      "], 
     Association["Book" -> 11, "Theorem" -> 3] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwaABjEgk2QAAlWYCfA==
      "], 
     Association["Book" -> 11, "Theorem" -> 4] -> CompressedData["
      … (* per-theorem CompressedData entries continue through Book 13, Theorem 18 *) };

Module[{dataA = (#[[1]]["Book"] -> N[Mean[#[[2]]]] &) /@ resDepth, (* book number -> average depth reduction, one entry per theorem *)
   vals, acc, xval},
 vals = CountsBy[dataA, First]; (* number of theorems in each book *)
 acc = Association[
   MapIndexed[First[#2] -> #1 &,
    Accumulate[Values[CountsBy[dataA, First]]]]]; (* cumulative theorem count at the end of each book *)
 xval = Association[(#[[1]] -> (#[[2]] - vals[#[[1]]]/2) &) /@ Normal[acc]]; (* midpoint of each book, used for the tick labels *)
 Show[{
   ListLinePlot[Values[dataA],
    Axes -> {False, True}, Filling -> Axis, Frame -> True,
    FrameLabel -> {"theorems by book", "average depth reduction"},
    FrameTicks -> {{True, False}, {({#[[2]], #[[1]], {0, 0}} &) /@ Normal[xval], False}},
    ColorFunctionScaling -> False,
    ColorFunction -> Function[{x, y},
      Piecewise[{{bookColorIntense[6], x <= acc[6]},
        {bookColorIntense[10], x <= acc[10]},
        {bookColorIntense[13], x <= acc[13]}}]]],
   Graphics[{GrayLevel[0.5], (* gray vertical lines mark the book boundaries *)
     Line[({{#, -5}, {#, 10}} &) /@ Values[acc]]}]}]]

(The peak in Book 9 is theorem 9.15, which reduces the depth of many subsequent theorems by 10 steps, though, in a possible goof, it is not actually used by Euclid in the proofs of any of them.)

Here is the maximum depth reduction achieved by adding each possible theorem:

CloudGet["https://wolfr.am/PJKo9Lnq"];

resDepth = {Association["Book" -> 1, "Theorem" -> 1] -> CompressedData["…"], …};
(* iconized List of 465 entries, about 5.4 MB in all: one CompressedData item of depth-reduction values for each theorem of Euclid *)
1:eJxTTMoPSmJkYGC4CCKGJ2Bj5+Hh4mFkZGTlAQJGRhYOFg5WDi4ODk4OIGAA
YmZ2diYGFhYmJmRt7AwMTOwgBlAnKHBYWAbE9ZQAAKLzA6o=
      "], 
     Association["Book" -> 8, "Theorem" -> 4] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGKeBkY2NjYmRkZAXSbIyMHKwcrOxANisYMAAxMwsL
MwMLExMTsi4WBgaIACMYMbCwDIjjKQEARbsDQg==
      "], 
     Association["Book" -> 8, "Theorem" -> 5] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGK2BjY2VmZGRkZQMCRkZ2NiAEsllZgYgVLMnCwsTA
wsKIEgYsDAxMTCAGUCdIgoVlQNxOCQAAQCsDPQ==
      "], 
     Association["Book" -> 8, "Theorem" -> 6] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGLeDjYWNkZGTlBQJGRhYOEOTi4ODkAAIGIGZmZ2dh
YAECZD3sDAxM7CAGUCcocFBlhwQAAJCEA5g=
      "], 
     Association["Book" -> 8, "Theorem" -> 7] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGL+BhY2RkZOUFAkZGFg4Q5OLg4OQAAgYgZmZnZ2Fg
AQJkLewMDEzsIAZQJyhwUGWHBAAAgwIDig==
      "], 
     Association["Book" -> 8, "Theorem" -> 8] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGMWBjZGRk5QUCRkYWLiDk5ODg4uEAAgYgi5mdnYWB
BQiQdbAzMDCxgxhAnaDAQZUdEgAAf3kDhw==
      "], 
     Association["Book" -> 8, "Theorem" -> 9] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGM2ADAg4gYGZiZWdlZ2EHsdmBgAGImdnZWRhYgABZ
AzsDAxMriMEIBEAKVXZIAABl+QNq
      "], 
     Association["Book" -> 8, "Theorem" -> 10] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGNWBnZ2UGAlYWZhTAAKFYGJiYgBwkAOQwMSHzmRmG
GgAAA+IC+Q==
      "], 
     Association["Book" -> 8, "Theorem" -> 11] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGN2BnZQYCVhZmFMAAoVgYmJiAHCQA5DAxIfOZGYYa
AAD9LgLy
      "], 
     Association["Book" -> 8, "Theorem" -> 12] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGOWBlBgJWFmYUwAChWBiYmIAcJADkMDEh85kZhhoA
APaQAus=
      "], 
     Association["Book" -> 8, "Theorem" -> 13] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGO2AEAlYmRhTAAMFMDExACRTFDAxMjCia6exaygEA
urYCpg==
      "], 
     Association["Book" -> 8, "Theorem" -> 14] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGPeAV4GVkZOEAQS4ODk4OIGAAYmYgYmBhYWJCVsvB
wMDEDmIwAgGQYmEZECdTAgBfIQNk
      "], 
     Association["Book" -> 8, "Theorem" -> 15] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCKGP+AXYGRk4QBBLi4OTg4gYABiZnZ2ZgYWFiYmZKXs
DAxM7CAGIxAAKRaWAXExJQAAVC0DWA==
      "], 
     Association["Book" -> 8, "Theorem" -> 16] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAOBlZGThAEEuDg5ODiBgAGJmIGJgYWFiQlbJwcDA
xA5iMAIBkGJhGRAHUwIARB4DRw==
      "], 
     Association["Book" -> 8, "Theorem" -> 17] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAmBkZOEAQS4uDk4OIGAAYmZ2dmYGFhYmJmSF7AwM
TOwQHYygwGFhGRD3UgIAN2sDOQ==
      "], 
     Association["Book" -> 8, "Theorem" -> 18] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGBGBiRAEMEMzEwASUQFbHCFKKzB96IQQAsmACnQ==

      "], Association["Book" -> 8, "Theorem" -> 19] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGBmBGAVAuCwMTE5CDrIwBKISijc7upBwAAN5lAtE=

      "], Association["Book" -> 8, "Theorem" -> 20] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGCGDhYuFkgQEGIGZmYWFiYGFiRAkDFgYGJiYQgxGM
gOoGxLGUAAD1qgLs
      "], 
     Association["Book" -> 8, "Theorem" -> 21] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGCmDhY+Hm5ODh4eLk5GTg5eJk5uFhZmBhYWJCVsTD
wMDEBWIwAgFIE8uAuJUSAABP2QNX
      "], 
     Association["Book" -> 8, "Theorem" -> 22] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGDGDhZIEBBiBmZmFhYmBhYkQJAxYGBiYmEIMRjIDq
BsSplAAA6PYC3g==
      "], 
     Association["Book" -> 8, "Theorem" -> 23] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGDmDh5uTg4eHi5ORk4OXiZObhYWZgYWFiQlbCw8DA
xAViMAIBSAvLgLiUEgAAP5cDRQ==
      "], 
     Association["Book" -> 8, "Theorem" -> 24] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGEGBhZYEABiBmZmFhYmBhYkQJAxYGBiYmEIMRjIDq
BsShlAAA3igC0g==
      "], 
     Association["Book" -> 8, "Theorem" -> 25] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGEuDg4OHh5OLiYuDj5GLm5WVmYGFhYkJWwMvAwMQD
YjACAZBiYRkQd1ICADfGAz0=
      "], 
     Association["Book" -> 8, "Theorem" -> 26] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGFODg4OAEYgZOTg5mTk4mBhYWJiZkeU4GBiZ2EIMR
CIAUC8uAOJMSAAAQHQMO
      "], 
     Association["Book" -> 8, "Theorem" -> 27] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGFuDk5AACBiBm5uBgYmBhYUQJAw4GBiZ2EIORESzB
wjIgrqQEAAAEIgMA
      "], 
     Association["Book" -> 9, "Theorem" -> 1] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGGODjAAIGTg4OZi4uFgYWIECW5WJgYOIEMRiBAEih
yg4JAAAOWwMN
      "], 
     Association["Book" -> 9, "Theorem" -> 2] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGGuAAAgZODg5mTk4WBhYgQJbkZGBg4gQxGIEASKHK
DgkAAP+OAvw=
      "], 
     Association["Book" -> 9, "Theorem" -> 3] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGHODh4Wbg4uVh5uZmYmBhYUQJA24GBiYOEIORESzB
wjIgTqQEAAAIPQMG
      "], 
     Association["Book" -> 9, "Theorem" -> 4] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGHhDgY+Dj4WHm5WViYGFhRAkDXgYGJh4Qg5ERLMHC
MiAupAQAAA5GAw4=
      "], 
     Association["Book" -> 9, "Theorem" -> 5] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGIOBj4OPhYeblZWJgYWFECQNeBgYmHhCDkREswcIy
IA6kBAAAAGYC/g==
      "], 
     Association["Book" -> 9, "Theorem" -> 6] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGJODi4WHm4GBiYGFhRAkDDgYGJg4Qg5ERLMHCMiDO
owQAAOGpAtk=
      "], 
     Association["Book" -> 9, "Theorem" -> 7] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 9, "Theorem" -> 8] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGJuDiZObjZ2ZgYWFiQhbmZ2Bg4gUxGIEASLGwDIjr
KAEA60YC5g==
      "], 
     Association["Book" -> 9, "Theorem" -> 9] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGKOBh5uJiYmBhYUQJAy4GBiZ2EIORESzBwjIgjqME
AADTBALI
      "], 
     Association["Book" -> 9, "Theorem" -> 10] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGKmDmZWdiYGFhRAkDdgYGJnYQg5ERLMHCMiBuowQA
AMaSArk=
      "], 
     Association["Book" -> 9, "Theorem" -> 11] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 9, "Theorem" -> 12] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGLhBgZmBhYWJCEWJgYOIDMRiBAEixsAyIyygBAM8v
AsU=
      "], 
     Association["Book" -> 9, "Theorem" -> 13] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGMGBmYGFhYkIOBCEGBiY+EIMRCIAUC8vAuIwCAADE
JAK4
      "], 
     Association["Book" -> 9, "Theorem" -> 14] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGNGBiYmdH5rOAhEAMRjBiYGEZEGdRAgCwogKe
      "], 
     Association["Book" -> 9, "Theorem" -> 15] -> CompressedData["
1:eJzlkIENwzAIBEllAjzEO3SljJAFOms3CthKInWFnjDPA5Ylv4/PfixE9K30
1zAvv3/wuit3as2IkLibWZ3JKsMn2fSh65j6zHOrV1TnvogRF/1yyNdSSEVi
I1UKYQRYOCzMWERyJKzKog9ZY+uAtXIwlJzbDAij
      "], 
     Association["Book" -> 9, "Theorem" -> 16] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGNmBjYUHmAjlMTMh8FoahBgCr8AKY
      "], 
     Association["Book" -> 9, "Theorem" -> 17] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGOGBhQeExMDAx4ZQdEgAApwQCkg==
      "], 
     Association["Book" -> 9, "Theorem" -> 18] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGOuBE5rAwMDAxgRiMYMTAwjIgbqIEAACpQQKV
      "], 
     Association["Book" -> 9, "Theorem" -> 19] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGARJgYWBgYgIxGMGIgYVlgB1EOgAAofECjA==
      "], 
     Association["Book" -> 9, "Theorem" -> 20] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 9, "Theorem" -> 21] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAQMjI4RkHA7hAQCZkQKA
      "], 
     Association["Book" -> 9, "Theorem" -> 22] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGARQMk8AAAJWpAns=
      "], 
     Association["Book" -> 9, "Theorem" -> 23] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAQwwMTENtBOoAACZgAKA
      "], 
     Association["Book" -> 9, "Theorem" -> 24] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAQgwMg6XsAAAlz0CfQ==
      "], 
     Association["Book" -> 9, "Theorem" -> 25] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGARgwDpewAACWcwJ8
      "], 
     Association["Book" -> 9, "Theorem" -> 26] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAQQMl7AAAJWqAns=
      "], 
     Association["Book" -> 9, "Theorem" -> 27] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 9, "Theorem" -> 28] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 9, "Theorem" -> 29] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGARwwMQ20C6gAAJf0An4=
      "], 
     Association["Book" -> 9, "Theorem" -> 30] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAQIwD7QDqAAAly4CfQ==
      "], 
     Association["Book" -> 9, "Theorem" -> 31] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 9, "Theorem" -> 32] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGASpg4gORjEAApFhYBtg1pAMAqUoClg==
      "], 
     Association["Book" -> 9, "Theorem" -> 33] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 9, "Theorem" -> 34] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 9, "Theorem" -> 35] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGARpgRmIzMQ2YM8gFAJnpAoE=
      "], 
     Association["Book" -> 9, "Theorem" -> 36] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGATbACARAioVloB1CMgAAnVgChg==
      "], 
     Association["Book" -> 10, "Theorem" -> 1] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 10, "Theorem" -> 2] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGASZgHMIBAwCWWQJ8
      "], 
     Association["Book" -> 10, "Theorem" -> 3] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGARbANNAOIB8AAJZYAnw=
      "], 
     Association["Book" -> 10, "Theorem" -> 4] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 10, "Theorem" -> 5] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAQ7ANtAOIA8AAJksAoA=
      "], 
     Association["Book" -> 10, "Theorem" -> 6] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAXbAzsLAwDrQjiADAACgSgKK
      "], 
     Association["Book" -> 10, "Theorem" -> 7] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAQ7AwsDAOtBuIAMAAJtCAoM=
      "], 
     Association["Book" -> 10, "Theorem" -> 8] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 10, "Theorem" -> 9] -> CompressedData["
1:eJztjcENAjEMBH0Ske1dx748EG9aooRrgFrpiASQeNEB81jJs1r5etxvxyYi
jxV/fnI+bVuJmAEITqrG2PfMVOXMpXLKulS5LwO+fS4UlZhHJbNeuFfGx3CV
6WtCzGeqKtpaQEwltBH0pogwUzPrbk3HUMYXi4AzUKeewc4gOp5sEQo6
      "], 
     Association["Book" -> 10, "Theorem" -> 10] -> CompressedData["
1:eJztjssNAlEIRZlkXvhcHszbuLclS5gGrNWOBDVxZQeexU04QOB63m/nRkSP
jj+/2bctiUTM4Cgy1zqOiGBGZasomZdM1TaGt5/RQ5ZhVWQg8oVqhn8Muhna
K7C6xczEY7iRMDkPGHSwuYuwiEyVwWsx/Iu4m8It9xmOWT/a1Cdc2goj
      "], 
     Association["Book" -> 10, "Theorem" -> 11] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 10, "Theorem" -> 12] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGAT7AM9AOIBkAAJ06AoY=
      "], 
     Association["Book" -> 10, "Theorem" -> 13] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 10, "Theorem" -> 14] -> CompressedData["
1:eJztjesNAkEIhDGRGxjYvcQObMkSrgFrtSPZXR8/bcAvYQLDBK7H/XacROQx
5M8vto0MrvKF6ug4DC5sOk6+pUJBD5+ZV8qMbh+nljXNVX0BIFDNFIVcoK1l
KrKnGcysuwH7jsgvPrV3nls0j+rq7hOxRAjb
      "], 
     Association["Book" -> 10, "Theorem" -> 15] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 10, "Theorem" -> 16] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 10, "Theorem" -> 17] -> CompressedData["
1:eJztjdsNAkEIRdnEXeZygdiCLVnCNmCtdiTDRP20AU8Iz5NwOx/3cxOR50x/
fnJ1D5Ag3rjPnuwlaRXRV668pASyN2y1JAMCH6Putsx6YmZiemSIQkJ3d8eu
UZ+oU9GabKiOLxgjIzNx6amo8gLS+wjR
      "], 
     Association["Book" -> 10, "Theorem" -> 18] -> CompressedData["
1:eJztjdsNAkEIRdnEXeZygR5syRK2ga3VjmSYqJ824AnheRLu5/U4NxF5zvTn
N+4BEsQb99mTvSStIvrKlZeUQPaGrZZkQOBj1N2WWT/MTEyPDFFI6O7u2DXq
E3UqWpMN1fEFY2RkJm49FVVeyCsIwQ==
      "], 
     Association["Book" -> 10, "Theorem" -> 19] -> CompressedData["
1:eJztjcsNAzEIRIkUGwZjLGsrSEspYRvYWrej4I+UawrIOyAz84Rf5/U+H0R0
j/HnB46mCyxyHq8Z7ELS7JaxtOwKx3S2JaKQnWCUsQ2O+CJSUs5eSUCNk5kh
cXUrhUOpHJsps3yBSG/euz7nQZQa4wOv0Qi0
      "], 
     Association["Book" -> 10, "Theorem" -> 20] -> CompressedData["
1:eJztjcsNAzEIRIkUGwZjLEvbQFpKCdvA1rodBX+kXFNA3gGZmSf8Oq/3+SCi
e4w/v9B0gUXO4zWDXUia3TKWll3hmM62RBSyE4wytsERP0RKytkrCahxMjMk
rm6lcCiVYzNlli8Q6c171+c8iFJjfACjbAii
      "], 
     Association["Book" -> 10, "Theorem" -> 21] -> CompressedData["
1:eJztjesNgDAIhDGx5VFLdQRXcoQu4KxuJLQ+/jqAH8klx11grftWBwA4XH4+
IR3uxOhq1qZDoQV3oZVUWD1/WkTCdG3YQ3PO3B+AYNQMxLBgyHnigFmnlNAq
JgFFEOmFiUrRUmRsBzm5nJV3CIE=
      "], 
     Association["Book" -> 10, "Theorem" -> 22] -> CompressedData["
1:eJztjO0JgjEQgytY7jN922sXcCVHeBdwVjfyioJ/HcAHLhxJyO183M9LKeW5
5c9vRMzVmjs2ZjN1zbmwGnAALNvpEaljfEoNhhUNC29YsoyZTmDsEII+xrDc
F6FCtaoWogKqaiqV1D0DZoZwpd7J/Au7K/KOK1p+5iqQF2ieClw=
      "], 
     Association["Book" -> 10, "Theorem" -> 23] -> CompressedData["
1:eJztjLsNAkEMRBcJy5/12LvcBaS0RAnXALXSEV4ISCmAJ2uCNyPfjsf9OLXW
niv+/Mh1zAgzLMxm5Swwwr1OdZmxbZVx+YwQa7QHJvyNapTbyuyYq4RizMys
98zcmMisEbfOJGZKbO4iLCKuQpzJ3b+IOyIgeUaH9aoU+gJJdAoi
      "], 
     Association["Book" -> 10, "Theorem" -> 24] -> CompressedData["
1:eJztjMsNAjEMRIOE5U88drK7DdASJWwD1EpHOHDgSgE8WXN4M/LtfNzPS2vt
ueLPr4wZYYaF2aycBUa416kuM/a9MrbPCLFGR2DC36hGub3MgblKKMbMzPrO
zI2JzBpx60xipsTmLsIi4irEmdz9i7gjApJXdFivSqEvOcwKCg==
      "], 
     Association["Book" -> 10, "Theorem" -> 25] -> CompressedData["
1:eJztjN0NAkEIhNdEwj97CxVcS5ZwDVirHcn64qsFOCFfyMDMeT0f122M8dr4
62dVhKqIi6tWsypLsr0elu1kZnMdDe+ncPXK8Op9iyXcOhaevvZxR9Za1uXM
OBDAbACORCAVBrQwIiQiYwKcE9W+4g/jmPeuElUTdn4DLR4KAQ==
      "], 
     Association["Book" -> 10, "Theorem" -> 26] -> CompressedData["
1:eJztjMsNAjEMRINE5PFnrIQlKzjSEiVsA9RKR2S1F44UwDvY1huNH9vruZ1K
Ke99/PkdMxy4D4C3Me5cM9MyVXfTloXkpc+TnB6JsSRXHjUNw3Q0EL6HDLTe
+3X+FtECkWCBFmq1MJUaGu5VVSla5674QoCWvXU/U81dPYz2Afe/CWA=
      "], 
     Association["Book" -> 10, "Theorem" -> 27] -> CompressedData["
1:eJztjTEOwlAMQz8SX2nyEzulrcTKlThCL8BZuREpCzN73xAlsWU/9tdzv7TW
3sc4+YN7OEDFGFkzk6nc5gMz1icQVCKhQJk2DmSsTHw9s9pCq4CVYB5iXcEl
rKJFpEnvZq1ru0mPcO9i7tOkBVRFq2L4jxLrrKYrHLWHD+oHP4EKRg==
      "], 
     Association["Book" -> 10, "Theorem" -> 28] -> CompressedData["
1:eJztjUESAkEIA8cqp1gYSNjV8e6XfMJ+wLf6I1kvnr3bhxSQFLnvz8d+aq29
DvnzC+ERVIyRpZlM5VwPzFiXQLlEQoEK3TiQMZn4ZFa1Kw3gJJiHWVvwElaf
RaRJ72ata9ukR7h3Mfdl0QKqolUx/EuZtVbTGY6awwf1DTDxCjE=
      "], 
     Association["Book" -> 10, "Theorem" -> 29] -> CompressedData["
1:eJztjLERAkEMA80MHvtsWf9D8DktUcI3QK10hI8LiMnZQMHK1v18Ps6LiLxm
/PmJY99JqmZnNswgi4wxTeXynGhX3X9yEREMog24dMyXjB42MzFVQNTkZloF
qIFwN3fncLNts8SXAQTQu9e+7aIYFW/gBQly
      "], 
     Association["Book" -> 10, "Theorem" -> 30] -> CompressedData["
1:eJztjLENAlEMQ4NE5FziWCc2YCVGuAWYlY3I5wpqel7h4jnx/Xg+jouZvVb8
+Y19l+RekzWoUmopsUzX6bXwqab/5ElmKsUx1KlzvVTOLgCDO2kOu8G7SQfF
CESEtgAkFL9sZCZn9zq3U7Sy8w3QtglW
      "], 
     Association["Book" -> 10, "Theorem" -> 31] -> CompressedData["
1:eJztjDEOQjEMQ4tEiJrGTtWRjStxhH8BzsqNcGFgZucpciLHye143I9Ta+25
5c+PXFmMmNIpatZaquglBwXp2OMnFOrk5Hqzna4QCQ5qqUJVpd6aeXMzoGlY
bshMczAvFxfsW+gjv0Qm+tD9mQNaAIF4ARq0CfI=
      "], 
     Association["Book" -> 10, "Theorem" -> 32] -> CompressedData["
1:eJztjMENQjEMQ4tEftQ0dqpuwEqM8BdgVjbChQNn7jxFTuQ4uZ2P+3lprT23
/PkVFiOmdIqatZYqeslBQTr2+AmFOjm53mynK0SCg1qqUFWpr2be3AxoGpYb
MtMczONwwb6FPvJLZKIP3V85oAUQiBcKjgnY
      "], 
     Association["Book" -> 10, "Theorem" -> 33] -> CompressedData["
1:eJztjEsOwjAMRI2EXcefpkosUthxJY7QC3BWboSzYt09bzHyaJ78PN6v4wIA
nxl/TtOHSH30qEmGe7gL1qFhbiPCbPSIntJdYnMfNYUJs+9l3ruHzfHGKW9W
8ikiAhKVAqjQFnSzmlWMiJhZlBcyI7Uf2bWUdV2vIhqa9Na+MjQKMQ==
      "], 
     Association["Book" -> 10, "Theorem" -> 34] -> CompressedData["
1:eJztzLENQkEMA9BDIrnLJXEC5EsIKlZihL8As7IRHxpqel5hybLk2/q4r7vW
2vMdf79biBLXyrQ8WwAGDElcygNxLou81nKiqZiVoZGFjzEAKaQGyt5jjHJP
t+2TiBoxizTSduzkZrHVaczcexftwu6s9sVmQ8Qc+zknVHQeEC8N1AnN
      "], 
     Association["Book" -> 10, "Theorem" -> 35] -> CompressedData["
1:eJztzTEOwkAMBMBDwo59Xm9OBHFp+RJPyAd4Kz/ikoaeOlOs5LVkP7f3a7uU
Uj57nP4gAi4dw4I5855pDkZnm/e+jeiL1GBd0YLoeXAjfeVo2op9SXskGnCc
lCKq7kWi3CZJYB5jhaqaWQ2bNFMDPwqEO5nXOl6FRzTyC+x2CZw=
      "], 
     Association["Book" -> 10, "Theorem" -> 36] -> CompressedData["
1:eJztjNENw0AIQ4lUZMxx1SkbdKWMkAU6azcKXNXmP995EjYGya/9ve2LiHxK
bq6wNpItYZvWvNcl96kl7IMc8/LNpHsO/4xfimxMEwOii5kE1MMVGgxSAeQL
aqawk9y9P7P0UcnpZQdL/AfT
      "], 
     Association["Book" -> 10, "Theorem" -> 37] -> CompressedData["
1:eJztjNENw0AIQ4lUZMxx0a2QlTpCFsis3ahwUZv/fOdJ2Bgkb/vx3hcR+ZQ8
3KKRbAnbtOa9LrlPLeE6yDEvZybdc/hn/FJkYZoYEF3MJKAertBgkAogX1Az
hV3k7n3N0lclp5d9AUGeB8I=
      "], 
     Association["Book" -> 10, "Theorem" -> 38] -> CompressedData["
1:eJztjEsOwjAMRINE5PFnHKWlgS1X4gi9AGflRqSsWLPmLcajN5Lv+/Oxn0op
ryP+/AbAcR2DW2ZaptptmmVdSS59VnJ6JLY1OYgPGobpaCD8GBlYeu+X+U9E
C0SCBVpCq4Wp1NBwr6pK0TpvxRcCtOyt+Zlq7uphtDev8wjp
      "], 
     Association["Book" -> 10, "Theorem" -> 39] -> CompressedData["
1:eJztzE0OwkAIBeAxEcrwMzQz1LbuvJJH6AU8qzeSrty79lsQXnjhcbyex6WU
8j7H349GeBruZmHG4HcJNV1HqO4jYjD7zststnoWTkS21SWXzRY9jzfK8qw1
3wFAAcRaC0jpE5iqZ2RFRCJioQlVUfQrs9TaWrsyS0gavX8A9vYJ0Q==
      "], 
     Association["Book" -> 10, "Theorem" -> 40] -> CompressedData["
1:eJztzMEJQkEMBNAVTHazSSZRI3wED7ZkCb8Ba7Ujv168e/YdBoaBua2P+7pr
rT3f8feryrRcLAADhiQu5YFYyiKvdT7RVMzK0MjCxxiAFFIDZe8xRrmn2/ZG
RI2YRRppO3Zys9jqNGbuvYt2YXdW+2KzIWKO/ZwTKjoPiBfaFwl3
      "], 
     Association["Book" -> 10, "Theorem" -> 41] -> CompressedData["
1:eJztzEEOQjEIBFBMpC0w0GoT3XwXXskj/At4Vm8kdeXatS+EZCCZ+/587Aci
eq319zOkie4+3ZvAsMXoMTcMxe0ym2poYFjg2j9EPCwir3ru6xkCx4BlGTMT
l6JKLCTMrTFW5FpzqlktrGD5ksHllJVHiJpJNI94A6VYCNQ=
      "], 
     Association["Book" -> 10, "Theorem" -> 42] -> CompressedData["
1:eJztjDEOwkAQAw+J09m7602OJCBKvsQT8gHeyo84qKipmcK2bMm3/XHfD6WU
51v+/M62XnXJTMskN0Dzskg69RGl0SOxLamz8IFhGJ0Mgr9HBebe+zq+WmNB
a6EClmC1MLYaDPdKUo11eMUXDZiyT92PornTw2QvjvAIsQ==
      "], 
     Association["Book" -> 10, "Theorem" -> 43] -> CompressedData["
1:eJztjMsNwkAQQxeJ1Xg+nmSzJIgjLVFCGqBWOmLDiTtX3sG2bMn3/fnYT6WU
1yF/fmC98ZqZlqm2AZx7J7m0EcnRI7H25EZ80DCMjgbCj5GBubV2GVciWiAS
LNASWi1MpYaGe1VVitbhFV8IMGWbmp+p5q4eRnsDhG4Inw==
      "], 
     Association["Book" -> 10, "Theorem" -> 44] -> CompressedData["
1:eJztjLsNwzAQQxUgwvE+PEGWod4rZQQvkFmzUeRU6dPmFSRBAjzO5+O8lVJe
l/z5hcmZmZapNgFuY5Dc+ork6pHYR3ISHzQMq6OB8GtkYOu97+tJRAtEggVa
QquFqdTQcK+qStG6vOILAVr21vxONXf1MNobdzMIhw==
      "], 
     Association["Book" -> 10, "Theorem" -> 45] -> CompressedData["
1:eJztjDsOAjEQQ4NEFHtmPErYT8+VOMJegLNyI7JU9LS8wrZsyffj+TgupZTX
KX9+QntmWia5AerLIuk2ZpRmj8S2pHbhA8MwOxkEP0cF+hhjnUetsaC1UAGL
WC2MrQbDvZJUY51e8UUDeo4+/CqaOz1M9gZo2ghx
      "], 
     Association["Book" -> 10, "Theorem" -> 46] -> CompressedData["
1:eJztjDsOwkAMRI2EN/7OJspSpeJKHCEX4KzcCC8VPS2vGM/TSL6fz8d5IaLX
jD+/sXVgB0TD15HoOEZ0P8ZtFwuYlWAd9kHU4HWjAnOEWuaao/4wM3FrZsRB
yizCPpWXpbq4SWML1i9KUrdMv4aau7ok8AZvTwhm
      "], 
     Association["Book" -> 10, "Theorem" -> 47] -> CompressedData["
1:eJztjLsNAkEQQxeJ1Xg+nrth4XJaooRrgFrpiD0iclJeYFu25Pv+fOyn1trr
kD8/kpmWqbYBXMcgeakZydkjcRvJjfigYZgdDYQfIwNrVV3njYg2iAQbtIV2
C1PpoeHeVZWifXrHFwIsWUv5mWru6mG0N1XQCE0=
      "], 
     Association["Book" -> 10, "Theorem" -> 48] -> CompressedData["
1:eJztzLsVwkAMRFFxDkIafRevG6AlSnAD1EpH2BBQACk3eMEEc9se9+1ERM8j
f79ap6/TrKGVlY2uoVpVZg7DyLVHfSAcQFbsux3pQPaS2F+YhUQkghi0gOEs
wuGsynhjNmPB10V15HXOPDci3SKt/AVPiAhK
      "], 
     Association["Book" -> 10, "Theorem" -> 49] -> CompressedData["
1:eJztjDsOAkEMQ4PEaOL8Fle7LVfiCHsBzsqNyFDR0/IiWbEc534+H+dFRF5L
/vzMzmOHE6iNrCpSUQtHgbwVa/sAN/SYt+oK+6IrjH4CqNgYkWKQnCMs0BZh
NtDMdr3qF1M1nEG7JjQdHp72BmMTCGE=
      "], 
     Association["Book" -> 10, "Theorem" -> 50] -> CompressedData["
1:eJztjDsOAkEMQ4PEaOL8FlfQciWOsBfYs+6NyFDR0/IiWbEc57kfr/0iIueS
P7/D+wNOoDayqkhFLRwF8las7QPc0GPeqivsi64w+gegYmNEikFyjrBAW4TZ
QDPb9apfTNVwBu2a0HR4eNobVmcISg==
      "], 
     Association["Book" -> 10, "Theorem" -> 51] -> CompressedData["
1:eJztjMENw1AIQ6lUBNh8UEboSh0hC3TWbJTQSxbose9g2Zbs1/557w8ROUb+
/IDuCFYVOEqumoAuXKGL1V8i7oZjK2ZCXBdmJqaaEDdJU4Khhkx3c/cVrrZt
xrzxTAQT/VyVXExixQlAMwhT
      "], 
     Association["Book" -> 10, "Theorem" -> 52] -> CompressedData["
1:eJztjMsNAkEMQ4PEaOL8Fp/2TEuUsA1srdsRGU4UwJEXyYrlOM/jfB03EbmW
/PkFO5xAbWRVkYpaOArko1jbB7ihx7xVV9gXXWH0B0DFxogUg+QcYYG2CLOB
ZrbrVb+YquEM2j2h6fDwtDdAFAgh
      "], 
     Association["Book" -> 10, "Theorem" -> 53] -> CompressedData["
1:eJztjN0NwkAMg4PU08X5az0CKzFCF2BWNiLXpw7AI18kK5bjPM/363yIyGfJ
n58AJ1A7WVWkohaOAnkUa7+AG3rMW3WFfdEVxnoAFRsjUgySc4QF2iLMBprZ
rle9MVXDGbQtoenw8LQvM64ICg==
      "], 
     Association["Book" -> 10, "Theorem" -> 54] -> CompressedData["
1:eJztjNENw0AIQ4mUOzAGRboNslJGyAKdtRsV+pMF+llLfhgkfN6v695E5N34
6zdiAmA2j0Yt4QhklsGvFvvi7iDQsWa9JOtf1QRmkVJh2QjGgBVVp6p6eTqm
2qNhdnis8J3WlaxWfAD28Qdw
      "], 
     Association["Book" -> 10, "Theorem" -> 55] -> CompressedData["
1:eJztjNsNAlEIRDGRwAxzNxorsCVL2Aas1Y6E9cMG/PQkHCbhcd+fj/1kZq/R
nx9RaDTmEaGN2CARag+riAI5xgzx2bz1eTfLCC3LtBWeooeLqvKIQJcTHvml
87W58EzMy+PZG+hNB1o=
      "], 
     Association["Book" -> 10, "Theorem" -> 56] -> CompressedData["
1:eJzti8ENAjEQA4NEtPbuOlEuXAG0RAnXALXSETleFMCTediWLd+P5+O4lFJe
p/z5FYC2OSVtY0WpNUfDPpt24QPTsTo5hDhHJbYxxm29zVhglipgSVZPp9Vk
RlSSMtblFV8Y0NvoPa6iRzDS5W8VSQfV
      "], 
     Association["Book" -> 10, "Theorem" -> 57] -> CompressedData["
1:eJzty80NwjAMBWAjYde/SdUktBxZiRG6ALOyES6XLsCR7/DkJ9uP/fXcLwDw
PuLvZ2x4+NqHH9Gbar3rmCPW2uOLOTYZOWwx/FjeOI9nl3xGREAiEUCDZcJw
r1nViYiZ1XgidzI/ZTeRUspV1bql1pYPXs0Iwg==
      "], 
     Association["Book" -> 10, "Theorem" -> 58] -> CompressedData["
1:eJzty7sNAkEMBFAjYe/6b2BXuuQCWqKEa4Ba6Yg7Ehog5AUjjUZz356P7QQA
ryP+fmd6RizTMtY5byhaMiq1asRH75E8orRi2DFGH+7ltn8REZCIGVDh2tDN
cq9iRNRaY21M7qT2RWad2TzOIpLKKpfIN0YBCG4=
      "], 
     Association["Book" -> 10, "Theorem" -> 59] -> CompressedData["
1:eJztjEsKAjEQRFuwkvQnlYwDIgyz8EoeYS7gWb2RiSsP4NJH0/Cq6bofz8dx
EpHXXH9+CHvjukW32K9rMaMxujNu7YNqpZMjtUubR2rU6OHjFYAgJTOBigKl
IKYi5zHZPSdYQL8YUnUZledQc1eWSr4BHxMH3w==
      "], 
     Association["Book" -> 10, "Theorem" -> 60] -> CompressedData["
1:eJzti8sNwzAMQ1VbMS1RsVCgC3SljpAFMms2qt1b7z32geCBn+dxvo6biFzL
/vySTJKZALlzOAZGBpPjg8PBWS23VRJ4zJzzWSsEpQBSuwBba96KAqGqZgad
8q7bN2H3CNN9rq2xe/gbyk8Gvg==
      "], 
     Association["Book" -> 10, "Theorem" -> 61] -> CompressedData["
1:eJzti8sNAjEQQweJ0Xg+TjawFEBLlLANUCsdkXCiAI68gy092ffj+ThOIvJa
8een7CTbFeDqQMO4NW6MD6jAdFw+V7Cwjd73eVR1gVlR4HJxnVs3La9MdXea
q1YqvjCg9zFGnumR6ZnBeAPzKgeZ
      "], 
     Association["Book" -> 10, "Theorem" -> 62] -> CompressedData["
1:eJzti8sNAjEQQweJUTwfJ5rVijstbQnbwNZKRyScKIAj72BLT/bzvI7zJiKv
FX9+C8n+ALja0VF7Z9E/IB3TcflYwcRWY+zzp2qC1pICk810bq1pWkaombGZ
aobiiwaMUVVxp3mERTj9DesnB4o=
      "], 
     Association["Book" -> 10, "Theorem" -> 63] -> CompressedData["
1:eJzti8sNAjEQQwcJJxlPJh9tVoIjLVHCNkCtdETCiQL2yJNl6cny43g9j4uI
vFf9OZmx51J9jC3RSbbq97bzS1LSyOqdLGu8KXNu2eYNgCAEUqCiQEqwpYhx
JprFAHPoD1Ncuxe7mnIzLanX/gHuxQeG
      "], 
     Association["Book" -> 10, "Theorem" -> 64] -> CompressedData["
1:eJzti8sRwkAMQ5cZtLHlT5JhOXKgJUpIA9RKR3g5UQBH3kHWG43vx/NxnFpr
rxl/fs3w1W7jehF6kiXrNvhBlGl1vSLnmMqILUZ9AWjonWzwpoAIbCqWpboY
pYMO/aIkdI+wsyvN1CQy3+QvB18=
      "], 
     Association["Book" -> 10, "Theorem" -> 65] -> CompressedData["
1:eJztjMENAjEMBI3EJrG98eVAunvxoCVKuAaolY5weFEAT0aWpVnLez+ej+Mk
Iq+5/vwcDuNtuzazsODw4L58UO3hEZnaZZnHUHYOej4BEJRiJlBRoDVwKmrN
qe61wAj9IqXrmpVnqrlrtB7xBuoJB3U=
      "], 
     Association["Book" -> 10, "Theorem" -> 66] -> CompressedData["
1:eJzti9ENwkAMQw+JXG0nOeh1gq7ECF2AWdmIlC8G4JMn2bJkez+ej+PSWnud
9uf33IHMHEMYFcIz9cFTENxL8LP0QM5t8/qYsS2k1CpMGt1AiAZ0krHQutjx
RVUj17nqGrV05k2hN73vBxc=
      "], 
     Association["Book" -> 10, "Theorem" -> 67] -> CompressedData["
1:eJzty8sNAkEMA9AgEcXOZ6JhK6AlStgGqJWOmOFEA9x4Bx8s+34+H+dFRF47
/n4AqGU4BvoY1eUfSMfqavexoxKzu8e6qFJgliWg3KhrS9NkRijJMqpmKL4Y
0D3njGsREYzw8jfBYQcv
      "], 
     Association["Book" -> 10, "Theorem" -> 68] -> CompressedData["
1:eJzti8sNwkAMRI2EHX/Gmyi7C+FIS5SQBqiVjthwoQJuvMNonkZz35+P/URE
ryP+/ILaWnWfb96XzCtaflDNzfooW3Yc40UbsMDGg5mJRcyIg9aJE5iHOkRE
VT10EkACX4aHWSnl7B4tLKIu6xsCYwgD
      "], 
     Association["Book" -> 10, "Theorem" -> 69] -> CompressedData["
1:eJzty8sNwkAMBFAjYXv934BTQFqihDRArXREwoUKuPEOI41Gs+3Px34BgNcZ
fz/R6x3VUnuW1ez8GCNTOqdVtp9jjY6Y4ccBEQGJRAANbozhXkdVJyJmFmOh
CDL/Ivch4pFXVU0T0yXrDeP4B6Y=
      "], 
     Association["Book" -> 10, "Theorem" -> 70] -> CompressedData["
1:eJztjMsNwkAMRB2J2V1/dhJAIue0lBLSALXSEV5OaSA3nixLb2zNdrz3YxKR
z1h/ruH1bGY0xuKMdf6h2ulkpvaYx5EaPZbw/AcgKMVMoKJAa4ihqDWnutcC
C+iJlK73rLyFmruydfILxDIHJg==
      "], 
     Association["Book" -> 10, "Theorem" -> 71] -> CompressedData["
1:eJztytENwjAMBFAjcUl85zitWgZgJUboAszKRqSIFfjj6XzSSb4fz8dxMbPX
WX8/sjVyJ5euEbfx4S6Jk5KDG5nOiCX2+Q7AUAppcHOgNcQ5UetMlWoBA/7F
eXN0X3vqGk7Js/XMN7RLBwI=
      "], 
     Association["Book" -> 10, "Theorem" -> 72] -> CompressedData["
1:eJztzNENwjAMBFBzTmPnYiVB6gKs1BG6QGdlIww/TMAfTyd/nE5+nNdx3kTk
+T5/v2LG3cg1Jtf8cGc0zskYHFzGnkXacw1AoOomqFKx1YquaoZSMsWtKEg0
fiGC7c7eyzALb77l1xePgAZn
      "], 
     Association["Book" -> 10, "Theorem" -> 73] -> CompressedData["
1:eJztjLENgEAMA4MEJPZjpB+BlRiBBZiVjUhomICOKxydYnk7zv0YzOyq+PmM
DnS0BrTMggSEolXkn48pyySNPq+yCJNPkjC5loX06ngawz1eEEH1HB3LmGN5
bpDmBpg=
      "], 
     Association["Book" -> 10, "Theorem" -> 74] -> CompressedData["
1:eJztjdsJgFAMQyt4bRKsoBu4kiPcBZzVjWw/xAn881AKeUD2fh59MLOr3s93
kBslUvkLgMxjOazwUdnN2OTTEgZYeIsINo95lrw6nkpwxwsBxZoDY6kcSQM3
gOAGcQ==
      "], 
     Association["Book" -> 10, "Theorem" -> 75] -> CompressedData["
1:eJztyssNwmAMA+AgEaXJn9iF8hBHVmKELsCsbETKhQm48R2sWM59fT7WnYi8
tvj7IQ6wzvOM5cPjwgB4JXi4bWOguNSpX81MTDVC1OVoWpWpFpnT5A3u5qSP
/OqxKwp7DPRdOehv23YHxA==
      "], 
     Association["Book" -> 10, "Theorem" -> 76] -> CompressedData["
1:eJztyssNwkAMBFAjYcef8W60WHzEiZYoIQ1QKx2RcEkF3HiH0Yw0j+X1XA5E
9N7i75e85sxLr/xSzZtt/ZqFfvc6awEzbH0yM7GIGXHQmDiBvk6HiKiqh06S
KYGdAGHWWju6R4VFnMb4AMOHB38=
      "], 
     Association["Book" -> 10, "Theorem" -> 77] -> CompressedData["
1:eJztjdEJgEAMQyt4Ng1WECdwJUe4BZzVjUx/dAL/fJRA0kD2fh59MLOr5OdT
goygtAAidFGJ2B6not5Gn5Y0wNJbZkbznGfSq+NyhDteAmCuGhjLaUQBbnES
Bk4=
      "], 
     Association["Book" -> 10, "Theorem" -> 78] -> CompressedData["
1:eJztyrENAkEMRFEjYZ+9M7ZXBJAcAS1RwjVArXTEQUIFZLzgSyPNbXvct4OI
PN/5+6050bzkR3h3rNXoXoke1/JzcjL2n6qKmkWIQk6LJtn7HDQzdx/wxTIN
/DISEVV1xEAjgNn1Aq/+B0M=
      "], 
     Association["Book" -> 10, "Theorem" -> 79] -> CompressedData["
1:eJztyrENQkEMg+EgEcVO4tM7QKJmJUZ4CzArG3GIhgXo+ArrL3zbH/f9YGbP
9/z92HnoKnx0IjGUEGoF1NjmnJd1i6AhomWgiZ6dDG92lZNU0H0lvgQwx9xO
dRSzipWpfAGEdAau
      "], 
     Association["Book" -> 10, "Theorem" -> 80] -> CompressedData["
1:eJztissNwlAQAxeJ1dr7yctDSJxpKSWkgdRKR2w40QA3RvYcLD/3Y9svIvI6
9efXLPUo/4B0OJbqIk5VYp1j3PulSoFZloByo/aXpsmMUJJlVI2evzBgHXPO
uBYZ0fHyN3+DBqE=
      "], 
     Association["Book" -> 10, "Theorem" -> 81] -> CompressedData["
1:eJztzMsRg0AMA1AzE2HLa2OyVJCWKIEGqJWO+FxSQW55B40OGn22fd0GETnu
+Pu5mqs/GEUyMzjRi84KZi3ZrhGgoqoRAkon2GCKcJiBD9xV+TWazfleer6u
u2we6VM7AYiXBro=
      "], 
     Association["Book" -> 10, "Theorem" -> 82] -> CompressedData["
1:eJztytENgzAQA9BDqpOc75JwAvW/KzECCzArGzX0iwX465NlyZI/+7Htk4ic
V/09b1n5U5Szkb0G2WoPvpXus6/jA0CQEilQUaAU+DWR80g2ywl06M0YVaM2
e7lyMW0lenwBdLAGhQ==
      "], 
     Association["Book" -> 10, "Theorem" -> 83] -> CompressedData["
1:eJztyrENAgEMQ9EgEcV2ErgcLMBKjHALMCsbcYiGCah4xZcL37bHfTuY2fOd
vx9ofJQgnFpopHBBF5aZue6XCBoiqg20la4Sw4uV6SQ76L5PfAlgOc+seWwq
kym1XmnUBnQ=
      "], 
     Association["Book" -> 10, "Theorem" -> 84] -> CompressedData["
1:eJztjNEJwzAQQ69Q2dadfe5Bf/vRlTpCFsis2aiXkBnylSchEAh9l/W3PERk
2+PmCuYBOdzCXU3nVI9PsI/+6u9cABCUoiqgEGgNY6+oNV3NaoF2UFMnyD9G
Xj6dakZvw+MPeDsGig==
      "], 
     Association["Book" -> 10, "Theorem" -> 85] -> CompressedData["
1:eJztyrsNhEAMBFAjnfHfi0XAprRECTRArXTEEl0HF92TZjTB7Od1nBMA3G/9
/UTfet9ElvSsWrKy3pmSbVjHAZGAEFWBCJTQ2ARJ1eeZmNmFkVoj8y8eiVCt
T1iYmYeGPJLKBvk=
      "], 
     Association["Book" -> 10, "Theorem" -> 86] -> CompressedData["
1:eJztycsNwkAQA9BBYuL5b0Y5JFdaooQ0QK10xOZEB5x4ki1Lfpyv53kjovdV
f7+xH8euulZU91pdfc3SGtM2f2YQmM0IIAO7uDLMYlkgIqHCGAMeXzKTadb3
9HT3SEv9AIjyBuE=
      "], 
     Association["Book" -> 10, "Theorem" -> 87] -> CompressedData["
1:eJztycsNwkAQA9BBYuL5b0a57JWWKCENUCsdsTnRASeeZMuSH+fred6I6H3V
34/MOVX3iureq6uvWVpjOdbNDAKzGQFkYBdXhllsG0QkVBhjwONLVjLN+p6e
7h5pqR9//wbL
      "], 
     Association["Book" -> 10, "Theorem" -> 88] -> CompressedData["
1:eJztycsNwkAQA9BBYuL5b0Y5wDUtpYQ0QK10xOZEB5x4ki1L3s/Xcd6I6H3V
3688H6prRXWv1dXXLK0xbfNlBoHZjAAysIsrwyyWBSISKowx4PElM5lmfU9P
d4+01A918gay
      "], 
     Association["Book" -> 10, "Theorem" -> 89] -> CompressedData["
1:eJztybsNw0AMA1AZsEz9z0KK1FkpI3iBzJqNfK6yQSo/gAQBvo7P+1iI6HvV
7W+eqntFde/V1dcsrTE95skMArMZAWRgF1eGWWwbRCRUGGPA40dmMs16TU93
j7TUE2xiBpo=
      "], 
     Association["Book" -> 10, "Theorem" -> 90] -> CompressedData["
1:eJztycsNg0AQA9BByuD5LyMqSEspgQZSazpiOdEBpzzJliW/j+/nWIjod9Xf
c1S3iureqquvWVpj2ufHDAKzGQFkYBdXhlmsK0QkVBhjwOMmM5lm/UpPd4+0
1BNjTwaD
      "], 
     Association["Book" -> 10, "Theorem" -> 91] -> CompressedData["
1:eJztycENgDAMA8Ag0caOkx8LsBIjsACzshGFDxvw4iRblrzux7ZPZnbe9ftQ
KTJuCsaYrFBVaVzuMAJZ9oyWykYk0727e4z0YHe8GrAwl4xZBEWJ5AU/kwXi

      "], 
     Association["Book" -> 10, "Theorem" -> 92] -> CompressedData["
1:eJztjLENgDAMBI1EZF/iJAImYCVGYAFmZSOckgWouOKK/9fv53Wck4jcQz9f
4hR6dxptiI0SEE1KKgruoiYrKZcM6u5mRuRKslpN7cXCsm51LjnGjEMeRYEG
EA==
      "], 
     Association["Book" -> 10, "Theorem" -> 93] -> CompressedData["
1:eJzti8sJgEAMBSMYssnmp9iALVnCNmCtdmQ82YEn5zDwGN4+zmNMAHA9+vmU
lIzY0rPUs5YtuVgFIgJCFAFkWAnNVJFEtTUunJk4gru+VKzp5rN3V61HD74B
YqoGog==
      "], 
     Association["Book" -> 10, "Theorem" -> 94] -> CompressedData["
1:eJztyMsNg0AMhGFHwsaP8YI2G4S40VJK2AZSazoCTukgJ77DL83s/fPuDyL6
Xrn9l7XMXLNh2rwt+gJm2PkzM7GIGXFQHTmB6ZwOEVFVDx0lUwI/AoRZKWVw
jxYW8az1AFdSBnk=
      "], 
     Association["Book" -> 10, "Theorem" -> 95] -> CompressedData["
1:eJztytENgDAIBFBMrPQoUKsTuJIjdAFndSPplxv45UvuEsgd/Tr7RET3qN/H
YKaoiNqBhhI83iJCwosbMWjjpKpIbK6lMAKPy5jzCzm31Vu1WSBjY5EHPxAG
Ag==
      "], 
     Association["Book" -> 10, "Theorem" -> 96] -> CompressedData["
1:eJztyLENg1AMhGFHio397p6NKNJEFKzECG8BZs1GQJUNUuUrfuluG8c+HiLy
ufP3a1nIehPV1vRX58y4XlUVNYsQhSyTdrKu2Whm7t7gk/Vu4JeRiMjMJxoK
AcyVJ0g6BkI=
      "], 
     Association["Book" -> 10, "Theorem" -> 97] -> CompressedData["
1:eJztycERgzAQQ9FlJjvatSXbARqgJUqgAWqlI8gpHeSUd/gHaTvO/ZjM7Prk
7+eWpkW9aZaK5nfv6zMCMLiT5rAKj1LSQTICEcEMxxio/EpS5Gj9papSnyuV
N0YXBkE=
      "], 
     Association["Book" -> 10, "Theorem" -> 98] -> CompressedData["
1:eJztycERg0AMQ1FnJh7Zu9LikDRAS5RAA6k1HQVOdMCJd/gHadm+6/Yws9+R
2/WG3pqGZqlpflV99g2AwZ00h3V4tJYOkhGICGY4qtB5SlJkjemprtb3K5V/
P8sGMA==
      "], 
     Association["Book" -> 10, "Theorem" -> 99] -> CompressedData["
1:eJztybsRgEAIBFCckeHg+JyfxNCWLOEasFY7EiNLMPEFO+yy9/PoAwBcT/w+
0DymLWoLcYvF1pyICAhRBJBhJjRTRRLVUjg5M3EEV33lM6ubj149b9MafANC
pAZK
      "], 
     Association["Book" -> 10, "Theorem" -> 100] -> CompressedData["
1:eJztyLsNhEAMhGEjYePHeA8tCwEZLVECDVArHcFFV8IlfMEvzWzHuR8dEV3f
vP4hZ+Tqy6INGGHPw8zEImbEQXXgBD7PdIiIqnroIJkS+BEgzEopvXu0sIip
1hs11QYd
      "], 
     Association["Book" -> 10, "Theorem" -> 101] -> CompressedData["
1:eJztycENgDAMA8AgEdlJmxIQC7ASI7AAs7IR5cUIfDhZftjbce7HICLXU79P
xNxiibDIOXPtAwCBqrsopEDpbgr3SoJkNSoyUeqLPa2xTmOU8NIvC7sBLKsF
7Q==
      "], 
     Association["Book" -> 10, "Theorem" -> 102] -> CompressedData["
1:eJztx7ENgEAMBEEjYWP/3fsRARkBLVECDVArHQERJZAwwUq77se2dyJyPvl9
g2hlSZ8rR8b9qipqFiEKmQatZLu30MzcvcAHq9XAl5GIyMweBQ0BjC0vJgwF
4g==
      "], 
     Association["Book" -> 10, "Theorem" -> 103] -> CompressedData["
1:eJztx8ENgDAMQ9EgETltnFSCCViJEboAs7IRhQsjcOEdvuytH3ufROS88/sI
3cNrxLoudVwAAlVSFLJAI0gFk2YwsywGtAbnqzzNzNmd1ccqUS4j6gXn
      "], 
     Association["Book" -> 10, "Theorem" -> 104] -> CompressedData["
1:eJztybsNgEAMA9AgEfmSy0cCFmAlRrgFmJWNCBUj0PAKy5b3cR5jIqLrid9X
skdqeK6+1QJAYFYlFlrA7mYMNWtNSohAMqXbq86a4TFHj+puPeUGI9IF8Q==

      "], 
     Association["Book" -> 10, "Theorem" -> 105] -> CompressedData["
1:eJztx7sNgDAMRVEjYePPS0AhomclRsgCzMpGhIoRaDjFle7ezqMNRHQ9+X3G
66YVmGF9mJlYxIw4qEycgNTXISKq6qGTpCSBlwBhlnMe3aOGRaxLuQEXSwXB

      "], 
     Association["Book" -> 10, "Theorem" -> 106] -> CompressedData["
1:eJztissJgEAQQ0dwzQdHsARbsoRtwFrtyJ2TJXjxHUJeyNGvs08RcVf8fIdE
FR7ddhjLlkFGomWmGnJdbdQHw0yALyKdu625zPIY+AAF5gU4
      "], 
     Association["Book" -> 10, "Theorem" -> 107] -> CompressedData["
1:eJztx7ENgDAMRFEjYcfO2Q6ioGclRsgCzMpGhIoRaHjF193ez6NPRHQ9+X0o
dQtf3MZkZmIRM2LQWjjc27jVRURVK7RIhMBf4g6zzJxR0WDA0vIGDE8FlQ==

      "], 
     Association["Book" -> 10, "Theorem" -> 108] -> CompressedData["
1:eJzth8ENgDAMA4OEm8ZNKypYgJUYoQswKxuRTsEH62z5znFfYxGRZ86fL2N0
3/yIB0CQEikwMSBn+FSoBlqKJtBhxoCcDanWa+Pqxr1Yy33rL/ssBUM=
      "], 
     Association["Book" -> 10, "Theorem" -> 109] -> CompressedData["
1:eJztytEJgDAMBNAIhtylqdA6QVdyhC7grG5k/HQCf3xw4Tgy5nnMRUSu5/w+
xZKYRdXEyAgxSKd6cdIiAgBzNypqheGlsfW9rsXzmdwyN/7wBUc=
      "], 
     Association["Book" -> 10, "Theorem" -> 110] -> CompressedData["
1:eJztx8ERgDAIRFGcEQJZwEw6sCVLSAPWakfGkyV48R3+zu7jPMZCRNeT37fC
m9tcZiYWMSMG9cLhvs1bXURUtUKLRAj8Je4wy8wVFQkDWvYb/ioFaw==
      "], 
     Association["Book" -> 10, "Theorem" -> 111] -> CompressedData["
1:eJztxcENgDAMQ9HQpjFOokqMwEqM0AWYlY0AcWICLjx92evYtzGJyHHP72O9
97iuVghKAaTOAjQzt6JgqipJ6FN7Sy6Z1LDmtAhPPwHe7QSG
      "], 
     Association["Book" -> 10, "Theorem" -> 112] -> CompressedData["
1:eJztxcENgCAQRNE1cRzYARINMV5tyRJowFrtSDxagRff4f+9nUcbzOx68vva
uuVegEbA3SBbgBgxgZ5AUpJToZ/hLae5Vo05Bk9eipJu9Y4FLg==
      "], 
     Association["Book" -> 10, "Theorem" -> 113] -> CompressedData["
1:eJztxbENgDAQBMFH4jj/n40EskRMS5TgBqiVjjAhFZAwwe7ezqMNZnY9+X1u
Kz0AjUCEQbYC7pjAyCApKajUz/RW8lKrxuIpss+zsm7uhwUW
      "], 
     Association["Book" -> 10, "Theorem" -> 114] -> CompressedData["
1:eJztxckNgDAQQ9FBwjizRQKlAVqihDRArXREOFIBF5707b2fR59E5Hrm970c
ARQCZgKXDVDFAlqApLsbvYxnectYW/M5tVhorRZ2A+eYBP0=
      "], 
     Association["Book" -> 10, "Theorem" -> 115] -> CompressedData["
1:eJztysENgDAMQ9EgUZw4SauOwEqM0AWYlY0oJ1bgwDt8yZL3cR5jEZHrye8b
SAqx1RQ1aSgRYQVZwx1mlpgrCOjLVHurvXOlcX48Z27wIgUV
      "], 
     Association["Book" -> 11, "Theorem" -> 1] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwWAAjHh4pAEAlWgCfA==
      "], 
     Association["Book" -> 11, "Theorem" -> 2] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGweADlMQLAJUeAns=
      "], 
     Association["Book" -> 11, "Theorem" -> 3] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwaABjEgk2QAAlWYCfA==
      "], 
     Association["Book" -> 11, "Theorem" -> 4] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwSAB3BwM7Bxs7KwMnKwMHGysHBwcrECKlZOTE0iC
AAeUhgMgnwMEWFkgfHYQAQDQOwQW
      "], 
     Association["Book" -> 11, "Theorem" -> 5] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwWABHAzsHGzsrAycrAwcbGwcHBysbGysrJycnEAS
BNigNBwA+RwgwMoC4bODCADMswQI
      "], 
     Association["Book" -> 11, "Theorem" -> 6] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGweAB3BxAwMDBysDBycbOwcrKwcbOzsnJycbKysoO
4bEiAxCPA6iMnQWiAiwNAM7LBBU=
      "], 
     Association["Book" -> 11, "Theorem" -> 7] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwSADFEYKAJUhAns=
      "], 
     Association["Book" -> 11, "Theorem" -> 8] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwSACbGwcHAxcrAwcHGwcrBysrKzs7FycnKwgBpgA
k3AA4nEAVbGzsIF5YGkAyekEAA==
      "], 
     Association["Book" -> 11, "Theorem" -> 9] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwWACvOysDKwcDKzcbOwcrOy8nKysXGxsHKysrOzs
rNwcIBIVAIU4OFhZgBJs7BBZAMu1BBE=
      "], 
     Association["Book" -> 11, "Theorem" -> 10] -> CompressedData["
1:eJztycENQFAYBOFfYnZXiB60pITXgFp1xIuTDhx8h7nM1o69DVV19vw+xRQu
Zjl4nUBSAJslvW8hCeM95Odex8gEAg==
      "], 
     Association["Book" -> 11, "Theorem" -> 11] -> CompressedData["
1:eJztx8EJgEAMRNEIhsxMgtiCLVnCNrC12pHxZAsefIcP/xjzHIuZXU9+37Kb
wyTPFD22ikZS0QdF4EWglFVaKUCsLm7XfgSY
      "], 
     Association["Book" -> 11, "Theorem" -> 12] -> CompressedData["
1:eJztx7ENgEAMBEEjYfnu7OBboCVK+AaolY4wES0Q/AQr7TGvc25mdr9Z/sZh
kmeKHqOikVT0QRH4EChllXYKEKuLB9PiBIo=
      "], 
     Association["Book" -> 11, "Theorem" -> 13] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 11, "Theorem" -> 14] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwaADrAwcrGwc7BysrKzs7JwcIBorYAdisCwLgs8K
AL5vA8k=
      "], 
     Association["Book" -> 11, "Theorem" -> 15] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwSAEnNysXBxcnFxcXBysrKwgGsTkBAJuTjTABcQs
HFxAxZzcQCUcANIZBIg=
      "], 
     Association["Book" -> 11, "Theorem" -> 16] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "], 
     Association["Book" -> 11, "Theorem" -> 17] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwWAE7Ky8vDycrOy8fDzc3OycnJxAAsjjZWfnQABO
Dg5+fj5+fi4WLk4ukBpeIAEA07sEow==
      "], 
     Association["Book" -> 11, "Theorem" -> 18] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwaAEbOwcrOycHOzsXGxs7KysrOzsrJxAGsSEAyAH
qIiDg52FjZ2dDayGlRUAvbED1w==
      "], 
     Association["Book" -> 11, "Theorem" -> 19] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGweAEbGxsrCAAo4kFALVAA5g=
      "], 
     Association["Book" -> 11, "Theorem" -> 20] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwSAFPHycrKwcHBzs7Kzs7OwgCshjZedAA1xAzAJi
cHJygigAwwgEJA==
      "], 
     Association["Book" -> 11, "Theorem" -> 21] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwWAFvJycHEDAyckOBFwggoMDhJEBNxcPNzcHC4jJ
xcnFBaQAxMoEPQ==
      "], 
     Association["Book" -> 11, "Theorem" -> 22] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwaAFnKysHBwc7Oys7OzsIArIY2XnQAOcQMwCZnCC
mBwAvXAECQ==
      "], 
     Association["Book" -> 11, "Theorem" -> 23] -> CompressedData["
1:eJzth8sNgFAQAtfE/bE0YUuW8BqwVjsSu/DgECZwrOtcm5ndr36+i0dNuEdm
dqcHGBQzqlJAoUnsRXZr6T7DVgRa
      "], 
     Association["Book" -> 11, "Theorem" -> 24] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwSAGnKysPGxsHKysrBzsrLwcrOzsrKiAg5WDg4OV
hZWdg40dSABFALXIA7U=
      "], 
     Association["Book" -> 11, "Theorem" -> 25] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwWAG3Jw8bGw8nJycPFyc3DwgxIECQEI8HCycXEBp
IBMoAgDDUwRJ
      "], 
     Association["Book" -> 11, "Theorem" -> 26] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwaAGPOxAwMnJycXOycoOJNk5EICTg4OHi5uHh4uF
i5uDk5uThwsoCAC9FgQk
      "], 
     Association["Book" -> 11, "Theorem" -> 27] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGweAGPOzs7JycnHzsnKzsQJIDCXBycPDw8PEIcrFw
83JxcnPy8gEVAgC97wQ9
      "], 
     Association["Book" -> 11, "Theorem" -> 28] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwSAHnJysOAAHEIEAKwuEzw4iAK2LA4A=
      "], 
     Association["Book" -> 11, "Theorem" -> 29] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwWAH/KwQwM6KBoACHCDAyoKkAACtLwN+
      "], 
     Association["Book" -> 11, "Theorem" -> 30] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwaAHrBDAzooGgAIcIMDKgqQAAKp9A28=
      "], 
     Association["Book" -> 11, "Theorem" -> 31] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGweAH3NzcPFzcrEJcvLzcHCiAEwS4WLg4uTm5ubh5
gBwAuoUEGw==
      "], 
     Association["Book" -> 11, "Theorem" -> 32] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwRAAQrzcHEJsXBxc3LzcCMABRJzcHBxcLJxcPJwg
EU4uTgC88gQ5
      "], 
     Association["Book" -> 11, "Theorem" -> 33] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwVAAvNyc4my8HJzcfDwIwMHNzcfFzcHBwcLDxcXJ
zcPNzcnFCQC85QRC
      "], 
     Association["Book" -> 11, "Theorem" -> 34] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwZAA3JxCbBwcnNx8PAjAwc3NwcXNwcfBwsPNxcnN
w83NycUJALk5BCw=
      "], 
     Association["Book" -> 11, "Theorem" -> 35] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwdAAHNysnBzcnMiAg4ODk5sXyGDh4OAGyvHwcHBy
AACxagPj
      "], 
     Association["Book" -> 11, "Theorem" -> 36] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwRABnBzsnNxc3EiAg5ubk5uPn5OThYeTj5OTm5eT
k4sTALQTBAs=
      "], 
     Association["Book" -> 11, "Theorem" -> 37] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwVABbLwcnNx8PAjAwc3Nx8XNwcHBwsPFxcnNw83N
ycUJALQXBAo=
      "], 
     Association["Book" -> 11, "Theorem" -> 38] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwZABbKzsrOyogBUoxMrKwsrBygrmsbICAKZqA14=

      "], 
     Association["Book" -> 11, "Theorem" -> 39] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwdABXLy83BwogBMEuFi4OLk5ubm4eYAcAKx5A8I=

      "], Association["Book" -> 12, "Theorem" -> 1] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwRACwuw8SICDh4eLl4+Pi4uFl4+Xk4+bl4eTixMA
sgMEDg==
      "], 
     Association["Book" -> 12, "Theorem" -> 2] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwVAC7DxIgIuHh5OLi5+fi4WHm4eDj5sXyOUEAK8q
A/U=
      "], 
     Association["Book" -> 12, "Theorem" -> 3] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwZACvFwIwAEiOIGYhZuTn5Obm5uPg4MDAKrpA8M=

      "], Association["Book" -> 12, "Theorem" -> 4] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwdACgoKCAgIC/AL8/PwcfHxcnFx8XBwsPJxcnNzc
3LxACgCwzwQA
      "], 
     Association["Book" -> 12, "Theorem" -> 5] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwRADEpISIuLiouJi4hzCwlzcvHxcHCycQMDNy83N
ycUJALbaBEA=
      "], 
     Association["Book" -> 12, "Theorem" -> 6] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwVADoqJCoqIiQMghJMTFzcvHxcHCCQTcvNzcnFyc
ALG3BBI=
      "], 
     Association["Book" -> 12, "Theorem" -> 7] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwZADEiKioiJAyCEszMXNy8fFwcIJBNy83NycXJwA
r/cEBA==
      "], 
     Association["Book" -> 12, "Theorem" -> 8] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwdADIiAgKsIhLs7FzcvHxcHCCQTcQMDJxQkArYMD
7w==
      "], 
     Association["Book" -> 12, "Theorem" -> 9] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwRAEIuKC4hLiHIJCXNy8fFycLJxAwM3Lzc3JxQkA
q2cD3A==
      "], 
     Association["Book" -> 12, "Theorem" -> 10] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwVAEkqKSkpIcfEJc3Lx8XBwsnEDADQScXJwAqgED
zQ==
      "], 
     Association["Book" -> 12, "Theorem" -> 11] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwZAEorKyshx8/FzcvHxcHCycQMANBJxcnACoUgO9

      "], 
     Association["Book" -> 12, "Theorem" -> 12] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwdAEIsLCHCIiXNy8fFxcLJxAwA0EnFycAKScA5g=

      "], 
     Association["Book" -> 12, "Theorem" -> 13] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwRAFsrIcfPxc3Lx8XBwsnEDADQScXJwAo40Diw==

      "], 
     Association["Book" -> 12, "Theorem" -> 14] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwVAFshx8/FzcvHxcHCycQMANBJxcnACg8gNu
      "], 
     Association["Book" -> 12, "Theorem" -> 15] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwZAFHHz8XNy8fFwcLJxAwA0EnFycAJ50A1E=
      "], 
     Association["Book" -> 12, "Theorem" -> 16] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwdAFHBxc3FxAwMLFxcHBCQQcHBwAnE0DKw==
      "], 
     Association["Book" -> 12, "Theorem" -> 17] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwRAGUpycXDw8XCy83BycPNzcQB4nAJ1vA0Y=
      "], 
     Association["Book" -> 12, "Theorem" -> 18] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwVAGnJxcPDxcLLzcHJw83NxAHicAm4EDLA==
      "], 
     Association["Book" -> 13, "Theorem" -> 1] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwZAG3IJiwlwsvNxcnJyc3PxAAgCccAM/
      "], 
     Association["Book" -> 13, "Theorem" -> 2] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwdAGgvz8XCxcnFyc3EAIBACa1QMg
      "], 
     Association["Book" -> 13, "Theorem" -> 3] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwRAHXFycLBxAwMXJxQmkAJjbAvk=
      "], 
     Association["Book" -> 13, "Theorem" -> 4] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwVAHwlwsvNxcnJyc3PxAAgCZWwMN
      "], 
     Association["Book" -> 13, "Theorem" -> 5] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwZAHXCy83FycnJzc/EACAJhRAvo=
      "], 
     Association["Book" -> 13, "Theorem" -> 6] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwdAHLLzcvNzc3Dw8XLycAJgYAvw=
      "], 
     Association["Book" -> 13, "Theorem" -> 7] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwTAArAgAAJYsArE=
      "], 
     Association["Book" -> 13, "Theorem" -> 8] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwXAAHKycnGxcvJxcnACWqALS
      "], 
     Association["Book" -> 13, "Theorem" -> 9] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwbAA/Jz8PNzcnFycAJb9At8=
      "], 
     Association["Book" -> 13, "Theorem" -> 10] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfAAnNy83LycXJwAlmgCzw==
      "], 
     Association["Book" -> 13, "Theorem" -> 11] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwTABPGwcnMJ8QgCWDQLQ
      "], 
     Association["Book" -> 13, "Theorem" -> 12] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwXABXHxcnFycAJXEArg=
      "], 
     Association["Book" -> 13, "Theorem" -> 13] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwbAB3LycXJwAlYUCrg==
      "], 
     Association["Book" -> 13, "Theorem" -> 14] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfAB3BycHACVQAKe
      "], 
     Association["Book" -> 13, "Theorem" -> 15] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwTACnJycAJUYApU=
      "], 
     Association["Book" -> 13, "Theorem" -> 16] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwXACfGIAlRQCng==
      "], 
     Association["Book" -> 13, "Theorem" -> 17] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwbAC/ACU8QKJ
      "], 
     Association["Book" -> 13, "Theorem" -> 18] -> CompressedData["
1:eJxTTMoPSmJkYGC4CCJGwfACAJTiAno=
      "]},
    SelectWithContents->True,
    Selectable->False]}], ";"}]], "Input"],

(code for the plot of maximum depth reduction for theorems, arranged by book; it uses the per-theorem data above)

Module[{dataA = #[[1]]["Book"] -> N[Max[#[[2]]]] & /@ resDepth, vals, acc,
   xval},
 vals = CountsBy[dataA, First];
 acc = Association[
   MapIndexed[First[#2] -> #1 &,
    Accumulate[Values[CountsBy[dataA, First]]]]];
 xval = Association[#[[1]] -> (#[[2]] - vals[#[[1]]]/2) & /@ Normal[acc]];
 Show[{ListLinePlot[Values[dataA], Axes -> {False, True},
    Filling -> Axis, Frame -> True,
    FrameLabel -> {"theorems by book", "maximum depth reduction"},
    FrameTicks -> {{True, False}, {{#[[2]], #[[1]], {0, 0}} & /@
        Normal[xval], False}},
    ColorFunctionScaling -> False,
    ColorFunction ->
     Function[{x, y},
      Piecewise[{{bookColorIntense[6], x <= acc[6]},
        {bookColorIntense[10], x <= acc[10]},
        {bookColorIntense[13], x <= acc[13]}}]]],
   Graphics[{GrayLevel[0.5],
     Line[{{#, -5}, {#, 32}} & /@ Values[acc]]}]}]]
Formalizing Euclid

Everything we’ve discussed so far is basically derived from the original text of Euclid’s Elements. But what if we look instead at the pure “mathematical content” of Euclid? We’ve now got a way to represent this in the Wolfram Language. Consider Euclid’s 3.16. It asserts that:

GeometricScene
Style[
 Text[
  Style[Entity["GeometricScene", "EuclidBook3Proposition16"][
    "Statement"], RGBColor["#777777"],
   FontSize -> 14]]]

Well, we can now give a “computational translation” of this:

GeometricScene
Entity["GeometricScene", "EuclidBook3Proposition16"]["Scene"]

And this is all we need to say to define that theorem in Euclid. Given the definition of the Wolfram Language, this is completely self-contained, and ready to be understood by both computers and humans. And from this form, we can now for example compute a random instance of the theorem:

RandomInstance
RandomInstance[%]
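
One can also ask for several random instances at once. (This small variation isn't in the original text; it just applies the RandomInstance[scene, n] form to the same scene.)

RandomInstance[
 Entity["GeometricScene", "EuclidBook3Proposition16"]["Scene"], 2]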

As another example, here’s Euclid’s 4.2:

Style
Style[
 Text[
  Style[Entity["GeometricScene", "EuclidBook4Proposition2"][
    "Statement"], RGBColor["#777777"],
   FontSize -> 14]]]

This is now asking for a construction—or, effectively, stating the theorem that it’s possible to do such a construction with ruler and compass. And again we can give a computable version of this in the Wolfram Language, including the construction:

Entity
Entity["GeometricScene", "EuclidBook4Proposition2"]["Scene"]
RandomInstance
RandomInstance[%]

It’s interesting to see, though, how the computable versions of theorems compare to their textual ones. Here are length comparisons for 2D geometry theorems:

GraphicsRow
CloudGet["https://wolfr.am/PJKo9Lnq"]; GraphicsRow[{Module[{res = 
     ToExpression[
         First[StringSplit[
           StringReplace[#[[1]], {"Euclid book" -> "", 
             "proposition" -> ""}]]]] -> ByteCount[#[[2]]] & /@ 
      EntityValue[
       EntityClass["GeometricScene", "EuclidsElements"], {"Name", 
        "Statement"}], vals, acc, xval},
   vals = CountsBy[res, First]; 
   acc = Association[
     MapThread[#2 -> #1 &, {Accumulate[Values[CountsBy[res, First]]], 
       Keys[CountsBy[res, First]]}]];
   xval = 
    Association[#[[1]] -> (#[[2]] - vals[#[[1]]]/2) & /@ 
      Normal[acc]];
   Labeled[
    Show[{ListLinePlot[Values[res], Axes -> {False, True}, 
       Filling -> Axis, Frame -> True, 
       FrameTicks -> {{False, 
          False}, {{#[[2]], #[[1]], {0, 0}} & /@ Normal[xval], 
          False}}, ColorFunctionScaling -> False, 
       ColorFunction -> 
        Function[{x, y}, 
         Piecewise[{{bookColorIntense[6], 
            x <= acc[6]}, {bookColorIntense[13], x <= acc[13]}}]], 
       ImageSize -> {300, 200} , 
       FrameLabel -> {None, "character length"}], 
      Graphics[{GrayLevel[0.5], 
        Line[{{#, -1100}, {#, 15000}} & /@ Values[acc]]}]}], 
    Style["textual", 11, GrayLevel[.4]]]], 
  Module[{res = 
     ToExpression[
         First[StringSplit[
           StringReplace[#[[1]], {"Euclid book" -> "", 
             "proposition" -> ""}]]]] -> LeafCount[#[[2]]] & /@ 
      EntityValue[
       EntityClass["GeometricScene", "EuclidsElements"], {"Name", 
        "Scene"}], vals, acc, xval},
   vals = CountsBy[res, First]; 
   acc = Association[
     MapThread[#2 -> #1 &, {Accumulate[Values[CountsBy[res, First]]], 
       Keys[CountsBy[res, First]]}]];
   xval = 
    Association[#[[1]] -> (#[[2]] - vals[#[[1]]]/2) & /@ Normal[acc]];
    Labeled[
    Show[{ListLinePlot[Values[res], Axes -> {False, True}, 
       Filling -> Axis, Frame -> True, 
       FrameLabel -> {None, "expression length"}, 
       FrameTicks -> {{False, 
          False}, {{#[[2]], #[[1]], {0, 0}} & /@ Normal[xval], 
          False}}, ColorFunctionScaling -> False, 
       ColorFunction -> 
        Function[{x, y}, 
         Piecewise[{{bookColorIntense[6], 
            x <= acc[6]}, {bookColorIntense[13], x <= acc[13]}}]], 
       ImageSize -> {300, 200}], 
      Graphics[{GrayLevel[0.5], 
        Line[{{#, -5}, {#, 300}} & /@ Values[acc]]}]}], 
    Style["symbolic", 11, GrayLevel[.4]]]]}, 
 ImageSize -> {650, Automatic}]

And we see that there is indeed at least some correlation between the lengths of textual and symbolic representations of theorems (the accumulation of points on the left is associated with constructions, where the text just says what’s wanted, and the symbolic form also says how to do it):

CloudGet["https://wolfr.am/PJKo9Lnq"]; Module[{dataA = 
   GroupBy[ToExpression[First[StringSplit[Last[#], "."]]] -> 
       Callout[Take[#, 2], Last[#]] & /@ 
     Transpose[{ByteCount /@ 
        EntityValue[EntityClass["GeometricScene", "EuclidsElements"], 
         "HeathsStatement"], 
       LeafCount /@ 
        EntityValue[EntityClass["GeometricScene", "EuclidsElements"], 
         "Scene"], 
       StringJoin[Riffle[StringSplit[#][[{3, 5}]], "."]] & /@ 
        EntityValue[EntityClass["GeometricScene", "EuclidsElements"], 
         "Name"]}], First -> Last]},
 ListPlot[Values[dataA], 
  PlotStyle -> Table[bookColorIntense[i], {i, Keys[dataA]}], 
  Frame -> True, 
  FrameLabel -> {Style["textual", GrayLevel[.5]], 
    Style["symbolic", GrayLevel[.5]]} ]]

In the Wolfram Language representation we’ve just been discussing, there’s a built-in Wolfram Language meaning to things like CircleThrough and PlanarAngle—and we can in a sense do general computations with these.

But at some level we can view what Euclid did as something purely formal. Yes, he talks about lines and planes. But we can think of these things just as formal constructs, without any externally known properties. Many centuries after Euclid, this became a much more familiar way to think about mathematics. And in the Wolfram Language we capture it with AxiomaticTheory and related functions.

For example, we can ask for an axiom system for Boolean algebra, or group theory:

AxiomaticTheory["BooleanAxioms"]
AxiomaticTheory["GroupAxioms"]

What does the operator here mean? We’re not saying. We’re just formally defining certain properties it’s supposed to have. In the case of Boolean algebra, we can interpret it as And. In the case of group theory, it’s group multiplication—though we’re not saying what particular group it’s for. And, yes, we could just as well write the group theory axioms for example as:

AxiomaticTheory[{"GroupAxioms", <|"Multiplication" -> f, 
   "Inverse" -> c, "Identity" -> e|>}]

OK, so can we do something similar for Euclid’s geometry? It’s more complicated, but thanks particularly to work by David Hilbert and Alfred Tarski in the first half of the 1900s, we can—and here’s a version of the result:

geometryall = {
   ForAll[{x, y, z}, 
    implies[congruent[line[x, y], line[z, z]], congruent[x, y]]],
   ForAll[{x, y, z, u, v, w}, 
    implies[and[congruent[line[x, y], line[z, u]], 
      congruent[line[x, y], line[v, w]]], 
     congruent[line[z, u], line[v, w]]]],
   ForAll[{x, y, z}, implies[between[x, y, z], equal[x, y]]],
   ForAll[{x, y, z, u, v}, 
    implies[and[between[x, u, z], between[y, v, z]], 
     Exists[a, and[between[u, a, y], between[v, a, x]]]]],
   ForAll[{x, y, z, u, v}, 
    implies[and[and[and[congruent[line[x, u], line[x, v]], 
        congruent[line[y, u], line[y, v]]], 
       congruent[line[z, u], line[z, v]]], not[equal[u, v]]], 
     or[or[between[x, y, z], between[y, z, x]], between[z, x, y]]]],
   ForAll[{x, y, z, u, v, w}, 
    implies[and[and[and[between[x, y, w], 
        congruent[line[x, y], line[y, w]]], 
       and[between[x, u, v], congruent[line[x, u], line[u, v]]]], 
      and[between[y, u, z], congruent[line[y, u], line[z, u]]]], 
     congruent[line[y, z], line[v, w]]]],
   ForAll[{x, y, z, a, b, c, u, v}, 
    implies[and[and[and[and[and[and[not[equal[x, y]], 
           between[x, y, z]], between[a, b, c]], 
         congruent[line[x, y], line[a, b]]], 
        congruent[line[y, z], line[b, c]]], 
       congruent[line[x, u], line[a, v]]], 
      congruent[line[y, u], line[b, v]]], 
     congruent[line[z, u], line[c, v]]]],
   ForAll[{x, y}, implies[equal[x, y], equal[y, x]]],
   ForAll[{x, y, z}, 
    implies[and[equal[x, y], equal[y, z]], equal[x, z]]],
   ForAll[x, equal[x, x]],
   ForAll[{a, b}, and[a, b] == and[b, a]],
   ForAll[{a, b}, or[a, b] == or[b, a]],
   ForAll[{a, b}, and[a, or[b, not[b]]] == a],
   ForAll[{a, b}, or[a, and[b, not[b]]] == a],
   ForAll[{a, b, c}, and[a, or[b, c]] == or[and[a, b], and[a, c]]],
   ForAll[{a, b, c}, or[a, and[b, c]] == and[or[a, b], or[a, c]]],
   ForAll[{a, b}, implies[a, b] == or[not[a], b]],
   HoldForm[
    ForAll[{\[Alpha], \[Beta], y, z}, 
     implies[Exists[x, 
       implies[and[\[Alpha][y], \[Beta][z]], between[x, y, z]]], 
      Exists[u, 
       implies[and[\[Alpha][y], \[Beta][z]], between[y, u, z]]]]]]};

geometry = Most[geometryall];

Column[Style[#, Smaller] & /@ geometryall, Frame -> All, 
 FrameStyle -> LightGray]

Once again, this is all just a collection of formal statements. The fact that we’re calling an operator between is just for our convenience and understanding. All we can really say for sure is that this is some ternary operator; any properties it has have to be defined by the axioms.

To get to this formalization of Euclid, quite a bit of tightening up had to be done. Euclid’s theorems often had implicit assumptions, and it sometimes wasn’t even clear exactly what their logical structure was supposed to be. But the mathematical content is presumably the same, and indeed some of Euclid’s axioms (like CN1) say essentially the same things as these. (An important addition to what Euclid explicitly said is the last axiom above, which states Euclid’s implicit assumption—that I now believe to be incorrect for the physical universe—that space is continuous. Unlike the other axioms, which just make statements “true for all values of ...”, this axiom makes a statement “true for all functions ...”.)

So what can we do with these axioms? Well, in principle we can prove any theorem in Euclidean geometry. For example, we can append to the axioms (which—ignoring the last axiom—we’ll refer to as geometry) the assertion that a point y lies between x and z and also between x and w, and then ask for a proof that either z is between y and w or w is between y and z:

FindEquationalProof[or[between[y, z, w], between[y, w, z]], 
 Append[geometry, and[between[x, y, z], between[x, y, w]]]]

Here’s a graph representing this proof:

%["ProofGraph"]

The axioms (including the “setup assertion”) are at the top—and the proof, with all its various intermediate lemmas, establishes that our “hypothesis” (represented by a little purple diamond on the left) eventually leads to “true” at the bottom.

As a more complicated example, we can look at Euclid’s very first theorem, 1.1, which asserts that there’s a ruler-and-compass way to construct an equilateral triangle on any line segment. In the Wolfram Language, the construction is:

Entity["GeometricScene", "EuclidBook1Proposition1"]["Scene"]
RandomInstance[%]

And now we can write this directly in terms of our low-level constructs. First we need a definition of what circles are (Euclid has this as Definition 1.15)—basically saying that if two circles centered at a that go through b and c are equal, then the lines from a to b and from a to c are congruent:

circles = ForAll[{a, b, c}, 
   implies[equal[circle[a, b], circle[a, c]], 
    congruent[line[a, b], line[a, c]]]]

We’ll call this definition circles. We’re going to do a construction that involves having circles that overlap, as specified by the assertions:

{equal[circle[a, b], circle[a, c]], equal[circle[b, a], circle[b, c]]}

And then our goal is to show that we get an equilateral triangle, for which the following is true:

and[congruent[line[a, b], line[a, c]], 
 congruent[line[b, a], line[b, c]]]

Putting this all together we can prove Euclid’s 1.1:

FindEquationalProof[
 and[congruent[line[a, b], line[a, c]], 
  congruent[line[b, a], line[b, c]]], 
 Join[geometry, 
  {ForAll[{a, b, c}, 
    implies[equal[circle[a, b], circle[a, c]], 
     congruent[line[a, b], line[a, c]]]]}, 
  {equal[circle[a, b], circle[a, c]], 
   equal[circle[b, a], circle[b, c]]}]]

And, yes, it took 272 steps—and here’s a graphical representation of the proof that got generated, with all its intermediate lemmas:

%["ProofGraph"]

We can go on and prove Euclid’s 1.2 as well, all the way from the lowest-level axioms. This time it takes us 330 steps, with proof graph:

CloudGet["https://wolfr.am/POgPyWJt"];
FindEquationalProof[congruent[line[a, l], line[b, c]], 
 Join[geometry, 
  {ForAll[{a, b, c}, 
    implies[equal[circle[a, b], circle[a, c]], 
     congruent[line[a, b], line[a, c]]]]}, 
  {equal[circle[a, b], circle[a, d]], 
   equal[circle[b, a], circle[b, d]], 
   and[between[a, d, e], between[b, d, f]], 
   and[equal[circle[b, c], circle[b, g]], 
    equal[circle[b, c], circle[b, h]]], 
   and[equal[circle[d, g], circle[d, k]], 
    equal[circle[d, g], circle[d, l]]]}], "ProofGraph"]

These graphs are conceptually similar to, but concretely rather different from, our “empirical metamathematics” graphs above. There are differences at the level of how interdependence of theorems is defined. But, more important, this graph is generated by automated theorem proving methods; the intermediate theorems (or lemmas) it involves are produced “on the fly” for the convenience of the computer, not because they help in any way to explain the proof to a human. In our empirical metamathematics on Euclid’s Elements, however, we’re dealing with the theorems that Euclid chose to define, and that have served as a basis for explaining his proofs to humans for more than two thousand years.

By the way, if our goal is simply to find out what’s true in geometry—rather than to write out step-by-step proofs—then we now know how to do that. Essentially it involves turning geometric assertions into algebraic ones—and then systematically solving the polynomial equations and inequalities that result. It can be computationally expensive, but in the Wolfram Language we now have one master function, CylindricalDecomposition, that ultimately does the job. And, yes, given Gödel's theorem, one might wonder whether this kind of finite procedure for solving any Euclid-style geometry problem was even possible. But it turns out that—unlike arithmetic, for which Gödel’s theorem was originally proved—Euclid-style geometry, like basic logic, is decidable, in the sense that there is ultimately a finite procedure for deciding whether any given statement is true or not. In principle, this procedure could be based on theorem proving from the axioms, but CylindricalDecomposition effectively leverages a tower of more sophisticated mathematics to provide a much more efficient approach.
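As a small illustration of this algebraic route (a sketch using built-in real-algebra functions, quite separate from the axiomatic machinery above), one can write a coordinate-geometry claim as a quantified statement about real numbers and decide it outright, and one can see the kind of cell decomposition such methods work with:

(* decide a simple coordinate-geometry statement over the reals:
   every point on the unit circle lies within distance 2 of the point (1, 0) *)
Resolve[ForAll[{x, y}, 
  Implies[x^2 + y^2 == 1, (x - 1)^2 + y^2 <= 4]], Reals]

(* decompose a semialgebraic region (inside the unit circle, above the line y == x)
   into cylindrical cells *)
CylindricalDecomposition[x^2 + y^2 < 1 && y > x, {x, y}]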

All Possible Theorems

From the axioms of geometry one can in principle derive an infinite number of true theorems—of which Euclid picked just 465 to include in his Elements. But why these theorems, and not others? Given a precise symbolic representation of geometry—as in the axioms above—one can just start enumerating true theorems.

One way to do this is to use a multiway system, with the axioms defining transformation rules that one can apply in all possible ways. In effect this is like constructing every possible proof, and seeing what gets proved. Needless to say, the network that gets produced quickly becomes extremely large—even if its structure is interesting for our attempt to find a “bulk theory of metamathematics”.
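To get a feel for what a single “apply the rules in all possible ways” step involves, here is a minimal sketch (an illustration only, using ReplaceList with just two rules read off from the logic axioms; the actual example below uses the MultiwayOperatorSystem resource function):

(* all top-level single-step results of applying two sample rules to x && y;
   here b is simply a global symbol rather than a freshly generated variable *)
Union[ReplaceList[And[x, y], 
  {And[a_, b_] :> And[b, a], a_ :> And[a, Or[b, ! b]]}]]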

Here’s an example of doing it, not for the full geometry axioms above, but for basic logic (which is actually part of the axiom system we’ve used for geometry). We can either start with expressions, or with statements. Here we start with the expression x∧y, and then progressively find all expressions equal to it. Here’s the first, rather pedantic step:

BooleanDisplay[expr_] := 
 With[{symbs = Union[Level[expr, {-1}]]}, 
  expr /. Thread[((# -> Take[Alphabet["Greek"], Length[#]]) &)[
     Select[symbs, StringContainsQ[SymbolName[#], "$"] &]]]]

Graph[ResourceFunction["MultiwayOperatorSystem"][
  {And[a_, b_] :> And[b, a],
   Or[a_, b_] :> Or[b, a],
   And[a_, Or[b_, Not[b_]]] :> a,
   Or[a_, And[b_, Not[b_]]] :> a,
   a_ :> Module[{b}, And[a, Or[b, Not[b]]]],
   a_ :> Module[{b}, Or[a, And[b, Not[b]]]],
   And[a_, Or[b_, c_]] :> Or[And[a, b], And[a, c]],
   Or[And[a_, b_], And[a_, c_]] :> And[a, Or[b, c]],
   Or[a_, And[b_, c_]] :> And[Or[a, b], Or[a, c]],
   And[Or[a_, b_], Or[a_, c_]] :> Or[a, And[b, c]]},
  And[x, y], 1, "StatesGraph",
  "StateRenderingFunction" :> (Inset[
      Framed[Style[TraditionalForm[BooleanDisplay[ToExpression[#2]]], 
        Black], 
       Background -> 
        Directive[Opacity[0.5], RGBColor[0.73925, 0.79406, 0.935]], 
       FrameStyle -> GrayLevel[.7], RoundingRadius -> 4, 
       FrameMargins -> 2], #] &)], 
 GraphLayout -> "SpringElectricalEmbedding", 
 EdgeStyle -> 
  ResourceFunction["WolframPhysicsProjectStyleData"]["StatesGraph"][
   "EdgeStyle"]]

And here’s the second step:

BooleanDisplay[expr_] := 
 With[{symbs = Union[Level[expr, {-1}]]}, 
  expr /. Thread[((# -> Take[Alphabet["Greek"], Length[#]]) &)[
     Select[symbs, StringContainsQ[SymbolName[#], "$"] &]]]]

{ResourceFunction["MultiwayOperatorSystem"][
  {And[a_, b_] :> And[b, a],
   Or[a_, b_] :> Or[b, a],
   And[a_, Or[b_, Not[b_]]] :> a,
   Or[a_, And[b_, Not[b_]]] :> a,
   a_ :> Module[{b}, And[a, Or[b, Not[b]]]],
   a_ :> Module[{b}, Or[a, And[b, Not[b]]]],
   And[a_, Or[b_, c_]] :> Or[And[a, b], And[a, c]],
   Or[And[a_, b_], And[a_, c_]] :> And[a, Or[b, c]],
   Or[a_, And[b_, c_]] :> And[Or[a, b], Or[a, c]],
   And[Or[a_, b_], Or[a_, c_]] :> Or[a, And[b, c]]},
  And[x, y], 2, "StatesGraphStructure",
  "StateRenderingFunction" :> (Inset[
      Style[TraditionalForm[BooleanDisplay[ToExpression[#2]]], 
       Black], #, 
      Background -> 
       ResourceFunction["WolframPhysicsProjectStyleData"][
         "StatesGraph"]["VertexStyle"]] &),
  EdgeStyle -> 
   ResourceFunction["WolframPhysicsProjectStyleData"]["StatesGraph"][
    "EdgeStyle"],
  ImageSize -> 300],
 Graph[ResourceFunction["MultiwayOperatorSystem"][
   {And[a_, b_] :> And[b, a],
    Or[a_, b_] :> Or[b, a],
    And[a_, Or[b_, Not[b_]]] :> a,
    Or[a_, And[b_, Not[b_]]] :> a,
    a_ :> Module[{b}, And[a, Or[b, Not[b]]]],
    a_ :> Module[{b}, Or[a, And[b, Not[b]]]],
    And[a_, Or[b_, c_]] :> Or[And[a, b], And[a, c]],
    Or[And[a_, b_], And[a_, c_]] :> And[a, Or[b, c]],
    Or[a_, And[b_, c_]] :> And[Or[a, b], Or[a, c]],
    And[Or[a_, b_], Or[a_, c_]] :> Or[a, And[b, c]]},
   And[x, y], 2, "StatesGraph",
   "StateRenderingFunction" :> (Inset[
       Framed[Style[TraditionalForm[BooleanDisplay[ToExpression[#2]]], 
         Black], 
        Background -> 
         Directive[Opacity[0.5], RGBColor[0.73925, 0.79406, 0.935]], 
        FrameStyle -> GrayLevel[.7], RoundingRadius -> 4, 
        FrameMargins -> 2], #] &),
   PlotRange -> {{5.2, 7.3}, {4, 5.5}}],
  ImageSize -> 550, 
  EdgeStyle -> 
   Directive[
    ResourceFunction["WolframPhysicsProjectStyleData"]["StatesGraph"][
     "EdgeStyle"], Arrowheads[Medium]]]}

Every path in this graph is a proof that its endpoint expressions are equal. And while eventually this approach will give us every possible theorem (in this case about equalities involving x∧y), it’ll obviously take a while, generating huge numbers of long and uninteresting results on its way to anything interesting.

As a different approach, we can consider just enumerating short possible statements, then picking out ones that we determine are true. In principle we could determine truth by explicitly proving theorems using the axioms (and, yes, if there was undecidability we wouldn’t always be able to do this). But in practice for the case of basic logic that we’re using as an example here, we can basically just explicitly construct truth tables to find out what’s true and what’s not.
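As a minimal sketch of the truth-table idea (separate from the enumeration code used below), TautologyQ checks whether a candidate equivalence holds for every assignment of truth values:

(* a de Morgan law holds for all truth assignments ... *)
TautologyQ[Equivalent[! (a && b), ! a || ! b], {a, b}]

(* ... whereas a == b, read as an equivalence, does not *)
TautologyQ[Equivalent[a, b], {a, b}]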

Here are some statements in logic, sorted in increasing order of complexity (as measured by depth and number of symbols):

CloudGet["https://wolfr.am/PO7vasDF"];
(LogicFormat /@ (all43 = 
     Take[Select[FindAllAON[4, 3], LowestQ[#, {a, b, c}] &], 100])) //
  TraditionalForm[Style[#, 14]] &

Many (like a=b) are very obviously not true, at least not for all possible values of each variable. But—essentially by using truth tables—we can readily pick out ones that are always true:

CloudGet["https://wolfr.am/PO7vasDF"]; (LogicFormat /@ (all43 = 
     Take[Select[FindAllAON[4, 3], LowestQ[#, {a, b, c}] &], 100])) //
  TraditionalForm[
   Style[#, 
    14]] &; (LogicFormat /@ (If[MemberQ[data53, #], 
       Framed[#, 
        Background -> 
         Lighter[RGBColor[1., 0.8549019607843137, 0.59], .6], 
        FrameStyle -> RGBColor["#efcabd"], RoundingRadius -> 3, 
        FrameMargins -> Tiny], 
       Framed[#, FrameMargins -> Tiny, FrameStyle -> None]] & /@ 
     all43)) // TraditionalForm[Style[#, 14]] &

OK, so now we can get a list of true theorems:

CloudGet["https://wolfr.am/PO7vasDF"]; 
Framed[LogicFormat[#], 
    Background -> Lighter[RGBColor[1., 0.8549019607843137, 0.59], .6],
     FrameStyle -> RGBColor["#efcabd"], RoundingRadius -> 3, 
    FrameMargins -> None] & /@ Take[data53, 60] // 
 TraditionalForm[Style[#, 14]] &

Some are “interesting”. Others seem repetitive, overly complicated, or otherwise not terribly interesting. But if we want to “channel Euclid” we somehow have to decide which are the interesting theorems that we’re going to write down. And although Euclid himself didn’t explicitly discuss logic, we can look at textbooks of logic from the last couple of centuries—and we find that there’s a very consistent set of theorems that they end up picking out from the list, and giving names to:

Named theorems

One might assume that these named theorems were just the result of historical convention. But when I was writing A New Kind of Science I discovered something quite surprising. With all the theorems written out in “order of complexity”, I tried seeing which theorems I could prove just from theorems earlier in the list. Many were easy to prove. But some simply couldn’t be proved. And it turned out that these were essentially precisely the “named theorems”:

CloudGet["https://wolfr.am/PO7vasDF"];

LogicFormat /@ (interesting = First /@ {
       {a == Wedge[a, a], "idempotent law for and"},
       {a == Vee[a, a], "idempotent law for or"},
       {Wedge[a, b] == Wedge[b, a], "commutativity for and"},
       {Vee[a, b] == Vee[b, a], "commutativity for or"},
       {a == Square[Square[a]], "law of double negation"},
       {Wedge[Square[a], a] == Wedge[Square[b], b], 
        "definition of false (law of noncontradiction)"},
       {Vee[Square[a], a] == Vee[Square[b], b], 
        "definition of true (law of excluded middle)"},
       {Square[Vee[a, b]] == Wedge[Square[a], Square[b]], 
        "de Morgan law"},
       {Square[Wedge[a, b]] == Vee[Square[a], Square[b]], 
        "de Morgan law"},
       {a == Wedge[a, Vee[a, b]], "absorption law"},
       {a == Vee[a, Wedge[a, b]], "absorption law"},
       {Wedge[Wedge[a, b], c] == Wedge[a, Wedge[b, c]], 
        "associativity of and"},
       {Vee[Vee[a, b], c] == Vee[a, Vee[b, c]], 
        "associativity of or"}}) // (TraditionalForm[
     Style[#, 15]] &);

provable = ParallelTable[{data53[[i]], 
    FindEquationalProof[data53[[i]], 
     ForAll[{a, b, c}, #] & /@ Take[data53, i - 1]]}, {i, 2, 100}];

If[Head[Last[#]] === Failure, 
    Framed[TraditionalForm[LogicFormat[First[#]]], 
     Background -> 
      RGBColor[1., 0.7803921568627451, 0.6823529411764706], 
     FrameStyle -> RGBColor["#f7c5b2"], RoundingRadius -> 3, 
     FrameMargins -> None], 
    Graph[Last[#]["ProofGraph"], VertexLabels -> None, 
     ImageSize -> {Automatic, 50}]] & /@ Take[provable, 70] // 
 TraditionalForm

In other words, the “named theorems” are basically the simplest statements of new facts about logic that can’t be established from “simpler facts”. Eventually, as one goes through the list of theorems, one will have accumulated enough of them to serve as full axioms for logic, so that all subsequent theorems can be proved from “existing facts”.

Now of course the setup we’ve just used relies on the idea that one’s separately got a list of true theorems. To do something more like Euclid, we’d have to pick certain theorems to serve as axioms, then derive all others from these.
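As a rough sketch of that Euclid-like approach (and only a sketch: candidateAxioms is a name introduced here, and taking the first five statements from data53 as axioms is an arbitrary choice), one could pick a few of the enumerated true statements as candidate axioms and then use FindEquationalProof to check which of the remaining statements already follow from them:

(* pick the first few enumerated statements as candidate axioms *)
candidateAxioms = ForAll[{a, b, c}, #] & /@ Take[data53, 5];

(* keep the statements among the next ones that are provable from the candidates alone *)
Select[Take[data53, {6, 30}], 
 Head[FindEquationalProof[#, candidateAxioms]] =!= Failure &]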

Back in 2000 I figured out the very simplest possible axiom system for logic, written in terms of Nand; it consists of just the single axiom:

AxiomaticTheory
&#10005
AxiomaticTheory[
   "WolframAxioms"] /. {\[FormalA] -> a, \[FormalB] -> 
    b, \[FormalC] -> c} // (TraditionalForm[Style[#, 18]] &)

So now writing And, Or and Not in terms of Nand according to

LogicFormat
&#10005
LogicFormat /@ {Square[a] == a\[CenterDot]a, 
   Wedge[a, b] == (a\[CenterDot]b)\[CenterDot](a\[CenterDot]b), 
   Vee[a, b] == (a\[CenterDot]a)\[CenterDot](b\[CenterDot]b)} // \
(TraditionalForm[Style[#, 18]] &)

we can, for example, derive the notable theorems of logic from my axiom. FindEquationalProof gives automated proofs of these theorems, though most of them involve quite a few steps (the — indicates a theorem that is trivially true after substituting the forms for And, Or and Not):

notableTheorems
&#10005
CloudGet["https://wolfr.am/PKWTJ8gE"];
Row[Grid[#, Frame -> All, 
     Background -> {{RGBColor[1., 0.8549019607843137, 0.59], None}, 
       None}] & /@ 
   Partition[
    Transpose@{TraditionalForm[LogicFormat[#]] & /@ (Last /@ 
         Flatten[Values[
           notableTheorems /. {OverBar -> Square, CirclePlus -> Vee, 
             CircleTimes -> Wedge}]]), {54, 54, 103, 102, 54, 95, 92, 
       132, 143, 91, 328, 274, 958, 1502, 
       Style["\[LongDash]", LightGray], 56, 131, 130, 120, 103}}, 5], 
  Spacer[2]] // TraditionalForm

The longer cases here involve first proving the lemma a·b = b·a, which takes 102 steps. With this lemma included as an axiom, the minimal axiom system (as I also found in 2000) is:

AxiomaticTheory
&#10005
AxiomaticTheory[
   "WolframCommutativeAxioms"] /. {\[FormalA] -> a, \[FormalB] -> 
    b, \[FormalC] -> c} // (TraditionalForm[Style[#, 16]] &)

And with this axiom system FindEquationalProof succeeds in finding shorter proofs for the notable theorems of logic, even though now the definitions for And, Or and Not are just treated as theorems:

fullcax
&#10005
CloudGet["https://wolfr.am/PKXzCFkk"];

#["ProofLength"] & /@ notableProofs

Row[Grid[#, Frame -> All, 
     Background -> {{RGBColor[1., 0.8549019607843137, 0.59], None}, 
       None}] & /@ 
   Partition[
    Transpose@{TraditionalForm[LogicFormat[#]] & /@ (Last /@ 
         Flatten[Values[
           notableTheorems /. {OverBar -> Square, CirclePlus -> Vee, 
             CircleTimes -> Wedge}]]), {21, 15, 8, 9, 17, 130, 119, 
       9, 28, 43, 32, 26, 20, 249, 239, 89, 129, 129, 328, 338}}, 5], 
   Spacer[2]] // TraditionalForm

Actually looking at these proofs is not terribly illuminating; they certainly don’t have the same kind of “explanatory feel” as Euclid. But combining the graphs for all these proofs is more interesting, because it shows us the common lemmas that were used in these proofs, and effectively defines a network of interdependencies between theorems:

notableTheorems
&#10005
CloudGet["https://wolfr.am/PKXzCFkk"];
					Show[Graph[dependencyNetworkSimplified, 
  GraphLayout -> "LayeredDigraphEmbedding", 
  EdgeStyle -> GrayLevel[.5, .5], 
  VertexStyle -> (# -> 
       Which[MemberQ[conclusion, #], 
        Directive[RGBColor[221/255, 17/255, 0], EdgeForm[]], 
        MemberQ[standingPropositions, Sort[#]], Hue[
        0.8238095238095239, 0.4, 0.9647058823529412], 
        MemberQ[viaLemmaPropositions, Sort[#]], Hue[
        0.8238095238095239, 0.9, 0.9647058823529412], 
        MemberQ[axiomsList, #], Hue[0.11309523809523807`, 0.84, 1.], 
        MemberQ[lemmas, #], {EdgeForm[Opacity[.75]], Opacity[.5]}] & /@
      VertexList[dependencyNetwork]), 
  VertexSize -> (# -> 
       Which[MemberQ[conclusion, #], 2, 
        MemberQ[standingPropositions, Sort[#]], .4 Sqrt[LeafCount[#]],
         MemberQ[viaLemmaPropositions, 
         Sort[#]], .4 Sqrt[LeafCount[#]], 
        MemberQ[axiomsList, #], .4 Sqrt[LeafCount[#]], 
        MemberQ[lemmas, #], .4 Sqrt[LeafCount[#]]] & /@ 
     VertexList[dependencyNetwork]), 
  VertexLabels -> (# -> 
       Which[MemberQ[conclusion, #], None, 
        MemberQ[standingPropositions, Sort[#]], LogicFormat[#], 
        MemberQ[viaLemmaPropositions, Sort[#]], 
        LogicFormat[# /. {x1 -> a, x2 -> b, x3 -> c}], 
        MemberQ[axiomsList, #], 
        LogicFormat[# /. {x1 -> a, x2 -> b, x3 -> c}], 
        MemberQ[lemmas, #], None] & /@ 
     VertexList[dependencyNetwork]), AspectRatio -> 1/2], 
 Editable -> True]	

There are 361 lemmas (i.e. automatically generated intermediate theorems) here. It’s a fair number, given that we’re only proving 20 theorems—but it’s definitely much less than the total of 1978 that would be involved in proving each of the theorems separately.

In our graph here—like in our Euclid theorem-dependency graphs above—the axioms are shown (in yellow) at the top. The “notable theorems” that we’re proving are shown in pink. But the structure of the graph is a little different from our earlier Euclid theorem-dependency graphs, and this alternative layout makes it clearer:

notableTheorems
&#10005
CloudGet["https://wolfr.am/PKXzCFkk"];
					Show[Graph[dependencyNetworkSimplified, 
  GraphLayout -> "SpringElectricalEmbedding", 
  EdgeStyle -> GrayLevel[.5, .5], 
  VertexStyle -> (# -> 
       Which[MemberQ[conclusion, #], 
        Directive[RGBColor[221/255, 17/255, 0], EdgeForm[]], 
        MemberQ[standingPropositions, Sort[#]], Hue[
        0.8238095238095239, 0.4, 0.9647058823529412], 
        MemberQ[viaLemmaPropositions, Sort[#]], Hue[
        0.8238095238095239, 0.9, 0.9647058823529412], 
        MemberQ[axiomsList, #], Hue[0.11309523809523807`, 0.84, 1.], 
        MemberQ[lemmas, #], {EdgeForm[Opacity[.75]], Opacity[.3]}] & /@
      VertexList[dependencyNetwork]), 
  VertexSize -> (# -> 
       Which[MemberQ[conclusion, #], 2, 
        MemberQ[standingPropositions, Sort[#]], .4 Sqrt[LeafCount[#]],
         MemberQ[viaLemmaPropositions, 
         Sort[#]], .4 Sqrt[LeafCount[#]], 
        MemberQ[axiomsList, #], .4 Sqrt[LeafCount[#]], 
        MemberQ[lemmas, #], .4 Sqrt[LeafCount[#]]] & /@ 
     VertexList[dependencyNetwork]), 
  VertexLabels -> (# -> 
       Which[MemberQ[conclusion, #], None, 
        MemberQ[standingPropositions, Sort[#]], LogicFormat[#], 
        MemberQ[viaLemmaPropositions, Sort[#]], 
        LogicFormat[# /. {x1 -> a, x2 -> b, x3 -> c}], 
        MemberQ[axiomsList, #], 
        LogicFormat[# /. {x1 -> a, x2 -> b, x3 -> c}], 
        MemberQ[lemmas, #], None] & /@ 
     VertexList[dependencyNetwork]), AspectRatio -> 1], 
 Editable -> True]

In Euclid, a given theorem is proved on the basis of other theorems, and ultimately on the basis of axioms. But here the automated theorem-proving process creates lemmas that ultimately allow one to show that the theorems one’s trying to prove are equivalent to “true” (i.e. to a tautology)—shown as a red node.

We can ask other questions, such as how long the lemmas are. Here are the distributions of lengths of the final notable theorems, and of the intermediate lemmas used to prove them:

{Labeled
&#10005
CloudGet["https://wolfr.am/PKXzCFkk"];
					{Labeled[Histogram[LeafCount /@ propositions, {1}, 
   PlotRange -> {{0, 25}, {0, 6}}, Frame -> True], 
  Style["notable theorems", FontFamily -> "Source Sans Pro", 
   FontSize -> 12]], 
 Labeled[Histogram[LeafCount /@ lemmas, {1}, 
   PlotRange -> {{0, 25}, {0, Automatic}}, Frame -> True], 
  Style["intermediate lemmas", FontFamily -> "Source Sans Pro", 
   FontSize -> 12]]}

We get something slightly more in the spirit of Euclid if we elide the lemmas, and just find the implied effective dependency graph between notable theorems:

dependencies
&#10005
CloudGet["https://wolfr.am/PKXzCFkk"]; dependencies = {};
Module[{proofObject = #}, 
   Module[{theorem = #}, 
      If[MemberQ[
        If[Length[#] >= 1, 
           Sort[#], #] & /@ (Normal[
            proofObject["ProofDataset"][[All, 1]][[
             Values]]] /. {x1 -> \[FormalA], x2 -> \[FormalB], 
            x3 -> \[FormalC]}), theorem], 
       dependencies = 
        Append[dependencies, 
         DirectedEdge[theorem, 
          Last[proofObject["Theorem"]]]]]] & /@ (Last /@ 
      toProve)] & /@ notableProofs; SimpleGraph[dependencies, 
 AspectRatio -> 1/2, 
 VertexLabels -> (# -> LogicFormat[#] & /@ VertexList[dependencies]), 
 VertexStyle -> Hue[0.8238095238095239, 0.4, 0.9647058823529412], 
 VertexSize -> .4, 
 EdgeStyle -> Directive[Arrowheads[.01], GrayLevel[.5, .5]]]

Transitive reduction then gives:

TransitiveReductionGraph
&#10005
CloudGet["https://wolfr.am/PKXzCFkk"]; dependencies = {};
Module[{proofObject = #}, 
   Module[{theorem = #}, 
      If[MemberQ[
        If[Length[#] >= 1, 
           Sort[#], #] & /@ (Normal[
            proofObject["ProofDataset"][[All, 1]][[
             Values]]] /. {x1 -> \[FormalA], x2 -> \[FormalB], 
            x3 -> \[FormalC]}), theorem], 
       dependencies = 
        Append[dependencies, 
         DirectedEdge[theorem, 
          Last[proofObject["Theorem"]]]]]] & /@ (Last /@ 
      toProve)] & /@ notableProofs; \
TransitiveReductionGraph[dependencies,  
 VertexLabels -> (# -> LogicFormat[#] & /@ VertexList[dependencies]), 
 VertexStyle -> Hue[0.8238095238095239, 0.4, 0.9647058823529412], 
 VertexSize -> .15, AspectRatio -> 1/3, 
 EdgeStyle -> Directive[Arrowheads[.02], GrayLevel[.5, .5]]]

By omitting intermediate lemmas, we’re in a sense just getting a shadow of the dependencies of the notable theorems, in the “environment” defined by our particular choice of axioms. But with this setup, it’s interesting to see the distributive law be the “hardest theorem”—kind of the metamathematical analog of Euclid’s 13.18 about the Platonic solids.

OK, but what we’re doing so far with logic is still fundamentally a bit different from how most of Euclid works. Because what Euclid typically does is to say something like “imagine such-and-such a geometrical setup; then the following theorem will be true about it”. And the analog of that for logic would be to take axioms of logic, then append some logical assertion, and ask if with the axioms and this assertion some particular statement is true. In other words, there are some statements—like the axioms—that will be true in “pure logic”, but there are more statements that will be true with particular setups (or, in the case of logic, particular possible values for variables).

For example, in “pure logic” a∨b = b∨b is not necessarily true (i.e. it is not a tautology). But if we assert that a = (a∧b) is true, then this implies the following possible choices for a and b

SatisfiabilityInstances
&#10005
SatisfiabilityInstances[a == (a \[And] b), {a, b}, All]

and in all these cases a∨b = b∨b is true. So, in a Euclid tradition, we could say “imagine a setup where a = (a∧b); then we can prove from the axioms of logic the theorem that a∨b = b∨b”.

Above we looked at which statements in logic are true for all values of variables:

LogicFormat
&#10005
CloudGet["https://wolfr.am/PO7vasDF"]; (LogicFormat /@ (all43 = 
     Take[Select[FindAllAON[4, 3], LowestQ[#, {a, b, c}] &], 100])) //
  TraditionalForm[Style[#, 14]] &;
(LogicFormat /@ (If[MemberQ[data53, #], 
       Framed[#, 
        Background -> 
         Lighter[RGBColor[1., 0.8549019607843137, 0.59], .6], 
        FrameStyle -> RGBColor["#efcabd"], RoundingRadius -> 3, 
        FrameMargins -> Tiny], 
       Framed[#, FrameMargins -> Tiny, FrameStyle -> None]] & /@ 
     Take[all43, 50])) // TraditionalForm[Style[#, 14]] &

Now let’s look at the ones that aren’t always true. If we assume that some particular one of these statements is true, we can see which other statements it implies are true:

CheckTrue
&#10005
CloudGet["https://wolfr.am/PO7vasDF"];

(LogicFormat /@ (all43 = 
      Take[Select[FindAllAON[4, 3], LowestQ[#, {a, b, c}] &], 
       100])) // TraditionalForm[Style[#, 14]] &;

CheckTrue[expr_, reps_] := 
  TrueQ[Length[reps] > 0 && AllTrue[reps, (expr /. #) &]]

FindSats[expr_] := 
  With[{vars = Union[Level[expr, {-1}]]}, 
   Thread /@ ((vars -> # &) /@ 
      SatisfiabilityInstances[expr, vars, All])]

uxx = Cases[(Complement[Take[all43, 30], data53] /. {Vee -> Or, 
       Wedge -> And, Square -> Not}), _Equal];

TraditionalForm[
 Grid[Function[u, {u, Select[uxx, CheckTrue[#, FindSats[u]] &]}] /@ 
   Take[uxx, 18], Alignment -> Left, 
  Dividers -> {2 -> GrayLevel[0.7], (# -> GrayLevel[0.7]) & /@ 
     Range[2, 18]}, Frame -> True]]

Or on a larger scale, with a black dot when one statement implies another:

all43p = Select
&#10005

CloudGet["https://wolfr.am/PO7vasDF"];

CheckTrue[expr_, reps_] := 
  TrueQ[Length[reps] > 0 && AllTrue[reps, (expr /. #) &]]

FindSats[expr_] := 
  With[{vars = Union[Level[expr, {-1}]]}, 
   Thread /@ ((vars -> # &) /@ 
      SatisfiabilityInstances[expr, vars, All])]

all43p = Select[FindAllAON[4, 3], LowestQ[#, {a, b, c}] &];

uxxp = Cases[(Complement[all43p, data53] /. {Vee -> Or, Wedge -> And, 
       Square -> Not}), _Equal];

uxxpr = FindSats /@ uxxp;

ArrayPlot[Boole[Outer[CheckTrue, uxxp, uxxpr, 1]]]

For each of these theorems we can in principle construct a proof, using the axioms:

FindEquationalProof
&#10005
FindEquationalProof[(a\[CirclePlus]b) == (b\[CirclePlus]b), 
 Append[AxiomaticTheory["BooleanAxioms"], 
  a == (a\[CircleTimes]b)], "ProofGraph"]

And now we could go through and find out which theorems are useful in proving other theorems—and in principle this would allow us to build up a theorem dependency network. But there are undoubtedly many ways to do this, and so we’d need additional criteria to find ones that have whatever attributes would make us say “that might have been how someone like Euclid would have done it”.
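Here is one minimal way such a network could be sketched (not the method used for the results above): for every ordered pair of statements, append the first to the axioms, as in the FindEquationalProof example just shown, and record an edge whenever the second then becomes provable. The function name conditionalDependencies is introduced here purely for illustration, the statements are assumed to be written with the same \[CirclePlus], \[CircleTimes], OverBar operators used by “BooleanAxioms”, and for more than a handful of statements this brute-force search would be quite slow:

(* record statement1 \[DirectedEdge] statement2 whenever assuming statement1
   lets the Boolean axioms prove statement2 *)
conditionalDependencies[statements_List] := 
 Flatten[Table[
   If[i != j && 
     Head[FindEquationalProof[statements[[j]], 
        Append[AxiomaticTheory["BooleanAxioms"], 
         statements[[i]]]]] =!= Failure, 
    DirectedEdge[statements[[i]], statements[[j]]], Nothing], 
   {i, Length[statements]}, {j, Length[statements]}]]

The resulting edges could then be rendered with Graph, or pruned with TransitiveReductionGraph, to give one candidate dependency network.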

OK, so could one look at geometry the same way? Basically, yes. Using the formalization we had above in terms of line, between, congruent, etc. we can again start by just enumerating possible statements. Unlike for logic, many of them won’t even make “structural sense”; for example they might contain line[congruent[...],...], but it makes no sense to have a line whose endpoint is a truth value. But we can certainly get a list of “structurally meaningful” statements.
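As a toy illustration of the kind of structural filtering involved (the signature assignments here are hypothetical, made up just for this sketch, and bare symbols are simply treated as points), one could assign each construct a type and keep only expressions that come out as truth values:

(* hypothetical type signatures: argument types -> result type *)
signatures = <|line -> ({"Point", "Point"} -> "Line"), 
   between -> ({"Point", "Point", "Point"} -> "Boolean"), 
   congruent -> ({"Line", "Line"} -> "Boolean")|>;

typeOf[_Symbol] = "Point";  (* treat bare variables as points *)
typeOf[e_] := With[{sig = signatures[Head[e]]}, 
   If[! MissingQ[sig] && (typeOf /@ (List @@ e)) === First[sig], 
    Last[sig], $Failed]];

structurallyMeaningfulQ[e_] := typeOf[e] === "Boolean"

structurallyMeaningfulQ[congruent[line[a, b], line[b, c]]]  (* True *)
structurallyMeaningfulQ[line[congruent[a, b], c]]           (* False *)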

And then we can ask which are “tautologically true”—though it’s in practice considerably harder to do this than for logic (the best known methods involve all sorts of elaborate algebraic computations, which Mathematica can certainly do, but which quickly become quite unwieldy). And after that, we can proceed like Euclid, and start saying “assert this, then you can prove this”. And, yes, it’s nice that after 2000+ years, we can finally imagine automating the process of producing generalizations of Euclid’s Elements. Though this just makes it more obvious that part of what Euclid did was in a sense a matter of art—picking in some kind of aesthetic way which possible sequence of theorems would best “tell his story” of geometry.

Math beyond Euclid

We’ve looked here at some of the empirical metamathematics of what Euclid did on geometry more than 2000 years ago. But what about more recent mathematics, and all those other areas of mathematics that have now been studied? In the history of mathematics, there have been perhaps 5 million research papers published, as well as probably hundreds of thousands of textbooks (though few quite as systematic as Euclid).

And, yes, in modern times almost all mathematics that’s published is on the web in some form. A few years ago we scraped arXiv and identified about 2 million things described as theorems there (the most popular being the central limit theorem, the implicit function theorem and Fubini’s theorem); we also scraped as much as we could of the visible web and found about 30 million theorems there. No doubt many were duplicates (though it’s hard, and in principle even undecidable, to determine which ones are). But it’s a reasonable estimate that there are a few million distinct theorems for which proofs have been published in the history of human mathematics.

It’s a remarkable piece of encapsulated intellectual achievement—perhaps the largest coherent such one produced by our species. And I’ve long been interested in seeing just what it would take to make it computable, and to bring it into the whole computational knowledge framework we have in the Wolfram Language. A few years ago I hoped that we could mobilize the mathematics community to help make this happen. But formalization is hard work, and it’s not at the center of what most mathematicians aspire to. Still, we’ve at least been slowly working—much as we have for Euclid-style geometry—to define the elements of computational language needed to represent theorems in various areas of mathematics.

For example, in the area of point-set topology, we have under development things like

Entity
Entity
&#10005
If[PacletFind["PureMath"] === {}, 
 PacletInstall[First[PacletFindRemote["PureMath"]]]]
Needs["PureMath`"]
Entity["TopologyConcept", "IsHausdorff"]["Output"] // InputForm

which in traditional mathematical notation becomes:

TraditionalForm
&#10005
TraditionalForm[%]

So far we have encoded in computable form 742 “topology concepts”, and 1687 theorems about them. Here are the connections recorded between concepts (dropping the concept of “topological spaces”, to which a third of all concepts are connected, and labeling concepts with high betweenness centrality):

edges
&#10005
If[PacletFind["PureMath"] === {}, 
 PacletInstall[First[PacletFindRemote["PureMath"]]]]
Needs["PureMath`"]

edges[type_] := 
 Join @@ KeyValueMap[Thread @* DirectedEdge, 
   EntityValue[type, "ReferencedEntities", "EntityAssociation"]]

graph[args___] := 
 Graph[args, VertexSize -> 0.5, 
  EdgeStyle -> Directive[GrayLevel[.5, .5], Arrowheads[.02]]]

select[g_Graph, centrality_, crit_] := 
 Graph[g, VertexStyle -> Orange, VertexSize -> 2, 
  VertexLabels -> 
   MapThread[
    If[crit[#2] && StringLength[ToString[CommonName[#]]] < 20, 
      # -> Placed[Style[CommonName[#], Background -> White], Left], 
      # -> None] &, {VertexList[g], centrality[g]}]]

Show[First[
  WeaklyConnectedGraphComponents[
   select[VertexDelete[graph[edges["TopologyConcept"]], 
     Entity["Category", "Top"]], BetweennessCentrality, 
    GreaterThan[0]]]], Editable -> True]

And here is the graph of what each theorem references in its description:

First
&#10005
If[PacletFind["PureMath"] === {}, 
 PacletInstall[First[PacletFindRemote["PureMath"]]]]
Needs["PureMath`"]

edges[type_] := 
 Join @@ KeyValueMap[Thread @* DirectedEdge, 
   EntityValue[type, "ReferencedEntities", "EntityAssociation"]]

graph[args___] := 
 Graph[args, VertexSize -> 0.5, 
  EdgeStyle -> Directive[GrayLevel[.5, .5], Arrowheads[.02]]]

select[g_Graph, centrality_, crit_] := 
 Graph[g, VertexStyle -> Orange, VertexSize -> 2, 
  VertexLabels -> 
   MapThread[
    If[crit[#2] && StringLength[ToString[CommonName[#]]] < 100, 
      # -> Placed[Style[CommonName[#], Background -> White], Left], 
      # -> None] &, {VertexList[g], centrality[g]}]]

Show[First[
  WeaklyConnectedGraphComponents[
   select[VertexDelete[graph[edges["TopologyTheorem"]], 
     Entity["Category", "Top"]], 
    BetweennessCentrality @* UndirectedGraph, # < 0.0001 &]]], 
 Editable -> True]

We haven’t encoded proofs for these theorems, so we can’t yet make the kind of theorem dependency graph that we did for Euclid. But we do have the dependency graph for 76 properties of topological spaces:

With
&#10005
If[PacletFind["PureMath"] === {}, 
 PacletInstall[First[PacletFindRemote["PureMath"]]]]
Needs["PureMath`"]

aggregate[axioms_List] := 
  Union[(#[[1, 1]] \[Implies] Union@Flatten[#[[All, 2]]]) & /@ 
    GatherBy[axioms, First]];

propertyOntology = 
  Map[EntityValue[#, "PropertyRelations"] &, {"TopologyConcept", 
      "TopologyTheorem"}] // Flatten // DeleteMissing;

preprocessedOntology = 
  aggregate@
   Fold[Union@Flatten@Replace[#1, #2, 1] &, propertyOntology,
    {a_ \[Equivalent] b_ :> {a \[Implies] b, b \[Implies] a},
     a_ \[Implies] e_Equivalent :> (a \[Implies] # &) /@ 
       Subsets[e, {2}],
     a_ \[Implies] (b_ \[Equivalent] c_) :> {a && b \[Implies] c, 
       a && c \[Implies] b},
     d_ && (a_ || b_) \[Implies] c_ :> {d && a \[Implies] c, 
       d && b \[Implies] c},
     a_ \[Implies] b_ :> {a \[Implies] b, 
       LogicalExpand[! b] \[Implies] LogicalExpand[! a]},
     d_Or \[Implies] c_ :> (# \[Implies] c &) /@ List @@ d,
     a_ \[Implies] d_Or :> 
      With[{s = List @@ d}, (Implies[
          a && And @@ (Not /@ Complement[s, {#}]), #] &) /@ s],
     a_ \[Implies] b_ :> 
      Implies[Union@Flatten[{a} /. And -> List], 
       Union@Flatten[{b} /. And -> List]]}];

nodes = CanonicalName /@ 
   EntityProperties@
    EntityPropertyClass["TopologicalSpace", "TopologicalProperties"];

deduce[known_List] := 
  If[SatisfiableQ[And @@ known], 
   Union[known, 
    Flatten[Cases[preprocessedOntology, 
       x_ /; SubsetQ[known, First@x]][[All, 2]]]], {False}];

edges = Cases[
   Union @@ (Thread[# \[DirectedEdge] 
         FixedPoint[deduce, {#}]] & /@ nodes), 
   a_String \[DirectedEdge] b_String /; a != b];

Show[With[{g = 
    TransitiveReductionGraph@
     Graph[nodes, edges, 
      GraphLayout -> "LayeredDigraphEmbedding"]}, 
  Graph[g, 
   VertexLabels -> (# -> 
        CommonName[EntityProperty["TopologicalSpace", #]] & /@ 
      VertexList[g]), EdgeStyle -> GrayLevel[0.5, 0.5], 
   AspectRatio -> 0.8, VertexStyle -> Orange]], Editable -> True]

The longest path here (along with a similar one starting with ) is 14 steps:

Row
&#10005
If[PacletFind["PureMath"] === {}, 
 PacletInstall[First[PacletFindRemote["PureMath"]]]]
Needs["PureMath`"]

Row[Riffle[
  EntityProperty["TopologicalSpace", #] & /@ {"IsSurface", 
    "IsManifold", "IsCompletelyMetrizable", "IsMetrizable", 
    "IsPerfectlyNormal", "IsCompletelyNormal", "IsNormal", 
    "IsCompletelyRegular", "IsRegular", "IsSemiregular", 
    "IsHausdorff", "IsLocallyHausdorff", "IsT1", "IsT0"}, 
  " \[RightArrow] "]]

(And, yes, this isn’t particularly profound; it’s just an indication of what it looks like to make specific definitions in topology computable.)

So far, what we’ve discussed is being able to represent pure mathematical ideas and results in a high-level computable way, understandable to both humans and computers. But what if we want to just formalize everything, from the ground up, explicitly deriving and validating every theorem from the lowest-level foundations? Over the past few decades there have been a number of large-scale projects—like Mizar, Coq, Isabelle, HOL, Metamath, Lean—that have tried to do this (nowadays often in connection with creating “proof assistants”).

Ultimately each project defines a certain “machine code” for mathematics. And yes, even though people might think that “mathematics is a universal language”, if one’s really going to give full, precise, formal specifications there are all sorts of choices to be made. Should things be based on set theory, type theory, higher-order logic, calculus of constructions, etc.? Should the law of excluded middle be assumed? The axiom of choice? What if one’s axiomatic structure seems great, but implies a few silly results, like 1/0 = 0? There’s no perfect solution, but each of these projects has made a certain set of choices.

And the good news here is that for our purposes in doing large-scale empirical metamathematics—as in doing mathematics in the way mathematicians usually do it—it doesn’t seem like the choices will matter much. But what’s important for us is that these projects have accumulated tens of thousands of theorems (well, OK, some are “throwaway lemmas” or simple rearrangements), and that starting from axioms (or what amount to axioms), they’ve reached decently far into quite a few areas of mathematics.

Looking at them is a bit of a different experience from looking at Euclid. While the Elements has the feel of a “narrative textbook” (albeit from a different age), formalized mathematics projects tend to seem more like software codebases, with their theorem dependency graphs being more like function call graphs. But they still provide fascinating metamathematical corpuses, and there's undoubtedly lots about empirical metamathematics that one can learn from them.

Here I’m going to look at two examples: the Lean mathlib collection, which includes about 36,000 theorems (and 16,000 definitions) and the Metamath set.mm (“set theory”) collection, which has about 44,000 theorems (and 1500 definitions).

To get a sense of what’s in these collections, we can start by drawing interdependence graphs for the theorems they contain in different areas of mathematics. Just like for Euclid, we make the size of each node represent the number of theorems in a particular area, and the thickness of each edge represent the fraction of theorems from one area that directly reference another in their proof.
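In outline, that aggregation works as follows. This is a minimal sketch with hypothetical inputs: theoremArea is assumed to map each theorem to its area, and theoremEdges to list the theorem-level dependencies as DirectedEdge pairs; neither name comes from the actual Lean or Metamath imports below.

(* tally theorems per area, and cross-area dependencies (dropping self-loops) *)
areaCounts = Counts[Values[theoremArea]];
areaEdgeTallies = Counts[DeleteCases[
    DirectedEdge[theoremArea[First[#]], theoremArea[Last[#]]] & /@ 
     theoremEdges, DirectedEdge[x_, x_]]];

(* node size ~ theorem count; edge thickness ~ cross-references from the
   source area, normalized by that area's theorem count *)
Graph[Keys[areaCounts], Keys[areaEdgeTallies], 
 VertexSize -> KeyValueMap[#1 -> Sqrt[#2]/50 &, areaCounts], 
 EdgeStyle -> 
  KeyValueMap[#1 -> 
     AbsoluteThickness[10. #2/areaCounts[First[#1]]] &, 
   areaEdgeTallies], VertexLabels -> Automatic]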

Leaving out theorems that effectively just do structural manipulation rather than representing mathematical content (as well as “self-loop” connections within a single domain), here’s the interdependence graph for Lean:

leanAssoc
&#10005
leanAssoc = CloudGet["https://wolfr.am/PL39QRbE"];
leanGraph = CloudGet["https://wolfr.am/PL3LfaQ4"];
leanDomains = Union[Values[leanAssoc]];
leanInfrastructure = {"init", "system", "tactic", "data", "meta", 
   "control", "computability"};

leanColors = 
  Merge[{AssociationThread[
     Complement[leanDomains, leanInfrastructure] -> 
      Take[ColorData[54, "ColorList"], 
       Length[Complement[leanDomains, leanInfrastructure]]]], 
    AssociationThread[leanInfrastructure -> LightGray]}, Identity];

leanDomainWeights = Tally[Values[leanAssoc]];
leanEdgeWeights = 
  Tally[{leanAssoc[#[[1]]], leanAssoc[#[[2]]]} & /@ 
    EdgeList[leanGraph]];

leanEdgesOutSimple = 
  Append[Merge[(AssociationThread[{#[[1, 1, 1]]} -> 
         Total[#[[2]]]] &) /@ (Transpose /@ 
       GatherBy[
        Select[leanEdgeWeights, #[[1, 1]] != #[[1, 2]] &], 
        #[[1, 1]] &]), Identity], "init" -> {3493}];

leanNormalizedEdgeWeights = (DirectedEdge[#[[1, 1]], #[[1, 2]]] -> 
      #[[2]]/Flatten[leanEdgesOutSimple[#[[1, 1]]]] &) /@ 
   leanEdgeWeights;

diskedLine[{line_, radii_}] := {RegionIntersection[Line[line], 
     Circle[line[[1]], radii[[1]]]][[1, 1]], 
   RegionIntersection[Line[line], 
     Circle[line[[2]], radii[[2]]]][[1, 1]]};

weightedArrow[line_, weight_] := 
  Module[{len, start, end, angle, thick, rec, mid}, 
   start = line[[1]]; end = line[[2]]; mid = Mean[line];
   len = EuclideanDistance[start, end];
   angle = Arg[(start - end).{1, I}];
   thick = weight/len;
   rec = (# + mid &) /@ (RotationMatrix[angle].# & /@ 
       {{-len/2, -thick/2}, {len/2, -thick/2}, {len/2, thick/2}, 
        {-len/2, thick/2}});
   Polygon[rec]];

VertexDelete[
 SimpleGraph[
  Graph[leanDomains, First /@ leanNormalizedEdgeWeights, 
   EdgeStyle -> 
    Thread[(First /@ leanNormalizedEdgeWeights) -> 
      ({AbsoluteThickness[20 Last[#][[1]]], 
          Arrowheads[Last[#][[1]]/4], 
          GrayLevel[0.5, 0.5]} & /@ leanNormalizedEdgeWeights)], 
   VertexSize -> 
    Thread[(First /@ leanDomainWeights) -> 
      (Sqrt[#]/90 & /@ (Last /@ leanDomainWeights))], 
   VertexStyle -> (# -> {Lighter /@ leanColors[#]} & /@ leanDomains), 
   VertexLabels -> {"algebra" -> "algebra", 
     "algebraic_geometry" -> "algebraic geometry", 
     "analysis" -> "analysis", 
     "category_theory" -> "category theory", 
     "combinatorics" -> "combinatorics", 
     "computability" -> "computability", "control" -> "control", 
     "data" -> "data", "dynamics" -> "dynamics", 
     "geometry" -> "geometry", "init" -> "init", "logic" -> "logic", 
     "meta" -> "meta", "number_theory" -> "number theory", 
     "order" -> "order theory", "set_theory" -> "set theory", 
     "system" -> "system", "tactic" -> "tactic", 
     "topology" -> "topology"}, 
   GraphLayout -> "SpringElectricalEmbedding", 
   PerformanceGoal -> "Quality", AspectRatio -> 1/4]], 
 {"init", "system", "tactic", "data", "meta", "control", 
  "computability"}, AspectRatio -> 1]

And here’s the corresponding one for Metamath:

metamathGraph
extensibleStructures = {"df-struct", "df-ndx", "df-slot", "df-base", "df-base",
   "df-sets", "df-ress", "brstruct", "isstruct2", "isstruct", "structcnvcnv",
   "structfun", "structfn", "slotfn", "strfvnd", "wunndx", "strfvn", "strfvn",
   "strfvss", "wunstr", "ndxarg", "ndxid", "ndxid", "strndxid", "reldmsets",
   "setsvalg", "setsval", "setsval", "setsidvald", "fvsetsid", "fsets", "wunsets",
   "setsres", "setsres", "setsabs", "setscom", "setscom", "strfvd", "strfv2d",
   "strfv2", "strfv", "strfv", "strfv3", "strssd", "strssd", "strss", "strss",
   "str0", "str0", "base0", "strfvi", "setsid", "setsid", "setsnid", "setsnid",
   "sbcie2s", "sbcie3s", "baseval", "baseid", "elbasfv", "elbasov", "strov2rcl",
   "strov2rcl", "basendx", "reldmress", "ressval", "ressid2", "ressval2",
   "ressbas", "ressbas2", "ressbasss", "ressbasss", "resslem", "resslem", "ress0",
   "ress0", "ressid", "ressinbas", "ressval3d", "ressress", "ressress", "ressabs",
   "wunress", "df-rest", "df-rest", "df-topn", "restfn", "topnfn", "restval",
   "restval", "elrest", "elrest", "elrestr", "elrestr", "0rest", "restid2",
   "restsspw", "firest", "restid", "restid", "topnval", "topnid", "topnpropd",
   "df-0g", "df-gsum", "df-gsum", "df-gsum", "df-topgen", "df-pt", "df-prds",
   "df-prds", "reldmprds", "reldmprds", "df-pws", "prdsbasex", "imasvalstr",
   "imasvalstr", "imasvalstr", "prdsvalstr", "prdsvalstr", "prdsvalstr",
   "prdsvallem", "prdsvallem", "prdsval", "prdsval", "prdsval", "prdssca",
   "prdssca", "prdssca", "prdsbas", "prdsbas", "prdsbas", "prdsplusg",
   "prdsplusg", "prdsplusg", "prdsmulr", "prdsmulr", "prdsmulr", "prdsvsca",
   "prdsvsca", "prdsvsca", "prdsip", "prdsle", "prdsle", "prdsless", "prdsds",
   "prdsds", "prdsdsfn", "prdstset", "prdstset", "prdshom", "prdshom", "prdsco",
   "prdsco", "prdsbas2", "prdsbas2", "prdsbasmpt", "prdsbasfn", "prdsbasprj",
   "prdsplusgval", "prdsplusgval", "prdsplusgfval", "prdsmulrval", "prdsmulrfval",
   "prdsleval", "prdsdsval", "prdsvscaval", "prdsvscafval", "prdsbas3",
   "prdsbasmpt2", "prdsbasmpt2", "prdsbascl", "prdsdsval2", "prdsdsval3",
   "pwsval", "pwsbas", "pwselbasb", "pwselbas", "pwselbas", "pwsplusgval",
   "pwsmulrval", "pwsle", "pwsleval", "pwsvscafval", "pwsvscaval", "pwssca",
   "pwsdiagel", "pwssnf1o", "df-ordt", "df-xrs", "df-qtop", "df-imas", "df-qus",
   "df-xps", "imasval", "imasval", "imasval", "imasbas", "imasbas", "imasbas",
   "imasds", "imasds", "imasds", "imasdsfn", "imasdsval", "imasdsval2",
   "imasplusg", "imasplusg", "imasplusg", "imasmulr", "imasmulr", "imasmulr",
   "imassca", "imassca", "imasvsca", "imasvsca", "imasip", "imastset", "imasle",
   "f1ocpbllem", "f1ocpbl", "f1ovscpbl", "f1olecpbl", "imasaddfnlem",
   "imasaddvallem", "imasaddflem", "imasaddfn", "imasaddfn", "imasaddval",
   "imasaddf", "imasmulfn", "imasmulval", "imasmulf", "imasvscafn", "imasvscaval",
   "imasvscaf", "imasless", "imasleval", "qusval", "quslem", "qusin", "qusbas",
   "quss", "divsfval", "divsfval", "ercpbllem", "ercpbl", "ercpbl", "erlecpbl",
   "erlecpbl", "qusaddvallem", "qusaddflem", "qusaddval", "qusaddf", "qusmulval",
   "qusmulf", "xpsc", "xpscg", "xpscfn", "xpsc0", "xpsc1", "xpscfv", "xpsfrnel",
   "xpsfeq", "xpsfrnel2", "xpscf", "xpsfval", "xpsff1o", "xpsfrn", "xpsfrn2",
   "xpsff1o2", "xpsval", "xpslem", "xpsbas", "xpsaddlem", "xpsadd", "xpsmul",
   "xpssca", "xpsvsca", "xpsless", "xpsle", "df-plusg", "df-plusg", "df-mulr",
   "df-mulr", "df-starv", "df-starv", "df-sca", "df-sca", "df-vsca", "df-vsca",
   "df-ip", "df-ip", "df-tset", "df-tset", "df-ple", "df-ple", "df-ocomp",
   "df-ocomp", "df-ds", "df-ds", "df-unif", "df-hom", "df-cco", "strlemor0",
   "strlemor1", "strlemor1", "strlemor2", "strlemor2", "strlemor3", "strlemor3",
   "strleun", "strle1", "strle2", "strle3", "plusgndx", "plusgid", "1strstr",
   "1strbas", "1strwunbndx", "1strwun", "2strstr", "2strbas", "2strop", "grpstr",
   "grpstr", "grpbase", "grpbase", "grpplusg", "grpplusg", "ressplusg",
   "grpbasex", "grpplusgx", "mulrndx", "mulrid", "rngstr", "rngstr", "rngbase",
   "rngbase", "rngplusg", "rngplusg", "rngmulr", "rngmulr", "starvndx", "starvid",
   "ressmulr", "ressstarv", "srngfn", "srngfn", "srngbase", "srngbase",
   "srngplusg", "srngmulr", "srnginvl", "scandx", "scaid", "vscandx", "vscaid",
   "vscaid", "lmodstr", "lmodstr", "lmodbase", "lmodbase", "lmodplusg",
   "lmodplusg", "lmodsca", "lmodsca", "lmodvsca", "lmodvsca", "ipndx", "ipid",
   "ipsstr", "ipsstr", "ipsstr", "ipsbase", "ipsbase", "ipsbase", "ipsaddg",
   "ipsaddg", "ipsaddg", "ipsmulr", "ipsmulr", "ipsmulr", "ipssca", "ipssca",
   "ipssca", "ipsvsca", "ipsvsca", "ipsvsca", "ipsip", "ipsip", "ipsip",
   "resssca", "ressvsca", "ressip", "phlstr", "phlstr", "phlbase", "phlbase",
   "phlplusg", "phlplusg", "phlsca", "phlsca", "phlvsca", "phlvsca", "phlip",
   "phlip", "tsetndx", "tsetid", "topgrpstr", "topgrpbas", "topgrpplusg",
   "topgrptset", "resstset", "plendx", "pleid", "otpsstr", "otpsbas", "otpstset",
   "otpsle", "ressle", "ocndx", "ocid", "dsndx", "dsid", "unifndx", "unifid",
   "odrngstr", "odrngbas", "odrngplusg", "odrngmulr", "odrngtset", "odrngle",
   "odrngds", "ressds", "homndx", "homid", "ccondx", "ccoid", "resshom", "ressco",
   "slotsbhcdif"};

metamathGraph = EdgeDelete[CloudGet["https://wolfr.am/PLbmdhRv"],
   Select[EdgeList[CloudGet["https://wolfr.am/PLbmdhRv"]],
    MemberQ[extensibleStructures, #[[2]]] &]];

metamathAssoc = CloudGet["https://wolfr.am/PLborw8R"];

metamathDomains = Union[Values[metamathAssoc]];

metamathInfrastructure = {"SUPPLEMENTARY MATERIAL (USER'S MATHBOXES)",
   "GUIDES AND MISCELLANEA"};

metamathColors = Merge[{AssociationThread[
     Complement[metamathDomains, metamathInfrastructure] ->
      Take[ColorData[54, "ColorList"],
       Length[Complement[metamathDomains, metamathInfrastructure]]]],
    AssociationThread[metamathInfrastructure -> LightGray]}, Identity];

metamathDomainWeights = Tally[Values[metamathAssoc]];

metamathEdgeWeights = Tally[
   ({metamathAssoc[#[[1]]], metamathAssoc[#[[2]]]} &) /@ EdgeList[metamathGraph]];

metamathEdgesOutSimple = Append[
   Merge[(AssociationThread[{#[[1, 1, 1]]} -> Total[#[[2]]]] &) /@
     (Transpose /@
       GatherBy[Select[metamathEdgeWeights, #[[1, 1]] != #[[1, 2]] &], #[[1, 1]] &]),
    Identity], "CLASSICAL FIRST-ORDER LOGIC WITH EQUALITY" -> {7649}];

metamathNormalizedEdgeWeights = ((DirectedEdge[#[[1, 1]], #[[1, 2]]] ->
      #[[2]]/Flatten[metamathEdgesOutSimple[#[[1, 1]]]]) &) /@ metamathEdgeWeights;

diskedLine[{line_, radii_}] := {
   RegionIntersection[Line[line], Circle[line[[1]], radii[[1]]]][[1, 1]],
   RegionIntersection[Line[line], Circle[line[[2]], radii[[2]]]][[1, 1]]};

weightedArrow[line_, weight_] := Module[{len, start, end, angle, thick, rec, mid},
   start = line[[1]]; end = line[[2]]; mid = Mean[line];
   len = EuclideanDistance[start, end];
   angle = Arg[(start - end).{1, I}];
   thick = weight/len;
   rec = (# + mid &) /@ ((RotationMatrix[angle].# &) /@
       {{-len/2, -thick/2}, {len/2, -thick/2}, {len/2, thick/2}, {-len/2, thick/2}});
   Polygon[rec]];

Show[VertexDelete[
  SimpleGraph[
   Graph[metamathDomains, First /@ metamathNormalizedEdgeWeights,
    EdgeStyle -> Thread[(First /@ metamathNormalizedEdgeWeights) ->
       ({AbsoluteThickness[175 Last[#][[1]]], Arrowheads[Last[#][[1]]],
          GrayLevel[0.5, 0.5]} & /@ metamathNormalizedEdgeWeights)],
    VertexSize -> Thread[(First /@ metamathDomainWeights) ->
       ((Sqrt[#]/70 &) /@ (Last /@ metamathDomainWeights))],
    VertexStyle -> ((# -> {Lighter /@ metamathColors[#]} &) /@ metamathDomains),
    VertexLabels -> {"BASIC ALGEBRAIC STRUCTURES" -> "algebraic structures",
      "BASIC CATEGORY THEORY" -> "category theory",
      "BASIC LINEAR ALGEBRA" -> "linear algebra",
      "BASIC ORDER THEORY" -> "order theory",
      "BASIC REAL AND COMPLEX ANALYSIS" -> "real and complex analysis",
      "BASIC REAL AND COMPLEX FUNCTIONS" -> "real and complex functions",
      "BASIC STRUCTURES" -> "basic structures", "BASIC TOPOLOGY" -> "topology",
      "CLASSICAL FIRST-ORDER LOGIC WITH EQUALITY" -> "logic",
      "ELEMENTARY GEOMETRY" -> "geometry",
      "ELEMENTARY NUMBER THEORY" -> "number theory",
      "GRAPH THEORY" -> "graph theory",
      "GUIDES AND MISCELLANEA" -> "miscellaneous",
      "REAL AND COMPLEX NUMBERS" -> "real and complex numbers",
      "SUPPLEMENTARY MATERIAL (USER'S MATHBOXES)" -> "supplementary material",
      "TG (TARSKI-GROTHENDIECK) SET THEORY" -> "TG set theory",
      "ZFC (ZERMELO-FRAENKEL WITH CHOICE) SET THEORY" -> "ZFC set theory",
      "ZF (ZERMELO-FRAENKEL) SET THEORY" -> "ZF set theory"},
    GraphLayout -> "SpringElectricalEmbedding", PerformanceGoal -> "Quality",
    AspectRatio -> 1]],
  {"SUPPLEMENTARY MATERIAL (USER'S MATHBOXES)",
   "TG (TARSKI-GROTHENDIECK) SET THEORY",
   "ZFC (ZERMELO-FRAENKEL WITH CHOICE) SET THEORY",
   "ZF (ZERMELO-FRAENKEL) SET THEORY",
   "CLASSICAL FIRST-ORDER LOGIC WITH EQUALITY", "GUIDES AND MISCELLANEA",
   "REAL AND COMPLEX NUMBERS"}], Editable -> True]

It’s somewhat interesting to see how central algebra ends up being in both cases, and how comparatively “off on the side” category theory is. But it’s clear that much of what one’s seeing in these graphs is a reflection of the particular user communities of these systems, with some important pieces of modern mathematics (like the applications of algebraic geometry to number theory) notably missing.

But, OK, how do individual theorems work in these systems? As an example, let’s consider the Pythagorean theorem. In Euclid, this is 1.47 (Book 1, Proposition 47), and here’s the first level of its dependency graph:

EuclidGraphLarge
CloudGet["https://wolfr.am/PJKo9Lnq"];
					EuclidGraphLarge[
 Subgraph[euc, 
  VertexOutComponent[euc, <|"Book" -> 1, "Theorem" -> 47|>, 1]]]

Here’s the full graph involving a total of 39 elements (including, by the way, all 10 of the axioms), and having “depth” 20:

EuclidGraphLarge
CloudGet["https://wolfr.am/PJKo9Lnq"];
					EuclidGraphLarge[
 Subgraph[euc, 
  VertexOutComponent[euc, <|"Book" -> 1, "Theorem" -> 47|>]], 
 VertexSize -> 1.7]
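
(As a rough sketch of how numbers like these can be read off, and not code from the original analysis: VertexCount gives the number of elements in the dependency cone, and the “depth” can be taken as the length of the longest chain of dependencies. Here pythagoreanCone and depthFrom are just ad hoc names, and euc is assumed to have been loaded with the CloudGet call above.)

pythagoreanCone = Subgraph[euc,
   VertexOutComponent[euc, <|"Book" -> 1, "Theorem" -> 47|>]];
VertexCount[pythagoreanCone]   (* should match the 39 elements quoted above *)

(* longest chain of dependencies, memoized over the (acyclic) cone *)
depthFrom[v_] := depthFrom[v] =
   With[{next = VertexOutComponent[pythagoreanCone, v, {1}]},
    If[next === {}, 0, 1 + Max[depthFrom /@ next]]];
depthFrom[<|"Book" -> 1, "Theorem" -> 47|>]   (* the "depth"; 20 according to the text *)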

In Lean’s mathlib, the theorem is called euclidean_geometry.dist_square_eq_dist_square_add_dist_square_iff_angle_eq_pi_div_two—and its stated proof directly involves 7 other theorems:

leanAssoc
leanAssoc = CloudGet["https://wolfr.am/PL39QRbE"];

leanGraph = CloudGet["https://wolfr.am/PL3LfaQ4"];

leanDomains = Union[Values[leanAssoc]];

leanInfrastructure = {"init", "system", "tactic", "data", "meta", "control",
   "computability"};

leanColors = Merge[{AssociationThread[
     Complement[leanDomains, leanInfrastructure] ->
      Take[ColorData[54, "ColorList"],
       Length[Complement[leanDomains, leanInfrastructure]]]],
    AssociationThread[leanInfrastructure -> LightGray]}, Identity];

Column[Rest[VertexOutComponent[leanGraph,
   "euclidean_geometry.dist_square_eq_dist_square_add_dist_square_iff_angle_eq_pi_div_two",
   1]], Frame -> All, FrameStyle -> GrayLevel[.7],
 With[{leanA = leanAssoc[#] & /@ Rest[VertexOutComponent[leanGraph,
       "euclidean_geometry.dist_square_eq_dist_square_add_dist_square_iff_angle_eq_pi_div_two",
       1]]},
  Background -> (Lighter[#, 0.5] & /@ Flatten[leanColors[#] & /@ leanA])]]
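
(Just as a quick check, and not part of the original cells: the count of 7 direct dependencies can be read off the same graph, assuming leanGraph is loaded as above.)

Length[Rest[VertexOutComponent[leanGraph,
   "euclidean_geometry.dist_square_eq_dist_square_add_dist_square_iff_angle_eq_pi_div_two", 1]]]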

Going 3 steps, the theorem dependency graph looks like this (where “init” and “tactic” basically refer to structure rather than mathematical content):

Legended
leanAssoc = CloudGet["https://wolfr.am/PL39QRbE"];

leanGraph = CloudGet["https://wolfr.am/PL3LfaQ4"];

leanDomains = Union[Values[leanAssoc]];

leanInfrastructure = {"init", "system", "tactic", "data", "meta", "control",
   "computability"};

leanColors = Merge[{AssociationThread[
     Complement[leanDomains, leanInfrastructure] ->
      Take[ColorData[54, "ColorList"],
       Length[Complement[leanDomains, leanInfrastructure]]]],
    AssociationThread[leanInfrastructure -> LightGray]}, Identity];

Legended[
 Subgraph[leanGraph,
  VertexOutComponent[leanGraph,
   "euclidean_geometry.dist_square_eq_dist_square_add_dist_square_iff_angle_eq_pi_div_two",
   3], GraphLayout -> "LayeredDigraphEmbedding", AspectRatio -> 1/3,
  EdgeStyle -> GrayLevel[0.5, 0.5],
  VertexStyle -> ((# -> {Lighter /@ leanColors[leanAssoc[#]]} &) /@
     VertexList[leanGraph]), VertexSize -> 0.75],
 SwatchLegend[
  Flatten[(Lighter /@ leanColors[#] &) /@
    {"algebra", "analysis", "geometry", "init", "tactic", "topology"}],
  {"algebra", "analysis", "geometry", "init", "tactic", "topology"}]]

The full graph involves a total of 2850 elements (and has “depth” 84), and after transitive reduction has the form:

leanGraph
leanGraph = CloudGet["https://wolfr.am/PL3LfaQ4"];

leanAssoc = CloudGet["https://wolfr.am/PL39QRbE"];

leanColors = Merge[{AssociationThread[
     Complement[leanDomains, leanInfrastructure] ->
      Take[ColorData[54, "ColorList"],
       Length[Complement[leanDomains, leanInfrastructure]]]],
    AssociationThread[leanInfrastructure -> LightGray]}, Identity];

Graph[TransitiveReductionGraph[
  Subgraph[leanGraph,
   VertexOutComponent[leanGraph,
    "euclidean_geometry.dist_square_eq_dist_square_add_dist_square_iff_angle_eq_pi_div_two"]]],
 GraphLayout -> "LayeredDigraphEmbedding", AspectRatio -> 1/2,
 EdgeStyle -> GrayLevel[0.5, 0.5],
 VertexStyle -> ((# -> {Lighter /@ leanColors[leanAssoc[#]]} &) /@
    VertexList[leanGraph]), VertexSize -> 0.75]

And, yes, this is considerably more complicated than Euclid’s version—but presumably that’s what happens if you insist on full formalization. Of the 2850 theorems used, 1503 are basically structural. The remainder bring mathematical content from different areas, and it’s notable in the picture above that different parts of the proof seem to “concentrate” on different areas. Curiously, theorems from geometry (which is basically all Euclid used) occupy only a tiny sliver of the pie chart of all theorems used:

lpc = ReverseSort
leanAssoc = CloudGet["https://wolfr.am/PL39QRbE"];

leanGraph = CloudGet["https://wolfr.am/PL3LfaQ4"];

leanDomains = Union[Values[leanAssoc]];

leanInfrastructure = {"init", "system", "tactic", "data", "meta", "control",
   "computability"};

leanColors = Merge[{AssociationThread[
     Complement[leanDomains, leanInfrastructure] ->
      Take[ColorData[54, "ColorList"],
       Length[Complement[leanDomains, leanInfrastructure]]]],
    AssociationThread[leanInfrastructure -> LightGray]}, Identity];

leanPythagoreanAntecedents = Reverse[KeyDrop[
    Counts[leanAssoc[#] & /@
      VertexList[TransitiveReductionGraph[
        Subgraph[leanGraph,
         VertexOutComponent[leanGraph,
          "euclidean_geometry.dist_square_eq_dist_square_add_dist_square_iff_angle_eq_pi_div_two"]]]]],
    leanInfrastructure]];

PieChart[leanPythagoreanAntecedents,
 ChartLabels -> Placed[Join[{"order theory"},
    Rest[Keys[leanPythagoreanAntecedents]]], "RadialCallout"],
 ImagePadding -> 15,
 ChartStyle -> ((Lighter[First[leanColors[#]], .2] &) /@
    Keys[leanPythagoreanAntecedents])]
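
(Here is a sketch of where a split like the “1503 of 2850 are basically structural” mentioned above can come from; this is my reading rather than the original code. The idea is to tally the domains of all the antecedent theorems and count the ones falling in the infrastructure-like domains, assuming leanGraph, leanAssoc and leanInfrastructure are defined as in the cells above.)

antecedentDomains = leanAssoc[#] & /@ VertexOutComponent[leanGraph,
   "euclidean_geometry.dist_square_eq_dist_square_add_dist_square_iff_angle_eq_pi_div_two"];
Length[antecedentDomains]   (* total theorems used; 2850 according to the text *)
Count[antecedentDomains, Alternatives @@ leanInfrastructure]   (* the "structural" ones *)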

The Metamath set.mm version of the Pythagorean theorem is called pythag, and its proof directly depends on 26 other theorems:

metamathGraph
metamathGraph = CloudGet["https://wolfr.am/PLbmdhRv"];

metamathAssoc = CloudGet["https://wolfr.am/PLborw8R"];

metamathDomains = Union[Values[metamathAssoc]];

metamathInfrastructure = {"SUPPLEMENTARY MATERIAL (USER'S MATHBOXES)",
   "GUIDES AND MISCELLANEA"};

metamathColors = Merge[{AssociationThread[
     Complement[metamathDomains, metamathInfrastructure] ->
      Take[ColorData[54, "ColorList"],
       Length[Complement[metamathDomains, metamathInfrastructure]]]],
    AssociationThread[metamathInfrastructure -> LightGray]}, Identity];

metamathPythagoreanColors = Lighter /@ Flatten[(metamathColors[#] &) /@
     ((metamathAssoc[#] &) /@ Rest[VertexOutComponent[metamathGraph, "pythag", 1]])];

Grid[Partition[Rest[VertexOutComponent[metamathGraph, "pythag", 1]], UpTo[6]],
 Frame -> {All, All,
   {{5, 3} -> False, {5, 4} -> False, {5, 5} -> False, {5, 6} -> False,
    {5, 2} -> True, {4, 3} -> True, {4, 4} -> True, {4, 5} -> True,
    {4, 6} -> True}}, FrameStyle -> GrayLevel[.7],
 Background -> {None, None,
   {{1, 1} -> metamathPythagoreanColors[[1]],
    {1, 2} -> metamathPythagoreanColors[[2]],
    {1, 3} -> metamathPythagoreanColors[[3]],
    {1, 4} -> metamathPythagoreanColors[[4]],
    {1, 5} -> metamathPythagoreanColors[[5]],
    {1, 6} -> metamathPythagoreanColors[[6]],
    {2, 1} -> metamathPythagoreanColors[[7]],
    {2, 2} -> metamathPythagoreanColors[[8]],
    {2, 3} -> metamathPythagoreanColors[[9]],
    {2, 4} -> metamathPythagoreanColors[[10]],
    {2, 5} -> metamathPythagoreanColors[[11]],
    {2, 6} -> metamathPythagoreanColors[[12]],
    {3, 1} -> metamathPythagoreanColors[[13]],
    {3, 2} -> metamathPythagoreanColors[[14]],
    {3, 3} -> metamathPythagoreanColors[[15]],
    {3, 4} -> metamathPythagoreanColors[[16]],
    {3, 5} -> metamathPythagoreanColors[[17]],
    {3, 6} -> metamathPythagoreanColors[[18]],
    {4, 1} -> metamathPythagoreanColors[[19]],
    {4, 2} -> metamathPythagoreanColors[[20]],
    {4, 3} -> metamathPythagoreanColors[[21]],
    {4, 4} -> metamathPythagoreanColors[[22]],
    {4, 5} -> metamathPythagoreanColors[[23]],
    {4, 6} -> metamathPythagoreanColors[[24]],
    {5, 1} -> metamathPythagoreanColors[[25]],
    {5, 2} -> metamathPythagoreanColors[[26]]}}]

After 1 step, the theorem dependency graph is:

metamathGraph
metamathGraph = CloudGet["https://wolfr.am/PLbmdhRv"];

metamathAssoc = CloudGet["https://wolfr.am/PLborw8R"];

metamathDomains = Union[Values[metamathAssoc]];

metamathInfrastructure = {"SUPPLEMENTARY MATERIAL (USER'S MATHBOXES)",
   "GUIDES AND MISCELLANEA"};

metamathColors = Merge[{AssociationThread[
     Complement[metamathDomains, metamathInfrastructure] ->
      Take[ColorData[54, "ColorList"],
       Length[Complement[metamathDomains, metamathInfrastructure]]]],
    AssociationThread[metamathInfrastructure -> LightGray]}, Identity];

Legended[
 Subgraph[metamathGraph, VertexOutComponent[metamathGraph, "pythag", 1],
  GraphLayout -> "LayeredDigraphEmbedding", AspectRatio -> 1/2,
  VertexLabels -> None, EdgeStyle -> GrayLevel[0.5, 0.5],
  VertexStyle -> ((# -> {Lighter /@ metamathColors[metamathAssoc[#]]} &) /@
     VertexList[metamathGraph]), VertexSize -> 0.75],
 SwatchLegend[
  Lighter /@ Flatten[(metamathColors[#] &) /@
     {"BASIC REAL AND COMPLEX FUNCTIONS",
      "CLASSICAL FIRST-ORDER LOGIC WITH EQUALITY",
      "REAL AND COMPLEX NUMBERS", "ZF (ZERMELO-FRAENKEL) SET THEORY"}],
  {"real and complex functions", "classical first-order logic with equality",
   "real and complex numbers", "ZF (Zermelo-Fraenkel) set theory"}]]

The full graph involves 7099 elements—and has depth 270. In other words, to get from the Pythagorean theorem all the way to the axioms can take as many as 270 steps.
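
Those numbers can be recomputed directly. Here is a small sketch of my own (assuming, as above, that the cloud-stored metamathGraph has an edge from each theorem to every theorem its proof directly references):

pythagDeps = VertexOutComponent[metamathGraph, "pythag"];
Length[pythagDeps] (* should reproduce the 7099 elements quoted *)

(* longest dependency chain from "pythag" down to the axioms, by memoized recursion over the
   (acyclic) dependency subgraph; should reproduce the depth of 270 quoted *)
depSub = Subgraph[metamathGraph, pythagDeps];
succ = GroupBy[EdgeList[depSub], First -> Last];
Clear[depth];
depth[v_] := depth[v] = With[{s = Lookup[succ, v, {}]}, If[s === {}, 0, 1 + Max[depth /@ s]]];
Block[{$RecursionLimit = 10^4}, depth["pythag"]]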

metamathGraph = CloudGet["https://wolfr.am/PLbmdhRv"];

metamathAssoc = CloudGet["https://wolfr.am/PLborw8R"];

metamathDomains = Union[Values[metamathAssoc]];

metamathInfrastructure = {"SUPPLEMENTARY MATERIAL (USER'S MATHBOXES)", 
   "GUIDES AND MISCELLANEA"};

metamathColors = 
  Merge[{AssociationThread[
     Complement[metamathDomains, metamathInfrastructure] -> 
      Take[ColorData[54, "ColorList"], 
       Length[Complement[metamathDomains, metamathInfrastructure]]]], 
    AssociationThread[metamathInfrastructure -> LightGray]}, Identity];

Graph[TransitiveReductionGraph[
  Subgraph[metamathGraph, 
   VertexOutComponent[metamathGraph, "pythag"]]], 
 GraphLayout -> "LayeredDigraphEmbedding", AspectRatio -> 1/2, 
 EdgeStyle -> GrayLevel[0.5, 0.5], 
 VertexStyle -> (# -> {Lighter /@ metamathColors[metamathAssoc[#]]} & /@ 
    VertexList[metamathGraph]), VertexSize -> 0.75]

Given the complete Lean or Metamath corpora, we can start doing the same kind of empirical metamathematics we did for Euclid’s Elements—except now the higher level of formalization that’s being used potentially allows us to go much further.

As a very simple example, here’s the distribution of numbers of theorems directly referenced in the proof of each theorem in Lean, Metamath and Euclid:

leanGraph = CloudGet["https://wolfr.am/PL3LfaQ4"];

metamathGraph = CloudGet["https://wolfr.am/PLbmdhRv"];

euc = ResourceData["Theorem Network from Euclid's Elements"];

GraphicsRow[
 Histogram[Last[#], Frame -> True, ImageSize -> 250, 
    Epilog -> 
     Text[Style[First[#], 
       Directive[FontSize -> 12, GrayLevel[0.25], 
        FontFamily -> "Source Sans Pro"]], Scaled[{1, 1}], 
      {1.5, 1.4}]] & /@ 
  {{"Lean", VertexOutDegree[leanGraph]}, 
   {"Metamath", VertexOutDegree[metamathGraph]}, 
   {"Euclid", VertexOutDegree[euc]}}]

The differences presumably reflect different “hierarchical modularity conventions” in Lean and Metamath (and Euclid). But it’s interesting to note, for example, that in all three cases the Pythagorean theorem is “above average” in terms of the number of theorems referenced in its proof:

Grid[{{"", "Lean", "Metamath", "Euclid"}, {"", 7, 26, 8}, 
  Style[#, GrayLevel[0.4]] & /@ {"mean", 4.9, 18.7, 4.3}}, 
 Frame -> All, FrameStyle -> Gray]
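
A quick cross-check of the Metamath column (my own sketch, reusing the metamathGraph loaded above; "pythag" is the Metamath vertex name used earlier, and the corresponding Lean and Euclid vertex names would have to be looked up separately):

{VertexOutDegree[metamathGraph, "pythag"], N[Mean[VertexOutDegree[metamathGraph]]]}
(* should give the 26 direct references and the mean of about 18.7 shown in the table *)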

What are the most popular theorems used in proofs? In terms of direct references, here are the top-5 lists:

leanGraph = CloudGet["https://wolfr.am/PL3LfaQ4"];

metamathGraph = CloudGet["https://wolfr.am/PLbmdhRv"];

CloudGet["https://wolfr.am/PJKo9Lnq"]; 
Text[MapIndexed[
  Labeled[
    Function[u, 
      Grid[Take[
        ReverseSortBy[{If[Head[#] === Association, 
             EuclidVertexName[#], #], VertexInDegree[u, #]} & /@ 
          VertexList[u], Last], 5], Frame -> All]][Last[#]], 
    First[#]] &, 
  {{"Lean", leanGraph}, {"Metamath", metamathGraph}, 
   {"Euclid", euc}}]]

Not surprisingly, for Lean and Metamath these are quite “structural”. For Lean, congr_arg is the “congruency” statement that if a=b then f(a)=f(b); congr is a variant that says if a=b and f=g then f(a)=g(b); eq.trans is the transitivity statement if a=b and b=c then a=c (Euclid’s CN1); eq.symm is the statement if a=b then b=a; etc. For Metamath, syl is “transitive syllogism”: if x⇒y and y⇒z then x⇒z; eqid is about the reflexivity of equality; etc. In Euclid, these kinds of low-level results—if they are even stated at all—tend to be “many levels down” in the hierarchy of theorems, leaving the single most popular theorem, 10.11, to be one about proportion and rationality.

If one looks at all theorems directly and indirectly referenced by a given theorem, the distribution of total numbers of theorems is as follows (with Lean showing the most obviously exponential decay):

leanGraph = CloudGet["https://wolfr.am/PL3LfaQ4"];

metamathGraph = CloudGet["https://wolfr.am/PLbmdhRv"];

euc = ResourceData["Theorem Network from Euclid's Elements"];

outcs = Function[u, 
    Length[VertexOutComponent[u, #]] & /@ VertexList[u]] /@ 
   {leanGraph, metamathGraph};

GraphicsRow[
 Function[u, 
   Histogram[Last[u], 50, {"Log", "Count"}, PlotRange -> All, 
    Frame -> True, ImageSize -> 250, 
    Epilog -> 
     Text[Style[First[u], 
       Directive[FontSize -> 12, GrayLevel[0.25], 
        FontFamily -> "Source Sans Pro"]], Scaled[{1, 1}], 
      {1.5, 1.4}]]] /@ 
  Transpose[{{"Lean", "Metamath", "Euclid"}, 
    Append[outcs, 
     Function[u, 
       Length[VertexOutComponent[u, #]] & /@ VertexList[u]][euc]]}]]

What about the overall structure of the Lean and Metamath dependency graphs? We can ask about effective dimension, about causal invariance, about “event horizons”, and much more. But right now I’ll leave that for another time...

The Future of Empirical Metamathematics

I don’t think empirical metamathematics has been much of a thing in the past. In fact, looking on the web as I write this, I’m surprised to see that essentially all references to the actual term “empirical metamathematics” seem to point directly or indirectly to that one note of mine on the subject in A New Kind of Science.

But as I hope this piece has made clear, there’s a lot that can be done in empirical metamathematics. In everything I’ve written here, I haven’t started analyzing questions like how one can recognize a powerful or a surprising theorem. And I’ve barely scratched the surface even of the empirical metamathematics that can be done on Euclid’s Elements from 2000 years ago.

But what kind of a thing is empirical metamathematics? Assuming one’s looking at theorems and proofs constructed by humans rather than by automated systems, it’s about analyzing large-scale human output—a bit like doing data science on literary texts, or on things like websites or legal corpuses. But it’s different. Because ultimately the theorems and proofs that are the subject of empirical metamathematics are derived not from features of the world, but from a formal system that defines some area of mathematics.

With computational language the goal is to be able to describe anything in formalized, computational terms. But in empirical metamathematics, things are in a sense “born formalized”. Whatever the actual presentation of theorems and proofs may be, their “true form” is ultimately something grounded in the formal structure of the mathematics being used.

Of course there is also a strong human element to the raw material of empirical metamathematics. It is (at least for now) humans who have chosen which of the infinite number of possible theorems should be considered interesting, and worthy of presentation. And at least traditionally, when humans write proofs, they usually do it less as a way to certify correctness, and more as a form of exposition: to explain to other humans why a particular theorem is true, and what structure it fits into.

In a sense, empirical metamathematics is a quite desiccated way to look at mathematics, in which all the elegant conceptual structure of its content has been removed. But if we’re to make a “science of metamathematics”, it’s almost inevitable that we have to think this way. Part of what we need to do is to understand some of the human aesthetics of mathematics, and in effect to seek to deduce laws by which it may operate.

In this piece I’ve mostly concentrated on doing fairly straightforward graph-oriented data science, primarily on Euclid’s Elements. But in moving forward with empirical metamathematics a key question is what kind of model one should be trying to fit one’s observations into.

And this comes back to my current motivation for studying empirical metamathematics: as a window onto a general “bulk” theory of metamathematics—and as the foundation for a science not just of how we humans have explored metamathematical space, but of what fundamentally is out there in metamathematical space, and what its overall structure may be.

No doubt there are already clues in what I’ve done here, but probably only after we have the general theory will we have the paradigm that’s needed to identify them. But even without this, there’s much to do in studying empirical metamathematics for its own sake—and of better characterizing the remarkable human achievement that is mathematics.

And for now, it’s interesting to be able to look at something as old as Euclid’s Elements and to realize what new perspectives modern computational thinking can give us about it. Euclid was a pioneer in the notion of building everything up from formal rules—and the seeds he sowed played an important role in leading us to the modern computational paradigm. So it’s something of a thrill to be able to come back two thousand years later and see that paradigm—now all grown up—applied not only to something like the fundamental theory of physics, but also to what Euclid did all those years ago.

Thanks

For help with various aspects of the content of this piece I’d like to thank Peter Barendse, Ian Ford, Jonathan Gorard, Rob Lewis, Jose Martin-Garcia, Norm Megill, James Mulnix, Nik Murzin, Mano Namuduri, Ed Pegg, Michael Trott, and Xiaofan Zhang, as well as Sushma Kini and Jessica Wong, and for past discussions about related topics, also Bruno Buchberger, Dana Scott and the various participants of our 2016 workshop on the Semantic Representation of Mathematical Knowledge.

Note Added

As I was working on this piece, I couldn’t help wondering whether—in 2300 years—anyone else had worked on the empirical metamathematics of Euclid before. Turns out (as Don Knuth pointed out to me) at least one other person did—more than 400 years ago.

The person in question was Thomas Harriot (1560–1621).

The only thing Thomas Harriot published in his lifetime was the book A Briefe and True Report of the New Found Land of Virginia, based on a trip that he made to America in 1585. But his papers show that he did all sorts of math and science (including inventing the · notation for multiplication and the symbols < and >, as well as drawing pictures of the Moon through a telescope before Galileo, etc.). He seems to have had a well-ahead-of-his-time interest in discrete mathematics, apparently making Venn diagrams a couple of centuries before Venn:

Venn diagram

doing various enumerations of structures

Enumerations of Structures

as well as various repeated computations (but no cellular automata, so far as I can tell!):

Repeated computations

And he seems to have made a detailed study of Euclid’s Elements, listing in detail (as I did) what theorems are used in each proof (this is for Book 1):

Harriot’s listing of theorems

But then, in his “moment of empirical metamathematics” he lists out the full dependency table for theorems in Book 1, having computed what we’d now call the transitive closure:

Book 1 transitive closure


It’s easy for us to reproduce this now, and, yes, he did make a few mistakes:

Harriot’s analysis of Euclid with modern overlay

Studying the empirical metamathematics of Euclid seems (to me) like an obvious thing to do, and it’s good to know I’m not the first one doing it. And I’m now wondering whether someone actually already did it not “just” 400 years ago, but perhaps 2000 (or more) years ago...


Faster than Light in Our Model of Physics: Some Preliminary Thoughts


When the NASA Innovative Advanced Concepts Program asked me to keynote their annual conference I thought it would be a good excuse to spend some time on a question I’ve always wanted to explore…

Faster than Light in Our Model of Physics: Some Preliminary Thoughts

Can You Build a Warp Drive?

“So you think you have a fundamental theory of physics. Well, then tell us if warp drive is possible!” Despite the hopes and assumptions of science fiction, real physics has for at least a century almost universally assumed that no genuine effect can ever propagate through physical space any faster than light. But is this actually true? We’re now in a position to analyze this in the context of our model for fundamental physics. And I’ll say at the outset that it’s a subtle and complicated question, and I don’t know the full answer yet.

But I increasingly suspect that going faster than light is not a physical impossibility; instead, in a sense, doing it is “just” an engineering problem. But it may well be an irreducibly hard engineering problem. And one that can’t be solved with the computational resources available to us in our universe. But it’s also conceivable that there may be some clever “engineering solution”, as there have been to so many seemingly insuperable engineering problems in the past. And that in fact there is a way to “move through space” faster than light.

It’s a little tricky even to define what it means to “go faster than light”. Do we allow an existing “space tunnel” (like the wormholes of general relativity)? Perhaps a space tunnel that has been there since the beginning of the universe. Or even if no space tunnel already exists, do we allow the possibility of building one—that we can then travel through? I’ll discuss these possibilities later. But the most dramatic possibility is that even if one’s going where “no one has gone before”, it might still be possible to traverse space faster than light to get there.

To give a preview of why doing this might devolve into an “engineering problem”, let’s consider a loose (but, in the end, not quite so loose) analogy. Imagine you’ve got molecules of gas in a room, all bouncing around and colliding with each other. Now imagine there’s a special molecule—or even a tiny speck of dust or a virus particle—somewhere in the room. Normally the special molecule will be buffeted by the molecules in the air, and will move in some kind of random walk, gradually diffusing across the room. But imagine that the special molecule somehow knows enough about the motion of the air molecules that it can compute exactly where to go to avoid being buffeted. Then that special molecule can travel much faster than diffusion—and effectively make a beeline from one side of the room to the other.

Of course this requires more knowledge and more computation than we currently imagine something like a molecule can muster (though it’s not clear this is true when we start thinking about explicitly constructing molecule-scale computers). But the point is that the limit on the speed of the molecule is less a question of what’s physically possible, and more a question of what’s “engineerable”.

And so, I suspect, it is with space, and motion through space. Like our room full of air molecules, space in our theory of physics has a complex structure with many component parts that act in seemingly (but not actually) random ways. And in our theory the question of whether we can “move through space” faster than light can then be thought of as becoming a question of whether there can exist a “space demon” that can find ways to do computations fast enough to be able to successfully “hack space”.

But before we can discuss this further, we have to talk about just what space—and time—are in our models.

The Structure of Space and the Nature of Time

In standard physics, space (and the “spacetime continuum”) is just a background on which everything exists. Mathematically, it’s thought of as a manifold, in which every possible position can ultimately be labeled by 3 coordinate values. In our model, space is different. It’s not just a background; it’s got definite, intrinsic structure. And everything in the universe is ultimately defined by that structure; in fact, at some level, everything is just “made of space”.

We might think of something like water as being a continuous fluid. But we know that at a small scale it’s actually made of discrete molecules. And so it is, I suspect, with space. At a small enough scale, there are actually discrete “atoms of space”—and only on a large scale does space appear to be continuous.

In our model, the “atoms of space” correspond to abstract elements whose only property is their relation to other abstract elements. Mathematically the structure can be thought of as a hypergraph, where the atoms of space are nodes, which are related by hyperedges to other nodes. On a very small scale we might have for example:

Graph3D[Rule @@@ 
  ResourceFunction[
    "WolframModel"][{{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z, 
       w}}}, {{0, 0}, {0, 0}}, 5, "FinalState"], 
 GraphLayout -> "SpringElectricalEmbedding"]

On a slightly larger scale we might have:

Graph3D[Rule @@@ 
  ResourceFunction[
    "WolframModel"][{{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z, 
       w}}}, {{0, 0}, {0, 0}}, 12, "FinalState"]]

And in our actual universe we might have a hypergraph with perhaps 10^400 nodes.

How does a giant hypergraph behave like continuous space? In a case like this we can see that the nodes can be thought of as forming a 2D grid on a (curved) surface:

ResourceFunction[
  "WolframModel"][{{1, 2, 3}, {4, 2, 5}} -> {{6, 3, 1}, {3, 6, 4}, {1,
     2, 6}}, {{0, 0, 0}, {0, 0, 0}}, 1000, "FinalStatePlot"]

There’s nothing intrinsic about our model of space that determines the effective dimensionality it will have. These are all perfectly good possible (hyper)graphs, but on a large scale they behave like space in different numbers of dimensions:

Table[GridGraph[Table[10, n]], {n, 1, 3}]

It’s convenient to introduce the notion of a “geodesic ball”: the region in a (hyper)graph that one reaches by following at most r connections in the (hyper)graph. A key fact is that in a (hyper)graph that limits to d-dimensional space, the number of nodes in the geodesic ball grows like r^d. In a curved space (say, on the surface of a sphere) there’s a correction to r^d, proportional to the curvature of the space.
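
As a toy illustration of that r^d growth (my own example, using an ordinary grid graph rather than one of the hypergraphs generated above), one can just count the nodes in geodesic balls on a graph that limits to 3D space; the counts follow the leading r^3 behavior, with lower-order terms that here come from discreteness (on a curved space they would instead include the curvature correction):

g = GridGraph[{21, 21, 21}];
center = Ceiling[VertexCount[g]/2]; (* the middle vertex of the odd-sided grid, far from the boundary *)
counts = Table[Length[VertexOutComponent[g, center, r]], {r, 0, 8}]
counts == Table[(4 r^3 + 6 r^2 + 8 r + 3)/3, {r, 0, 8}] (* exact ball count on the grid: should be True *)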

The full story is quite long, but ultimately what happens is that—much as we can derive the properties of a fluid from the large-scale aggregate dynamics of lots of discrete molecules—so we can derive the properties of space from the large-scale aggregate dynamics of lots of nodes in our hypergraphs. And—excitingly enough—it seems that we get exactly Einstein’s equations from general relativity.

OK, so if space is a collection of elements laid out in a “spatial hypergraph”, what is time? Unlike in standard physics, it’s something initially very different. It’s a reflection of the process of computation by which the spatial hypergraph is progressively updated.

Let’s say our underlying rule for updating the hypergraph is:

RulePlot[ResourceFunction[
   "WolframModel"][{{x, y}, {x, z}} -> {{x, y}, {x, w}, {y, w}, {z, 
     w}}]]

Here’s a representation of the results of a sequence of updates according to this:

Flatten[With[{eo = 
    ResourceFunction[
      "WolframModel"][{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z,
         w}}, {{0, 0}, {0, 0}}, 4]}, 
  TakeList[eo["EventsStatesPlotsList", ImageSize -> Tiny], 
   eo["GenerationEventsCountList", 
    "IncludeBoundaryEvents" -> "Initial"]]]]

Going further we’ll get for example:

ResourceFunction[
   "WolframModel"][{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z, 
     w}}, {{1, 1}, {1, 1}}, 10]["StatesPlotsList", 
 "MaxImageSize" -> 100]

But there’s a crucial point here. The underlying rule just defines how a local piece of hypergraph that has a particular form should be updated. If there are several pieces of hypergraph that have that form, it doesn’t say anything about which of them should be updated first. But once we’ve done a particular update, that can affect subsequent updates—and in general there’s a whole “causal graph” of causal relationships between updates.

We can see what’s going on a little more easily if instead of using spatial hypergraphs we just use strings of characters. Here we’re updating a string by repeatedly applying the (“sorting”) rule BA → AB:

evo = (SeedRandom[2424];
   ResourceFunction[
     "SubstitutionSystemCausalEvolution"][{"BA" -> "AB"}, 
    "BBAAAABAABBABBBBBAAA", 10, {"Random", 4}]);
ResourceFunction["SubstitutionSystemCausalPlot"][evo, 
 EventLabels -> False, CellLabels -> True, CausalGraph -> False]

The yellow boxes indicate “updating events”, and we can join them by a causal graph that represents which event affects which other ones:

evo = (SeedRandom[2424];
   ResourceFunction[
     "SubstitutionSystemCausalEvolution"][{"BA" -> "AB"}, 
    "BBAAAABAABBABBBBBAAA", 10, {"Random", 4}]);
ResourceFunction["SubstitutionSystemCausalPlot"][evo, 
 EventLabels -> False, CellLabels -> False, CausalGraph -> True]

If we’re an observer inside this system, all we can directly tell is what events are occurring, and how they’re causally connected. But to set up a description of what’s going on, it’s convenient to be able to talk about certain events happening “at a certain time”, and others happening later. Or, in other words, we want to define some kind of “simultaneity surfaces”—or a “reference frame”.

Here are two choices for how to do this

CloudGet["https://wolfr.am/KVkTxvC5"]; \
CloudGet["https://wolfr.am/KVl97Tf4"]; 
Show[regularCausalGraphPlot[10, {1, 0}, {#, 0.0}, lorentz[0]], 
   ImageSize -> 330] & /@ {0., .3}

where the second one can be reinterpreted as:

CloudGet["https://wolfr.am/KVkTxvC5"]; \
CloudGet["https://wolfr.am/KVl97Tf4"]; regularCausalGraphPlot[10, {1, 
  0}, {0.3, 0.0}, lorentz[0.3]]

And, yes, this can be thought of as corresponding to a reference frame with a different speed, just like in standard special relativity. But now there’s a crucial point. The particular rule we’ve used here is an example of one with the property of causal invariance—which means that it doesn’t matter “at what time” we do a particular update; we’ll always get the same causal graph. And this is why—even though space and time start out so differently in our models—we end up being able to derive the fact that they follow special relativity.
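
One simple reflection of causal invariance for this sorting rule (a check of my own): however the updates are ordered, every branch of the multiway evolution ends at the same fully sorted string, so the multiway states graph should have a single terminal state:

sg = ResourceFunction["MultiwaySystem"][{"BA" -> "AB"}, "BBABAA", 12, "StatesGraph"];
Select[VertexList[sg], VertexOutDegree[sg, #] == 0 &] (* expect a single sink: {"AAABBB"} *)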

Given a reference frame, we can always “reconstruct” a view of the behavior of the system from the causal graph. In the cases shown here we’d get:

CloudGet["https://wolfr.am/LbaDFVSn"]; GraphicsRow[
 Show[ResourceFunction["SubstitutionSystemCausalPlot"][
     boostedEvolution[
      ResourceFunction[
        "SubstitutionSystemCausalEvolution"][{"BA" -> "AB"}, 
       StringRepeat["BA", 10], 5], #], EventLabels -> False, 
     CellLabels -> True, CausalGraph -> False], 
    ImageSize -> {250, Automatic}] & /@ {0., 0.3}, Alignment -> Top]

And the fact that the system seems to “take longer to do its thing” in the second reference frame is precisely a reflection of relativistic time dilation in that frame.

Just as with strings, we can also draw causal graphs to represent the causal relationships between updating events in spatial hypergraphs. Here’s an example of what we get for the rule shown above:

ResourceFunction[
   "WolframModel"][{{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z, 
      w}}}, {{0, 0}, {0, 0}}, 7]["LayeredCausalGraph", 
 AspectRatio -> 1/2]

And once again we can set up reference frames to define what events we want to consider “simultaneous”. The only fundamental constraint on our reference frames is that in each slice of the “foliation” that defines the reference frame there can never be two events in which one follows from the other. Or, in the language of relativity, no events in a given slice can be timelike separated; instead, all of them must be spacelike separated, so that the slice defines a purely spacelike hypersurface.
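
That constraint is easy to state operationally. Here is a minimal sketch (spacelikeSliceQ is a hypothetical helper name of mine, not part of the tools used above): a candidate set of events is an acceptable slice of a causal graph cg only if no event in the set lies in the out-component of, i.e. causally follows, another:

spacelikeSliceQ[cg_Graph, events_List] :=
 NoneTrue[events,
  Function[e, IntersectingQ[VertexOutComponent[cg, e], Complement[events, {e}]]]]

Each slice of a proposed foliation can then be tested against the causal graph with this predicate.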

In drawing a causal graph like the one above, we’re picking a particular collection of relative orderings of different possible updating events in the spatial hypergraph. But why one choice and not another? A key feature of our models is that we can actually think of all possible orderings as being done; or, said differently, we can construct a whole multiway graph of possibilities. Here’s what the multiway graph looks like for the string system above:

LayeredGraphPlot[
 ResourceFunction["MultiwaySystem"][{"BA" -> "AB"}, "BBABBAA", 8, 
  "StatesGraph"], AspectRatio -> 1]

Each node in this multiway graph represents a complete state of our system (in this case, a string), and a path through the multiway system corresponds to a possible history of the system, with a particular corresponding causal graph.

But now there’s an important connection with physics: the fact that we get a multiway graph makes quantum mechanics inevitable in our models. And it turns out that just like we can use reference frames to make sense of the evolution of our systems in space and time, so also we can use “quantum observation frames” to make sense of the time evolution of multiway graphs. But now the analog of space is what we call “branchial space”: in effect a space of possible quantum states, with the connections between states defined by their relationship on branches in the multiway system.

And much as we can define a spatial hypergraph representing relationships between “points in space”, so we can define a branchial graph that represents relationships (or “entanglements”) between quantum states, in branchial space:

LayeredGraphPlot[
 Graph[ResourceFunction["MultiwaySystem"][{"A" -> "AB", "B" -> "A"}, 
   "A", 5, "EvolutionGraph"]], 
 Epilog -> {ResourceFunction["WolframPhysicsProjectStyleData"][
    "BranchialGraph", "EdgeStyle"], AbsoluteThickness[1.5], 
   Table[Line[{{-10, i}, {9, i}}], {i, .4, 5, 1.05}]}]

ResourceFunction["MultiwaySystem"][{"A" -> "AB", "B" -> "A"}, "A", 5, 
 "BranchialGraph"]

I won’t go into the details here, but one of the beautiful things in our models is that just as we can derive the Einstein equations as a large-scale limiting description of the behavior of our spatial hypergraphs, so also we can figure out the large-scale limiting behavior for multiway systems—and it seems that we get the Feynman path integral for quantum mechanics!

By the way, since we’re talking about faster than light and motion in space, it’s worth mentioning that there’s also a notion of motion in branchial space. And just like we have the speed of light c that defines some kind of limit on how fast we can explore physical space, so also we have a maximal entanglement rate ζ that defines a limit on how fast we can explore (and thus “entangle”) different quantum states in branchial space. And just as we can ask about “faster than c”, we can also talk about “faster than ζ”. But before we get to that, we’ve got a lot of other things to discuss.

Can We Make Tunnels in Space?

Traditional general relativity describes space as a continuous manifold that evolves according to certain partial differential equations. But our models talk about what’s underneath that, and what space actually seems to be made of. And while in appropriate limits they reproduce what general relativity says, they also imply all sorts of new and different phenomena.

Imagine that the hypergraph that represents space has the form of a simple 2D grid:

GridGraph[{15, 15}, 
 EdgeStyle -> 
  ResourceFunction["WolframPhysicsProjectStyleData"]["SpatialGraph", 
   "EdgeLineStyle"], 
 VertexStyle -> 
  ResourceFunction["WolframPhysicsProjectStyleData"]["SpatialGraph", 
   "VertexStyle"]]

In the limit this will be like 2D Euclidean space. But now suppose we add some extra “long-range threads” to the graph:

SeedRandom[243234]; With[{g = GridGraph[{20, 20}]}, 
 EdgeAdd[g, 
  UndirectedEdge @@@ 
   Select[Table[RandomInteger[{1, VertexCount[g]}, 2], 10], 
    GraphDistance[g, #[[1]], #[[2]]] > 8 &], 
  EdgeStyle -> 
   ResourceFunction["WolframPhysicsProjectStyleData"]["SpatialGraph", 
    "EdgeLineStyle"], 
  VertexStyle -> 
   ResourceFunction["WolframPhysicsProjectStyleData"]["SpatialGraph", 
    "VertexStyle"]]]

Here’s a different rendering of the same graph:

Graph3D[EdgeList[%], 
 EdgeStyle -> 
  ResourceFunction["WolframPhysicsProjectStyleData"]["SpatialGraph3D",
    "EdgeLineStyle"], 
 VertexStyle -> 
  ResourceFunction["WolframPhysicsProjectStyleData"]["SpatialGraph3D",
    "VertexStyle"]]

Now let’s ask about distances on this graph. Some nodes on the graph will have distances that are just like what one would expect in ordinary 2D space. But some will be “anomalously close”, because one will be able to get from one to another not by going “all the way through 2D space” but by taking a shortcut along one of the long-range threads.

Let’s say that we’re able to move around so that at every elementary interval of time we traverse a single connection in the graph. Then if our view of “what space is like” is based on the general structure of the graph (ignoring the long-range threads) we’ll come to some conclusion about how far we can go in a certain time—and what the maximum speed is at which we can “go through space”. But then what happens if we encounter one of the long-range threads? If we go through it we’ll be able to get from one “place in space” to another much faster than would be implied by the maximum speed we deduced from looking at “ordinary space”.
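
To see the effect quantitatively, here is a small sketch of my own that rebuilds the same seeded grid-plus-threads graph as above (under the names gPlain and gThreads) and compares graph distances between the two opposite corner vertices; something that traverses one connection per elementary time arrives correspondingly sooner whenever a thread happens to help:

SeedRandom[243234];
gPlain = GridGraph[{20, 20}];
gThreads = EdgeAdd[gPlain,
   UndirectedEdge @@@
    Select[Table[RandomInteger[{1, VertexCount[gPlain]}, 2], 10],
     GraphDistance[gPlain, #[[1]], #[[2]]] > 8 &]];
{GraphDistance[gPlain, 1, 400], GraphDistance[gThreads, 1, 400]} (* the second can be smaller *)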

In a graph, there are many ways to end up having “long-range threads”—and we can think of these as defining various kinds of “space tunnels” that provide ways to get around in space evading usual speed-of-light constraints. We can imagine both persistent space tunnels that could be repeatedly used, and spontaneous or “just-in-time” ones that exist only transiently. But—needless to say—there is all sorts of subtlety around the notion of space tunnels. If a tunnel is a pattern in a graph, what actually happens when something “goes through it”? And if a tunnel didn’t always exist, how does it get formed?

Space tunnels are a fairly general concept that can be defined on graphs or hypergraphs. But there’s at least a special case of them that can be defined even in standard general relativity: wormholes. General relativity describes space as a continuum—a manifold—in which there’s no way to have “just a few long-range threads”. The best one can do is to imagine that there’s a kind of “handle in space”, that provides an alternative path from one part of space to another:

Wormhole diagram

How would such a non-simply-connected manifold form? Perhaps it’s a bit like the gastrulation that happens in embryonic development. But mathematically one can’t continuously change the topology of something continuous; there has to at least be some kind of singularity. In general relativity it’s been tricky to see how this could work. But of course in our models there’s not the same kind of constraint, because one doesn’t have to “rearrange a whole continuum”; one can do something more like “growing a handle one thread at a time”.

Here’s an example where one can see something a bit like this happening. We’re using the rule:

RulePlot[ResourceFunction[
   "WolframModel"][{{1, 2, 3}, {1, 4, 5}} -> {{3, 3, 6}, {6, 6, 
     5}, {4, 5, 6}}]]

And what it does is effectively to “knit handles” that provide “shortcuts” between “separated” points in patches of what limits to 2D Euclidean space:

Labeled[ResourceFunction[
     "WolframModel"][{{1, 2, 3}, {1, 4, 5}} -> {{3, 3, 6}, {6, 6, 
       5}, {4, 5, 6}}, {{0, 0, 0}, {0, 0, 0}}, #, "FinalStatePlot"], 
   Text[#]] & /@ {0, 5, 10, 50, 100, 500, 1000}

In our models—free from the constraints of continuity—space can have all sorts of exotic forms. First of all, there’s no constraint that space has to have an integer number of dimensions (say 3). Dimension is just defined by the asymptotic growth rates of geodesic balls, and can have any value. Here, for example, is a case that approximates 2.3-dimensional space:

ResourceFunction[
  "WolframModel"][{{{1, 2, 3}, {2, 4, 5}} -> {{6, 7, 2}, {5, 7, 
     8}, {4, 2, 8}, {9, 3, 5}}}, {{0, 0, 0}, {0, 0, 
   0}}, 20, "FinalStatePlot"]

It’s worth noting that although it’s perfectly possible to define distance—and, in the limit, lots of other geometric concepts—on a graph like this, one doesn’t get to say that nodes are at positions defined by particular sets of coordinates, as one would in integer-dimensional space.

With a manifold, one basically has to pick a certain (integer) dimension, then stick to it. In our models, dimension can effectively become a dynamical variable, that can change with position (and time). So in our models one possible form of “space tunnel” is a region of space with higher or lower dimension. (Our derivation of general relativity is based on assuming that space has a limiting finite dimension, then asking what curvature and other properties it must have; the derivation is in a sense blind to different-dimensional space tunnels.)

It’s worth noting that both lower- and higher-dimensional space tunnels can be interesting in terms of “getting places quickly”. Lower-dimensional space tunnels (such as bigger versions of the 1D long-range threads in the 2D grid above) potentially connect some specific sparse set of “distant” points. Higher-dimensional space tunnels (which in the infinite-dimensional limit can be trees) are more like “switching stations” that make many points on their boundaries closer.
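
For the infinite-dimensional (tree) case just mentioned, here is a quick illustration of my own: geodesic balls around the root of a 3-ary tree grow exponentially, rather than like a fixed power r^d:

tree = CompleteKaryTree[7, 3, DirectedEdges -> False];
root = First[Select[VertexList[tree], VertexDegree[tree, #] == 3 &]]; (* only the root has degree 3 *)
Table[Length[VertexOutComponent[tree, root, r]], {r, 0, 5}] (* 1, 4, 13, 40, 121, 364: roughly 3^r *)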

Negative Mass, Wormholes, etc.

Let’s say we’ve somehow managed to get a space tunnel. What will happen to it? Traditional general relativity suggests that it’s pretty hard to maintain a wormhole under the evolution of space implied by Einstein’s equations. A wormhole is in effect defined by geodesic paths coming together when they enter the wormhole and diverging again when they exit. In general relativity the presence of mass makes geodesics converge; that’s the “attraction due to gravity”. But what could make the geodesics diverge again? Basically one needs some kind of gravitational repulsion. And the only obvious way to get this in general relativity is to introduce negative mass.

Normally mass is assumed to be a positive quantity. But, for example, dark energy effectively has to have negative mass. And actually there are several mechanisms in traditional physics that effectively lead to negative mass. All of them revolve around the question of where one sets the zero to be. Normally one sets things up so that one can say that “the vacuum” has zero energy (and mass). But actually—even in traditional physics—there’s lots that’s supposed to be going on in “the vacuum”. For example, there’s supposed to be a constant intensity of the Higgs field, that interacts with all massive particles and has the effect of giving them mass. And there are supposed to be vacuum fluctuations associated with all quantum fields, each leading (at least in standard quantum field theory) to an infinite energy density.

But if these things exist everywhere in the universe, then (at least for most purposes) we can just set our zero of energy to include them. So then if there’s anything that can reduce their effects, we’ll effectively see negative mass. And one example of where this can in some sense happen is the Casimir effect. Imagine that instead of having an infinite vacuum, we just have vacuum inside a box. Having the box cuts out some of the possible vacuum fluctuations of quantum fields (basically modes with wavelengths larger than the size of the box)—and so in some sense leads to negative energy density inside the box (at least relative to outside). And, yes, the effect is observable with metal boxes, etc. But what becomes of the Casimir effects in a purely spacetime or gravitational setting isn’t clear.

(This leads to a personal anecdote. Back in 1981 I wrote two papers about the Casimir effect with Jan Ambjørn, titled Properties of the Vacuum: 1. Mechanical and …: 2. Electrodynamic. We had planned a “…: 3. Gravitational” but never wrote it, and now I’m really curious what the results would have been. By the way, our paper #1 computed Casimir effects for boxes of different shapes, and had the surprising implication that by changing shapes in a cycle it would in principle be possible to continuously “mine” energy from the vacuum. This was later suggested as a method for interstellar propulsion, but to make it work requires an infinitely impermeable box, which doesn’t seem physically constructible, except maybe using gravitational effects and event horizons… but we never wrote paper #3 to figure that out….)

In traditional physics there’s been a conflict between what the vacuum is like according to quantum field theory (with infinite energy density from vacuum fluctuations, etc.) and what the vacuum is assumed to be like in general relativity (effectively zero energy density). In our models there isn’t the same kind of conflict, but “the vacuum” is something with even more structure.

In particular, in our models, space isn’t some separate thing that exists; it is just a consequence of the large-scale structure of the spatial hypergraph. And any matter, particles, quantum fields, etc. that exist “in space” must also be features of this same hypergraph. Things like vacuum fluctuations aren’t something that happens in space; they are an integral part of the formation of space itself.

By the way, it’s important to note that in our models the hypergraph isn’t something static—and it’s in the end knitted together only through actual update events that occur. And the energy of some region of the hypergraph is directly related to the amount of updating activity in that region (or, more accurately, to the flux of causal edges through that portion of spacelike hypersurfaces).

So what does this mean for negative mass in our models? Well, if there was a region of the hypergraph where there was somehow less activity, it would have negative energy relative to the zero defined by the “normal vacuum”. It’s tempting to call whatever might reduce activity in the hypergraph a “vacuum cleaner”. And, no, we don’t know if vacuum cleaners can exist. But if they do, then there’s a fairly direct path to seeing how wormholes can be maintained (basically because geodesics almost by definition diverge wherever a vacuum cleaner has operated).

By the way, while a large-scale wormhole-like structure presumably requires negative mass, vacuum cleaners, etc., other space tunnel structures may not have the same requirements. By their very construction, they tend to operate outside the regime described by general relativity and Einstein’s equations. So things like the standard singularity theorems of general relativity can’t be expected to apply. And instead there doesn’t seem to be any choice but to analyze them directly in the context of our models.

One might think: given a particular space tunnel configuration, why not just run a simulation of it, and see what happens? The problem is computational irreducibility. Yes, the simulation might show that the configuration is stable for a million or a billion steps. But that might still be far, far away from human-level timescales. And there may be no way to determine what the outcome for a given number of steps will be except in effect by doing that irreducible amount of computational work—so that if, for example, we want to find out the limiting result after an infinite time, that’ll in general require an infinite amount of computational work, and thus effectively be undecidable.

Or, put another way, even if we can successfully “engineer” a space tunnel, there may be no systematic way to guarantee that it’ll “stay up”; it may require an infinite sequence of “engineering tweaks” to keep it going, and eventually it may not be possible to keep it going. But before that, of course, we have to figure out how to construct a space tunnel in the first place…

It Doesn’t Mean Time Travel

In ordinary general relativity one tends to think of everything in terms of spacetime. So if a wormhole connects two different places, one assumes they are places in spacetime. Or, in other words, a wormhole can allow shortcuts between both different parts of space, and different parts of time. But with a shortcut between different parts of time one can potentially have time travel.

More specifically, one can have a situation where the future of something affects its past: in other words there is a causal connection from the future to the past. At some level this isn’t particularly strange. In any system that behaves in a perfectly periodic way one can think of the future as leading to a repetition of the past. But of course it’s not a future that one can freely determine; it’s just a future that’s completely determined by the periodic behavior.

How all this works is rather complicated to see in the standard mathematical treatment of general relativity, although in the end what presumably happens is that in the presence of wormholes the only consistent solutions to the equations are ones for which past and future are locked together with something like purely periodic behavior.

Still, in traditional physics there’s a certain sense that “time is just a coordinate”, so there’s the potential for “motion in time” just like we have motion in space. In our models, however, things work quite differently. Because now space and time are not the same kind of thing at all. Space is defined by the structure of the spatial hypergraph. But time is defined by the computational process of applying updates. And that computational process undoubtedly shows computational irreducibility.

So while we may go backwards and forwards in space, exploring different parts of the spatial hypergraph, the progress of time is associated with the progressive performance of irreducible computation by the universe. One can compute what will happen (or, with certain restrictions, what has happened), but one can only do so effectively by following the actual steps of it happening; one can’t somehow separately “move through it” to see what happens or has happened.

But in our models the whole causality of events is completely tracked, and is represented by the causal graph. And in fact each connection in the causal graph can be thought of as a representation of the very smallest unit of progression in time.

So now let’s look at a causal graph again:

ResourceFunction[
  "WolframModel"][{{x, y}, {z, y}} -> {{x, z}, {y, z}, {w, z}}, {{0, 
   0}, {0, 0}}, 12, "LayeredCausalGraph"]

There’s a very important feature of this graph: it contains no cycles. In other words, there’s a definite “flow of causality”. There’s a partial ordering of what events can affect what other events, and there’s never any looping back, and having an event affect itself.
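
This is something that can be checked mechanically; for the rule above (a one-line check of my own):

AcyclicGraphQ[Graph[
  ResourceFunction["WolframModel"][{{x, y}, {z, y}} -> {{x, z}, {y, z}, {w, z}},
   {{0, 0}, {0, 0}}, 12, "LayeredCausalGraph"]]] (* should be True: no event can causally affect itself *)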

There are different ways we can define “simultaneity surfaces”, corresponding to different foliations of this graph:

Show[#, ImageSize -> 400] & /@ {CloudGet["https://wolfr.am/KXgcRNRJ"];
   evolution = 
   ResourceFunction[
     "WolframModel"][{{x, y}, {z, y}} -> {{x, z}, {y, z}, {w, 
       z}}, {{0, 0}, {0, 0}}, 12];
  gg = Graph[evolution["LayeredCausalGraph"]]; 
  GraphPlot[gg, 
   Epilog -> {Directive[Red], 
     straightFoliationLines[{1/2, 0}, {0, 0}, (# &), {0, 1}]}], 
  CloudGet["https://wolfr.am/KXgcRNRJ"];(*drawFoliation*)
  gg = Graph[
    ResourceFunction[
      "WolframModel"][{{x, y}, {z, y}} -> {{x, z}, {y, z}, {w, 
        z}}, {{0, 0}, {0, 0}}, 12, "LayeredCausalGraph"]];
  semiRandomWMFoliation = {{1}, {1, 2, 4, 6, 9, 3}, {1, 2, 4, 6, 9, 3,
      13, 19, 12, 26, 36, 5, 7, 10, 51, 14, 69, 18, 8, 25, 11, 34, 20,
      35, 50, 17}, {1, 2, 4, 6, 9, 3, 13, 19, 12, 26, 36, 5, 7, 10, 
     51, 14, 69, 18, 8, 25, 11, 34, 20, 35, 50, 17, 24, 68, 47, 15, 
     92, 27, 48, 37, 21, 28, 42, 22, 30, 16, 32, 23, 33, 46, 64, 90, 
     94, 65, 88, 49, 67, 91, 66, 89}};
  Quiet[drawFoliation[gg, semiRandomWMFoliation, Directive[Red]], 
   FindRoot::cvmit]}

But there’s always a way to do it so that all events in a given slice are “causally before” events in subsequent slices. And indeed whenever the underlying rule has the property of causal invariance, it’s inevitable that things have to work this way.

But if we break causal invariance, other things can happen. Here’s an example of the multiway system for a (string) rule that doesn’t have causal invariance, and in which the same state can repeatedly be visited:

Graph[ResourceFunction["MultiwaySystem"][{"AB" -> "BAB", "BA" -> "A"},
   "ABA", 5, "StatesGraph"], 
 GraphLayout -> {"LayeredDigraphEmbedding", "RootVertex" -> "ABA"}]

If we look at the corresponding (multiway) causal graph, it contains a loop:

LayeredGraphPlot[
 ResourceFunction["MultiwaySystem"][{"AB" -> "BAB", "BA" -> "A"}, 
  "ABA", 4, "CausalGraphStructure"]]

In the language of general relativity, this loop represents a closed timelike curve, where the future can affect the past. And if we try to construct a foliation in which “time systematically moves forward” we won’t be able to do it.
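
The same mechanical check as before (again my own) makes the difference explicit:

AcyclicGraphQ[Graph[
  ResourceFunction["MultiwaySystem"][{"AB" -> "BAB", "BA" -> "A"}, "ABA", 4,
   "CausalGraphStructure"]]] (* should be False, reflecting the loop shown above *)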

But the presence of these kinds of loops is a different phenomenon from the existence of space tunnels. In a space tunnel there’s connectivity in the spatial hypergraph that makes the (graph) distance between two points be shorter than you’d expect from the overall structure of the hypergraph. But it’s just connecting different places in space. An event that happens at one end of the space tunnel can affect events associated with distant places in space, but (assuming causal invariance, etc.) those events have to be “subsequent events” with respect to the partial ordering defined by the causal graph.

Needless to say, there’s all sorts of subtlety about the events involved in maintaining the space tunnel, the definition of distance being “shorter than you’d expect”, etc. But the main point here is that “jumping” between distant places in space doesn’t in any way require or imply “traveling backwards in time”. Yes, if you think about flat, continuum space and you imagine a tachyon going faster than light, then the standard equations of special relativity imply that it must be going backwards in time. But as soon as space itself can have features like space tunnels, nothing like this needs to be going on. Time—and the computational process that corresponds to it—can still progress even as effects propagate, say through space tunnels, faster than light to places that seem distant in space.

Causal Cones and Light Cones

OK, now we’re ready to get to the meat of the question of faster-than-light effects in our models. Let’s say some event occurs. This event can affect a cone of subsequent events in the causal graph. When the causal graph is a simple grid, it’s all quite straightforward:

CloudGet
CloudGet["https://wolfr.am/LcADnk1u"]; upTriangleGraph = 
 diamondCausalGraphPlot[11, {0, 0}, {}, # &, "Up", 
  ImageSize -> 450]; HighlightGraph[upTriangleGraph, 
 Style[Subgraph[upTriangleGraph, 
   VertexOutComponent[upTriangleGraph, 8]], Red, Thick]]

But in a more realistic causal graph the story is more complicated:

With
With[{g = 
   ResourceFunction[
      "WolframModel"][{{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, 
         w}, {z, w}}}, {{0, 0}, {0, 0}}, 8]["LayeredCausalGraph", 
    AspectRatio -> 1/2]}, 
 HighlightGraph[g, 
  Style[Subgraph[g, VertexOutComponent[g, 10]], Red, Thick]]]

The “causal cone” of affected events is very well defined. But now the question is: how does this relate to what happens in space and time?

When one thinks about the propagation of effects in space and time, one typically thinks of light cones. Given a source of light somewhere in space and time, what points in space and time can it affect?

And one might assume that the causal cone is exactly the light cone. But things are more subtle than that. The light cone is normally defined by the positions in space and time that it reaches. And that makes perfect sense if we’re dealing with a manifold representing continuous spacetime, on which we can, for example, set up numerical coordinates. But in our models there’s not intrinsically anything like that. Yes, we can say what element in a hypergraph is affected after some sequence of events. But there’s no a priori way to say where that element is in space. That’s only defined in some limit, relative to everything else in the whole hypergraph.

And this is the nub of the issue of faster-than-light effects in our models: causal (and, in a sense, temporal) relationships are immediately well defined. But spatial ones are not. One event can affect another through a single connection in the causal graph, but those events might be occurring at different ends of a space tunnel that traverses what we consider to be a large distance in space.

There are several related issues to consider, but they center around the question of what space really is in our models. We started off by talking about space corresponding to a collection of elements and relations, represented by a hypergraph. But the hypergraph is continually being updated. So the first question is: can we define an instantaneous snapshot of space?

Well, that’s what our reference frames, and foliations, and simultaneity surfaces, and so on, are about. They specify which particular collection of events we should consider to have happened at the moment when we “sample the structure of space”. There is arbitrariness to this choice, which corresponds directly to the arbitrariness that we’re used to in the selection of reference frames in relativity.

But can we choose any collection of events consistent with the partial ordering defined by the causal graph (i.e. where no events associated with a “single time slice” follow each other in the causal graph, and thus affect each other)? This is where things begin to get complicated. Let’s imagine we pick a foliation like this, or something even wilder:

CloudGet
CloudGet["https://wolfr.am/LcADnk1u"];
upTriangleGraph = 
 diamondCausalGraphPlot[9, {0, 0}, {}, # &, "Up", 
  ImageSize -> 450]; Show[
 drawFoliation[
  Graph[upTriangleGraph, VertexLabelStyle -> Directive[8, Bold], 
   VertexSize -> .45], {{1}, {1, 3, 6, 10, 2, 4, 5}, {1, 3, 6, 10, 2, 
    4, 5, 8, 9, 15, 13, 14, 19, 20, 26, 7, 12}, {1, 3, 6, 10, 2, 4, 5,
     8, 9, 15, 13, 14, 19, 20, 26, 7, 12, 11, 17, 21, 18, 25, 24, 27, 
    32, 34, 28, 33, 16, 23, 31, 35, 42}}, 
  Directive[AbsoluteThickness[2], Red]], ImageSize -> 550]

We may know what the spatial hypergraph “typically” looks like. But perhaps with a weird enough foliation, it could be very different.

But for now, let’s ignore this (though it will be important later). And let’s just imagine we pick some “reasonable” foliation. Then we want to ask what the “projection” of the causal cone onto the instantaneous structure of space is. Or, in other words, what elements in space are affected by a particular event?

Let’s look at a specific example. Let’s consider the same rule and same causal cone as above, with the “flat” (“cosmological rest frame”) foliation:

CloudGet
CloudGet["https://wolfr.am/KXgcRNRJ"];
With[{g = 
   ResourceFunction[
      "WolframModel"][{{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, 
         w}, {z, w}}}, {{0, 0}, {0, 0}}, 8]["LayeredCausalGraph", 
    AspectRatio -> 1/2, 
    Epilog -> {Directive[Red], 
      straightFoliationLines[{0.22, 0}, {0, 0}, (# &), {0, -2}]}]}, 
 HighlightGraph[g, 
  Style[Subgraph[g, VertexOutComponent[g, 10]], Red, Thick]]]

Here are spatial hypergraphs associated with successive slices in this foliation, with the parts contained in the causal cone highlighted:

EffectiveSpatialBall[wmo_, expr0_] := 
 Module[{t = wmo["CompleteGenerationsCount"], fexprs}, 
  fexprs = wmo["StateEdgeIndicesAfterEvent", -1]; 
  Intersection[
   Cases[VertexOutComponent[wmo["ExpressionsEventsGraph"], {expr0}], {"Expression", n_} :> n], fexprs]]
EffectiveSpatialAtomBall[wmo_, expr0_] := 
 Module[{t = wmo["CompleteGenerationsCount"], fexprs}, 
  fexprs = wmo["StateEdgeIndicesAfterEvent", -1]; 
  wmo["AllExpressions"][[
   Intersection[
    Cases[VertexOutComponent[wmo["ExpressionsEventsGraph"], {expr0}], {"Expression", n_} :> n], fexprs]]]]
EffectiveSpatialBallPlot[wmo_, expr0_] := 
 With[{bb = EffectiveSpatialAtomBall[wmo, expr0]}, 
  wmo["FinalStatePlot", GraphHighlight -> Join[bb, Union[Catenate[bb]]]]]
Table[If[t < 4, 
   ResourceFunction["WolframModel"][{{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z, w}}}, 
    {{0, 0}, {0, 0}}, t, "FinalStatePlot"], 
   EffectiveSpatialBallPlot[
    ResourceFunction["WolframModel"][{{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z, w}}}, 
     {{0, 0}, {0, 0}}, t], {"Event", 10}]], {t, 9}]

For the first 3 slices the event that begins the causal cone hasn’t happened yet. But after that we start seeing the effect of the event, gradually spreading across successive spatial hypergraphs.

Yes, there are more subtleties ahead. But basically what we’re seeing here is the expansion of the light cone with time. So now we’ve got to ask the critical question: how fast does the edge of this light cone actually expand? How much space does it traverse at each unit in time? In other words, what is the effective speed of light here?

It is already clear from the pictures above that this is a somewhat subtle question. But let’s begin with an even more basic issue. The speed of light is something we measure in units like meters per second. But what we can potentially get from our model is instead a speed in spatial hypergraph edges per causal edge. We can say that each causal edge corresponds to a certain elementary time elapsing. And as soon as we quote the elementary time in seconds—say 10^-100 s—we’re basically defining the second. And similarly, we can say that each spatial hypergraph edge corresponds to a distance of a certain elementary length. But now imagine that in t elementary times the light cone in the hypergraph has advanced by α t spatial hypergraph edges, or α t elementary lengths. What is α t in meters? It has to be α c t, where c is the speed of light, because in effect this defines the speed of light.

In other words, it’s at some level a tautology to say that the light cone in the spatial hypergraph advances at the speed of light—because this is the definition of the speed of light. But it’s more complicated than that. In continuum space there’s nothing inconsistent about saying that the speed of light is the same in every direction, everywhere. But when we’re projecting our causal cone onto the spatial hypergraph we can’t really say that anymore. But to know what happens we have to figure out more about how to characterize space.

In our models it’s clear what causal effects there are, and even how they spread. But what’s far from clear is where in detail these effects show up in what we call space. We know what the causal cones are like; but we still have to figure out how they map into positions in space. And from that we can try to work out whether—relative to the way we set up space—there can be effects that go faster than light.

How to Measure Distance

In a sense speeds are complicated to characterize in our models because positions and times are hard to define. But it’s useful to consider for a moment the much simpler case of cellular automata, where from the outset we just set up a grid in space and time. Given some cellular automaton, say with a random initial condition, we can ask how fast an effect can propagate. For example, if we change one cell in the initial condition, by how many cells per step can the effect of this expand? Here are a couple of typical results:

With
With[{u = RandomInteger[1, 160]}, SeedRandom[24245];
   ArrayPlot[
    Sum[(2 + (-1)^i) CellularAutomaton[#, ReplacePart[u, 80 -> i], 
       80], {i, 0, 1}], 
    ColorRules -> {0 -> White, 4 -> Black, 1 -> Red, 3 -> Red}, 
    ImageSize -> 330]] & /@ {22, 30}

The actual speed of expansion can vary, but in both cases the absolute maximum speed is 1 cell/step. And this is very straightforward to understand from the underlying rules for the cellular automata:

RulePlot
RulePlot[CellularAutomaton[#], ImageSize -> 300] & /@ {22, 30}

In both cases, the rule for each step “reaches” one cell away, so 1 cell/step is the maximum rate at which effects can propagate.
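To make the 1 cell/step bound concrete, here is a small sketch of my own (not from the original text): flip one cell of a random initial condition, and track the width of the region of cells that differ between the two evolutions. Since the difference can advance by at most one cell per step in each direction, the width should grow by no more than 2 cells per step:

SeedRandom[1234];
diffWidth = With[{u = RandomInteger[1, 160]}, 
   Table[Module[{a, b, d}, 
     a = Last[CellularAutomaton[30, u, t]]; (* row after t steps *)
     b = Last[CellularAutomaton[30, ReplacePart[u, 80 -> 1 - u[[80]]], t]]; (* same evolution, with one initial cell flipped *)
     d = Flatten[Position[Unitize[a - b], 1]]; (* positions where the two rows differ *)
     If[d === {}, 0, Max[d] - Min[d] + 1]], {t, 0, 40}]];
ListLinePlot[diffWidth, AxesLabel -> {"step", "width of differences"}]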

There’s something somewhat analogous that happens in our models. Consider a rule like:

RulePlot
RulePlot[ResourceFunction[
   "WolframModel"][{{{1, 2}, {2, 3}} -> {{2, 4}, {2, 4}, {4, 1}, {4, 
      3}}}]]

A bit like in the cellular automaton, the rule only “reaches” a limited number of connections away. And what this means is that in each updating event only elements within a certain range of connections can “have an effect” on each other. But inevitably this is only a very local statement. Because while the structure of the rule implies that effects can only spread a certain distance in a single update, there is nothing that says what the “relative geometry” of successive updates will be, or what connection might be connected to what. Unlike in a cellular automaton, where the global spatial structure is predefined, in our models there is no immediate global consequence to the fact that the rules are fundamentally local with respect to the hypergraph.
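As a rough illustration of this limited “reach” (a sketch of my own, using a hypothetical helper ruleReach rather than anything built in), one can measure the largest undirected graph distance between elements that appear together on the left-hand side of a rule. That is the farthest apart two elements can be and still participate in a single updating event:

ruleReach[lhs_List] := Module[{g}, 
  g = Graph[UndirectedEdge @@@ Catenate[Partition[#, 2, 1] & /@ lhs]]; 
  Max[GraphDistance[g, ##] & @@@ Subsets[VertexList[g], {2}]]]
ruleReach[{{1, 2}, {2, 3}}]

For the left-hand side of the rule shown above the answer is 2: within one update, only elements at most two connections apart can directly affect each other.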

It should be mentioned that the rules don’t strictly even have to be local. If the left-hand side is disconnected, as in

RulePlot
RulePlot[ResourceFunction["WolframModel"][{{x}, {y}} -> {{x, y}}]]

then in a sense any individual update can pick up elements from anywhere in the spatial hypergraph—even disconnected parts. And as a result, something anywhere in the universe can immediately affect something anywhere else. But with a rule like this, there doesn’t seem to be a way to build up anything with the kind of locality properties that characterize what we think of as space.

OK, but given a spatial hypergraph, how do we figure out “how far” it is from one node to another? That’s a subtle question. It’s easy to figure out the graph distance: just find the geodesic path from one node to another and see how many connections it involves. But this is just an abstract distance on the hypergraph: now the question is how it relates to a distance we might measure “physically”, say with something like a ruler.

It’s a tricky thing: we have a hypergraph that is supposed to represent everything in the universe. And now we want something—presumably itself part of the hypergraph—to measure a distance in the hypergraph. In traditional treatments of relativity it’s common to think of measuring distances by looking at arrival times of light signals or photons. But this implicitly assumes that there’s an underlying structure of space, and photons are simply being added in to probe it. In our models, however, the photons have to themselves be part of the spatial hypergraph: they’re in a sense just “pieces of space”, albeit presumably with appropriate generalized topological properties.

Or, put another way: when we directly study the spatial hypergraph, we’re operating far below the level of things like photons. But if we’re going to compare what we see in spatial hypergraphs with actual distance measurements in physics we’re going to have to find some way to bridge the gap. Or, in other words, we need to find some adequate proxy for physical distance that we can compute directly on the spatial hypergraph.

A simple possibility that we’ve used a lot in practice in exploring our models is just graph distance, though with one wrinkle. The wrinkle is as follows: our hypergraphs represent collections of relations between elements, and we assume that these relations are ordered—so that the hyperedges in our hypergraphs are directed hyperedges. But in computing “physical-like distances” we ignore the directedness, and treat what we have as an undirected hypergraph. In the limit of sufficiently large hypergraphs, this shouldn’t make much difference, although it seems as if including directedness information may let us look at the analog of spinors, while the undirected case corresponds to ordinary vectors, which are what we’re more familiar with in terms of measuring distances.

So is there any other proxy for distance that we could use? Actually, there are several. But one that may be particularly good is directly derived from the causal graph. It’s in some ways the analog of what we might do in traditional discussions of relativity where we imagine a grid of beacons signaling to each other over a limited period of time. In terms of our models we can say that it’s the analog of a branchial distance for the causal graph.

Here’s how it works. Construct a causal graph, say:

ResourceFunction
ResourceFunction[
   "WolframModel"][{{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z, 
      w}}}, {{0, 0}, {0, 0}}, 5]["LayeredCausalGraph", 
 AspectRatio -> 1/2, VertexLabels -> Automatic]

Now look at the events in the last slice shown here. For each pair of events look at their ancestry, i.e. at what previous event(s) led to them. If a particular pair of events have a common ancestor on the step before, connect them. The result in this case is the graph:

SpatialReconstruction
PacletInstall["SetReplace"]; << SetReplace`;
SpatialReconstruction[wmo_WolframModelEvolutionObject, 
  dt_Integer : 1] := 
 Module[{cg = wmo["CausalGraph"], ceg = wmo["EventGenerations"], ev0, 
   ev1, oc}, ev0 = First /@ Position[-(ceg - Max[ceg]), dt];
  ev1 = First /@ Position[-(ceg - Max[ceg]), 0];
  oc = Select[Rest[VertexOutComponent[cg, #]], MemberQ[ev1, #] &] & /@
     ev0; Graph[
   WolframPhysicsProjectStyleData["SpatialGraph", "Function"][
    Graph[ev1, 
     Flatten[(UndirectedEdge @@@ Subsets[#, {2}]) & /@ oc]]], 
   VertexStyle -> 
    WolframPhysicsProjectStyleData["CausalGraph", "VertexStyle"], 
   EdgeStyle -> 
    Blend[{First[
       WolframPhysicsProjectStyleData["SpatialGraph", 
        "EdgeLineStyle"]], 
      WolframPhysicsProjectStyleData["BranchialGraph", "EdgeStyle"]}]]]
Graph[SpatialReconstruction[
  WolframModel[{{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z, 
       w}}}, {{0, 0}, {0, 0}}, 5], 1], VertexLabels -> Automatic]

One can think of this as a “reconstruction of space”, based on the causal graph. In an appropriate limit, it should be essentially the same as the structure of space associated with the original hypergraph—though with this small a graph the spatial hypergraph still looks quite different:

ResourceFunction
ResourceFunction[
   "WolframModel"][{{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z, 
      w}}}, {{0, 0}, {0, 0}}, 5]["FinalStatePlot"]

It’s slightly complicated, but it’s important to understand the differences between these various graphs. In the underlying spatial hypergraph, the nodes are the fundamental elements in our model—that we’ve dubbed above “atoms of space”. The hyperedges connecting these nodes correspond to the relations between the elements. In the causal graph, however, the nodes represent updating events, joined by edges that represent the causal relationships between these events.

The “spatial reconstruction graph” has events as its nodes, but it has a new kind of edge connecting these nodes—an edge that represents immediate common ancestry of the events. Whenever an event “causes” other events one can think of the first event as “starting an elementary light cone” that contains the other events. The causal graph represents the way that the elementary light cones are “knitted together” by the evolution of the system, and, more specifically, by the overlap of effects of different events on relations in the spatial hypergraph. The spatial reconstruction graph now uses the fact that two events lie in the same elementary light cone as a way to infer that the events are “close together”, as recorded by an edge in the spatial reconstruction graph.

There is an analogy here to our discussions of quantum mechanics. In talking about quantum mechanics we start from multiway graphs whose nodes are quantum states, and then we look at (“time”) slices through these graphs, and construct branchial graphs from them—with two states being joined in this branchial graph when they have an immediate common ancestor in the multiway graph. Or, said another way: in the branchial graph we join states that are in the same elementary “entanglement cone”. And the resulting branchial graph can be viewed as a map of a space of quantum states and their entanglements:

ResourceFunction
ResourceFunction["MultiwaySystem"][{"A" -> "AB", "B" -> "A"}, "A", 4, 
  "EvolutionGraph"] // LayeredGraphPlot
ResourceFunction["MultiwaySystem"][{"A" -> "AB", "B" -> "A"}, "A", 4, 
 "BranchialGraph"]

The spatial reconstruction graph is the same idea: it’s like a branchial graph, but computed from the causal graph, rather than from a multiway graph. (Aficionados of our project may notice that the spatial reconstruction graph is a new kind of graph that we haven’t drawn before—and in which we’re coloring the edges with a new, purple color that happens to be a blend of our “branchial pink” with the blue-gray used for spatial hypergraphs.)

In the spatial reconstruction graph shown above, we’re joining events when they have a common ancestor one step before. But we can generalize the notion of a spatial reconstruction graph (or, for that matter, a branchial graph) by allowing common ancestors more than one step back.

In the case we showed above, going even two steps back causes almost all events to have common ancestors:

SpatialReconstruction
PacletInstall["SetReplace"]; << SetReplace`;
SpatialReconstruction[wmo_WolframModelEvolutionObject, 
  dt_Integer : 1] := 
 Module[{cg = wmo["CausalGraph"], ceg = wmo["EventGenerations"], ev0, 
   ev1, oc}, ev0 = First /@ Position[-(ceg - Max[ceg]), dt];
  ev1 = First /@ Position[-(ceg - Max[ceg]), 0];
  oc = Select[Rest[VertexOutComponent[cg, #]], MemberQ[ev1, #] &] & /@
     ev0; Graph[
   WolframPhysicsProjectStyleData["SpatialGraph", "Function"][
    Graph[ev1, 
     Flatten[(UndirectedEdge @@@ Subsets[#, {2}]) & /@ oc]]], 
   VertexStyle -> 
    WolframPhysicsProjectStyleData["CausalGraph", "VertexStyle"], 
   EdgeStyle -> 
    Blend[{First[
       WolframPhysicsProjectStyleData["SpatialGraph", 
        "EdgeLineStyle"]], 
      WolframPhysicsProjectStyleData["BranchialGraph", "EdgeStyle"]}]]]
Graph[SpatialReconstruction[
  ResourceFunction[
    "WolframModel"][{{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z, 
       w}}}, {{0, 0}, {0, 0}}, 5], 2], VertexLabels -> Automatic]

And indeed if we go enough steps back, every event will inevitably share a common ancestor: the “big bang” event that started the evolution of the system.

Let’s say we have a rule that leads to a sequence of spatial hypergraphs:

ResourceFunction
ResourceFunction[
   "WolframModel"][{{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z, 
      w}}}, {{0, 0}, {0, 0}}, 10]["StatesPlotsList", 
 ImageSize -> Tiny]

We can compare these with the spatial reconstruction graphs that we get from the causal graph for this system. Here are the results on successive steps, allowing a “lookback” of 2 steps:

Table
PacletInstall["SetReplace"]; << SetReplace`;
SpatialReconstruction[wmo_WolframModelEvolutionObject, 
  dt_Integer : 1] := 
 Module[{cg = wmo["CausalGraph"], ceg = wmo["EventGenerations"], ev0, 
   ev1, oc}, ev0 = First /@ Position[-(ceg - Max[ceg]), dt];
  ev1 = First /@ Position[-(ceg - Max[ceg]), 0];
  oc = Select[Rest[VertexOutComponent[cg, #]], MemberQ[ev1, #] &] & /@
     ev0; Graph[
   WolframPhysicsProjectStyleData["SpatialGraph", "Function"][
    Graph[ev1, 
     Flatten[(UndirectedEdge @@@ Subsets[#, {2}]) & /@ oc]]], 
   VertexStyle -> 
    WolframPhysicsProjectStyleData["CausalGraph", "VertexStyle"], 
   EdgeStyle -> 
    Blend[{First[
       WolframPhysicsProjectStyleData["SpatialGraph", 
        "EdgeLineStyle"]], 
      WolframPhysicsProjectStyleData["BranchialGraph", "EdgeStyle"]}]]]
Table[Graph[
  SpatialReconstruction[
   ResourceFunction[
     "WolframModel"][{{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z,
         w}}}, {{0, 0}, {0, 0}}, t], 2], ImageSize -> Tiny], {t, 10}]

And as the number of steps increases, there is increasingly commonality between the spatial hypergraph and the spatial reconstruction graph—though they are not identical.

It’s worth pointing out that the spatial reconstruction graphs we’ve drawn certainly aren’t the only ways to get a proxy for physical distances. One simple change is that we can look at common successors, rather than common ancestors.

Another thing is to look not at a spatial hypergraph in which the nodes are elements and the hyperedges are relations, but instead at a “dual spatial hypergraph” in which the nodes are relations and the hyperedges are associated with elements, with each (unordered) hyperedge recording which relations share a given element.

For example, for the spatial hypergraph

ResourceFunction
ResourceFunction[
  "WolframModel"][{{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z, 
     w}}}, {{0, 0}, {0, 0}}, 5, "FinalStatePlot"]

the corresponding dual spatial hypergraph is

UnorderedHypergraphPlot
RelationsElementsHypergraph[wmo_] := 
 Module[{ix = wmo["StateEdgeIndicesAfterEvent", -1], es}, 
  Values[Merge[
    Association @@@ (Thread /@ Thread[wmo["AllExpressions"][[ix]] -> ix]), Identity]]]
UnorderedHypergraphPlot[h_, opts___] := 
 ResourceFunction["WolframModelPlot"][h, opts, "ArrowheadLength" -> 0, 
  EdgeStyle -> <|{_, _, _ ..} -> Transparent|>, 
  "EdgePolygonStyle" -> <|{_, _, _ ..} -> 
     Directive[Hue[0.63, 0.66, 0.81], Opacity[0.1], 
      EdgeForm[Directive[Hue[0.63, 0.7, 0.5], Opacity[0.7]]]]|>]
UnorderedHypergraphPlot[
 RelationsElementsHypergraph[
  ResourceFunction["WolframModel"][{{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z, w}}}, 
   {{0, 0}, {0, 0}}, 5]]]

and the sequence of dual spatial hypergraphs corresponding to the evolution above is:

Table
RelationsElementsHypergraph[wmo_] := 
 Module[{ix = wmo["StateEdgeIndicesAfterEvent", -1], es}, 
  Values[Merge[
    Association @@@ (Thread /@ Thread[wmo["AllExpressions"][[ix]] -> ix]), Identity]]]
UnorderedHypergraphPlot[h_, opts___] := 
 ResourceFunction["WolframModelPlot"][h, opts, "ArrowheadLength" -> 0, 
  EdgeStyle -> <|{_, _, _ ..} -> Transparent|>, 
  "EdgePolygonStyle" -> <|{_, _, _ ..} -> 
     Directive[Hue[0.63, 0.66, 0.81], Opacity[0.1], 
      EdgeForm[Directive[Hue[0.63, 0.7, 0.5], Opacity[0.7]]]]|>]
Table[Show[
  UnorderedHypergraphPlot[
   RelationsElementsHypergraph[
    ResourceFunction["WolframModel"][{{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z, w}}}, 
     {{0, 0}, {0, 0}}, t]]], ImageSize -> Tiny], {t, 0, 10}]

There are still other possibilities, particularly if one goes “below” the causal graph, and starts looking not just at causal relations between whole events, but also at causal relations between specific relations in the underlying spatial hypergraph.

But the main takeaway is that there are various proxies we can use for physical distance. In the limit of a sufficiently large system, all of them should give compatible results. But when we’re dealing with small graphs, they won’t quite agree, and so we may not be sure what we should say the distance between two things is.

Causal Balls vs. Geodesic Balls

To measure speed, we basically have to divide distance by elapsed time. But, as I just discussed at some length, when we’re constructing space and time from something lower level, it’s not straightforward to say exactly what we mean by distance and by elapsed time, and how different possibilities will correspond to what we’d actually measure, say at a human scale.

But as a first approximation, let’s just ask about the effect of a single event. The effect of this event is captured by a causal cone:

With
With[{g = 
   ResourceFunction[
      "WolframModel"][{{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, 
         w}, {z, w}}}, {{0, 0}, {0, 0}}, 8]["LayeredCausalGraph", 
    AspectRatio -> 1/2]}, 
 HighlightGraph[g, 
  Style[Subgraph[g, VertexOutComponent[g, 10]], Red, Thick]]]

We can say that the elapsed time associated with a particular slice through this causal cone is the graph distance from the event at the top of the cone to events in this slice. (How the slice is chosen is determined by the reference frame we’re using.)
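In code, that notion of elapsed time is just a directed graph distance on the causal graph. For the rule and event used above (a small sketch of mine, taking event 10 as the tip of the cone, as in the pictures):

g = ResourceFunction["WolframModel"][{{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z, w}}}, 
   {{0, 0}, {0, 0}}, 8]["LayeredCausalGraph"];
(* number of causal edges from event 10 to each event in its causal cone *)
GraphDistance[g, 10, #] & /@ VertexOutComponent[g, 10]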

So now we want to see how far the effect of the event spreads in space. The first step is to “project” the causal cone onto some representation of “instantaneous space”. We can do this with the ordinary spatial hypergraph:

EffectiveSpatialBallPlot
EffectiveSpatialBall[wmo_, expr0_] := 
 Module[{t = wmo["CompleteGenerationsCount"], fexprs}, 
  fexprs = wmo["StateEdgeIndicesAfterEvent", -1]; 
  Intersection[
   Cases[VertexOutComponent[wmo["ExpressionsEventsGraph"], {expr0}], {"Expression", n_} :> n], fexprs]]
EffectiveSpatialAtomBall[wmo_, expr0_] := 
 Module[{t = wmo["CompleteGenerationsCount"], fexprs}, 
  fexprs = wmo["StateEdgeIndicesAfterEvent", -1]; 
  wmo["AllExpressions"][[
   Intersection[
    Cases[VertexOutComponent[wmo["ExpressionsEventsGraph"], {expr0}], {"Expression", n_} :> n], fexprs]]]]
HighlightEffectiveSpatialBallPlot[wmo_, expr0_] := 
 With[{bb = EffectiveSpatialAtomBall[wmo, expr0], edges = wmo["FinalState"]}, 
  HighlightGraph[
   Graph[DirectedEdge @@@ Catenate[Partition[#, 2, 1] & /@ edges]], 
   Style[DirectedEdge @@@ Join[bb, Union[Catenate[bb]]], Red, Thick]]]
HighlightEffectiveSpatialBallPlot[
 ResourceFunction["WolframModel"][{{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z, w}}}, 
  {{0, 0}, {0, 0}}, 9], {"Event", 10}]

But to align with the most obvious notion of “elapsed time” in the causal cone it’s better to use the spatial reconstruction graph, whose nodes, just like those of the causal graph, are events:

With
PacletInstall["SetReplace"]; << SetReplace`;
SpatialReconstruction[wmo_WolframModelEvolutionObject, 
  dt_Integer : 1] := 
 Module[{cg = wmo["CausalGraph"], ceg = wmo["EventGenerations"], ev0, 
   ev1, oc}, ev0 = First /@ Position[-(ceg - Max[ceg]), dt];
  ev1 = First /@ Position[-(ceg - Max[ceg]), 0];
  oc = Select[Rest[VertexOutComponent[cg, #]], MemberQ[ev1, #] &] & /@
     ev0; Graph[
   WolframPhysicsProjectStyleData["SpatialGraph", "Function"][
    Graph[ev1, 
     Flatten[(UndirectedEdge @@@ Subsets[#, {2}]) & /@ oc]]], 
   VertexStyle -> 
    WolframPhysicsProjectStyleData["CausalGraph", "VertexStyle"], 
   EdgeStyle -> 
    Blend[{First[
       WolframPhysicsProjectStyleData["SpatialGraph", 
        "EdgeLineStyle"]], 
      WolframPhysicsProjectStyleData["BranchialGraph", "EdgeStyle"]}]]]
With[{sg = 
   SpatialReconstruction[
    ResourceFunction[
      "WolframModel"][{{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, 
         w}, {z, w}}}, {{0, 0}, {0, 0}}, 8], 2]}, 
 HighlightGraph[sg, 
  Style[Subgraph[sg, 
    With[{g = 
       ResourceFunction[
          "WolframModel"][{{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, 
             w}, {z, w}}}, {{0, 0}, {0, 0}}, 8][
        "LayeredCausalGraph"]}, VertexOutComponent[g, 10]]], Red, 
   Thick]]]

Let’s “watch the intersection grow” from successive slices of the causal cone, projected onto spatial reconstruction graphs:

Table
PacletInstall["SetReplace"]; << SetReplace`;
SpatialReconstruction[wmo_WolframModelEvolutionObject, 
  dt_Integer : 1] := 
 Module[{cg = wmo["CausalGraph"], ceg = wmo["EventGenerations"], ev0, 
   ev1, oc}, ev0 = First /@ Position[-(ceg - Max[ceg]), dt];
  ev1 = First /@ Position[-(ceg - Max[ceg]), 0];
  oc = Select[Rest[VertexOutComponent[cg, #]], MemberQ[ev1, #] &] & /@
     ev0; Graph[
   WolframPhysicsProjectStyleData["SpatialGraph", "Function"][
    Graph[ev1, 
     Flatten[(UndirectedEdge @@@ Subsets[#, {2}]) & /@ oc]]], 
   VertexStyle -> 
    WolframPhysicsProjectStyleData["CausalGraph", "VertexStyle"], 
   EdgeStyle -> 
    Blend[{First[
       WolframPhysicsProjectStyleData["SpatialGraph", 
        "EdgeLineStyle"]], 
      WolframPhysicsProjectStyleData["BranchialGraph", "EdgeStyle"]}]]]
Table[With[{sg = 
    SpatialReconstruction[
     ResourceFunction[
       "WolframModel"][{{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, 
          w}, {z, w}}}, {{0, 0}, {0, 0}}, t], 2]}, 
  HighlightGraph[sg, 
   Style[Subgraph[sg, 
     With[{g = 
        ResourceFunction[
           "WolframModel"][{{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, 
              w}, {z, w}}}, {{0, 0}, {0, 0}}, 10][
         "LayeredCausalGraph"]}, VertexOutComponent[g, 10]]], Red, 
    Thick]]], {t, 3, 10}]

Now the question we have to ask is: how “wide” is that area of intersection? The pictures make it clear that it’s not trivial to answer—or even precisely define—that question. Yes, in the continuum limit of sufficiently large graphs we’d better get something that looks like a light cone in continuum space, but it’s far from trivial how that limiting process might work.

We can think of the intersection of the causal cone with a spatial slice as defining a “causal ball” at a particular “time”. But now within that spatial slice we can ask about graph distances. So, for example, given a particular point in the slice we can ask what points lie within a certain graph distance of it—or, in other words, what the geodesic ball of some radius around that point is.

And fundamentally the computation of “speed” is all about the comparison of the “widths” of causal balls and of geodesic balls. Another way to look at this is to say that given two points in the causal ball (that by definition are produced from a common ancestor some “time” back) we want to know the “spatial distance” between them.

There are several ways we can assess “width”. We could compute the boundaries of causal balls, and for each point see what the “geodesically most distant” point is. Or we can just compute geodesic (i.e. spatial reconstruction graph) distances between all pairs of points in the causal ball. Here are distributions of these distances for each step shown above:

Table
PacletInstall["SetReplace"]; << SetReplace`;
SpatialReconstruction[wmo_WolframModelEvolutionObject, 
  dt_Integer : 1] := 
 Module[{cg = wmo["CausalGraph"], ceg = wmo["EventGenerations"], ev0, 
   ev1, oc}, ev0 = First /@ Position[-(ceg - Max[ceg]), dt];
  ev1 = First /@ Position[-(ceg - Max[ceg]), 0];
  oc = Select[Rest[VertexOutComponent[cg, #]], MemberQ[ev1, #] &] & /@
     ev0; Graph[
   WolframPhysicsProjectStyleData["SpatialGraph", "Function"][
    Graph[ev1, 
     Flatten[(UndirectedEdge @@@ Subsets[#, {2}]) & /@ oc]]], 
   VertexStyle -> 
    WolframPhysicsProjectStyleData["CausalGraph", "VertexStyle"], 
   EdgeStyle -> 
    Blend[{First[
       WolframPhysicsProjectStyleData["SpatialGraph", 
        "EdgeLineStyle"]], 
      WolframPhysicsProjectStyleData["BranchialGraph", "EdgeStyle"]}]]]
Table[Histogram[
  Flatten[Module[{sg = 
      SpatialReconstruction[
       ResourceFunction[
         "WolframModel"][{{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, 
            w}, {z, w}}}, {{0, 0}, {0, 0}}, t], 2], pts, dm},
    pts = 
     Intersection[
      With[{g = 
         ResourceFunction[
            "WolframModel"][{{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, 
               w}, {z, w}}}, {{0, 0}, {0, 0}}, 10][
          "LayeredCausalGraph"]}, VertexOutComponent[g, 10]], 
      VertexList[sg]]; 
    Outer[GraphDistance[sg, #1, #2] &, pts, pts]]], {1}, 
  PlotRange -> {{-.5, 8.5}, Automatic}, Frame -> True, 
  FrameTicks -> {Automatic, None}], {t, 5, 10}]

How do we assess the “speed of light” from this? We might imagine we should look at the “outer edge” of this histogram, and see how it advances with “time”. If we do that, we get the result:

ListLinePlot
PacletInstall["SetReplace"]; << SetReplace`;
SpatialReconstruction[wmo_WolframModelEvolutionObject, 
  dt_Integer : 1] := 
 Module[{cg = wmo["CausalGraph"], ceg = wmo["EventGenerations"], ev0, 
   ev1, oc}, ev0 = First /@ Position[-(ceg - Max[ceg]), dt];
  ev1 = First /@ Position[-(ceg - Max[ceg]), 0];
  oc = Select[Rest[VertexOutComponent[cg, #]], MemberQ[ev1, #] &] & /@
     ev0; Graph[
   WolframPhysicsProjectStyleData["SpatialGraph", "Function"][
    Graph[ev1, 
     Flatten[(UndirectedEdge @@@ Subsets[#, {2}]) & /@ oc]]], 
   VertexStyle -> 
    WolframPhysicsProjectStyleData["CausalGraph", "VertexStyle"], 
   EdgeStyle -> 
    Blend[{First[
       WolframPhysicsProjectStyleData["SpatialGraph", 
        "EdgeLineStyle"]], 
      WolframPhysicsProjectStyleData["BranchialGraph", "EdgeStyle"]}]]]
Table[{t, 
  Max[Flatten[
    Module[{sg = 
       SpatialReconstruction[
        ResourceFunction["WolframModel"][{{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z, w}}}, 
         {{0, 0}, {0, 0}}, t], 2], pts, dm},
     pts = Intersection[
       With[{g = 
          ResourceFunction["WolframModel"][{{{x, y}, {x, z}} -> {{x, z}, {x, w}, {y, w}, {z, w}}}, 
           {{0, 0}, {0, 0}}, 12]["LayeredCausalGraph"]}, 
        VertexOutComponent[g, 10]], VertexList[sg]];
     Outer[GraphDistance[sg, #1, #2] &, pts, pts]]]]}, {t, 5, 12}]
ListLinePlot[%, Mesh -> All]

But the full story is more complicated. Because, yes, the large-scale limit should be like a light cone, where we can measure the speed of light from its slope. But that doesn’t tell us about the “fine structure”. It doesn’t tell us whether at the edge of the causal ball, there are, for example, effectively space tunnels that “reach out” in the geodesic ball.

There are lots of subtle issues here. And there’s another issue in the example we’ve been using: not only does this involve a causal cone that’s expanding, but the “whole universe” (i.e. the whole spatial hypergraph) is also expanding.

So why not look at a simpler, “more static” case? Well, it isn’t so easy. Because in our models space is being “made dynamically”: it can’t really ever be “static”. At best we might imagine just having a rule that “trivially tills space”, touching elements but not “doing much” to them. But doing this introduces its own collection of artifacts.

To Travel? To Communicate?

We’ve so far been talking mainly about the very low-level structure of spacetime, and how fast “threads of causality” can effectively “traverse space”. But if we’re actually going to be able to make use of faster-than-light phenomena, we’ve somehow got to “send something through them”. It’s not good enough to just have the structure of spacetime show some kind of faster-than-light phenomenon. We’ve got to be able to take something that we’ve chosen, and “send it through”.

When we talk about “traveling faster than light”, what we normally mean is that we can take ourselves, made of ordinary matter, atoms, etc. and transport that whole structure faster than light across space. A lower bar is to consider faster-than-light communication. To do this we have to be able to take some message that we have chosen, and convert it to a form that can be transferred across space faster than light.

To achieve true faster-than-light travel we presumably have to be able to construct some form of space tunnel in which the interior of the tunnel (and its entrance and exit) are sufficiently close to ordinary, flat space that they wouldn’t destroy us if we passed through them. It doesn’t seem difficult to imagine a spatial hypergraph that at least statically contains such a space tunnel. But it’s much more challenging to think about how this would be created dynamically.

But, OK, so let’s say we just want to send individual particles, like photons, through. Well, in our models it’s not clear that’s that much easier. Because it seems likely that even a single photon of ordinary energy will correspond to a quite large region in the spatial hypergraph. Presumably the “core” of the photon is some kind of persistent topological-like structure in the hypergraph. And to understand the propagation of a photon, what one should do is to trace this structure in the causal graph.

What about “communication without travel”? To propagate a “signal” in space requires that the signal has persistence of some kind, and the most obvious mechanism for such persistence would be a topological-like structure of the kind we assume exists in particles like photons. But—at least with some of the processes we’ll discuss below—there will be a premium on having our “signal carrier” involve as few underlying elements in the spatial hypergraph as possible. And one might imagine that this would be best achieved by something like the oligon particles that our models suggest could exist, and that involve many fewer elements in the spatial hypergraph than the particles we currently know about.

Of course, using “oligon radio” requires that we have some kind of transducer between ordinary familiar particles and oligons, and it’s not clear how that can be achieved.

There is probably a close connection in our models between what we might think of as black holes and what we might think of as particles. Quite what the details of this connection or correspondence are we don’t know yet, but both correspond to persistent structures “created purely from the structure of space”.

And it’s quite possible that there is a whole spectrum of persistent structures that don’t quite have characteristics like particles (indeed, our space tunnels would presumably be examples). The question of whether any of these can be used for communication is in a sense quite easy to define. To communicate, we need some structure in the causal graph that maintains information through time, and that has parts that can be arbitrarily changed. In other words, there needs to be some way to encode something like arbitrary patterns of bits in the causal graph, and have them persist.

The Second Law of Thermodynamics

I’ve been interested in the Second Law of thermodynamics and its origins for nearly 50 years, and it’s remarkable that it now seems to be intimately connected to questions about going faster than light in our models. Fundamentally, what the Second Law says is that initially orderly configurations of things like molecules have a seemingly inexorable tendency to become more disorderly over time. And as we’ll discuss, this is something very general, ultimately rooted in the general phenomenon of computational irreducibility. And it doesn’t just apply to familiar things like molecules: it also applies—in our models—to the very structure of space.

So what’s the underlying story of the Second Law? I thought about this for many years, and finally in the 1990s got to the point where I felt I understood it. At first, the Second Law seems like a paradox: if the laws of physics are reversible then one would think that one could run any process as well backwards as forwards. Yet what the Second Law—and our experience—says is that things that start orderly tend to become more disorderly.

But here’s a simple model that illustrates what’s going on. Consider a cellular automaton that’s reversible (like the standard laws of physics), in the sense that for every configuration (or, actually, in this case, every pair of configurations) there’s both a unique successor in time, and a unique predecessor. Now start the cellular automaton from a simple initial condition:

ArrayPlot[
 CellularAutomaton[{10710, {2, {{0, 8, 0}, {4, 2, 1}}}, 1, 
   2}, {{{1}, {1}}, 0}, 51]]

We see a fundamental computational fact: just like my favorite rule 30 cellular automaton, even though the initial condition is simple, the system behaves in a complex—and in many ways seemingly random—way.
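For comparison, here is the rule 30 behavior being referred to (a quick aside of mine, not part of the original sequence of pictures). Starting from a single black cell, it too produces a pattern that seems in many ways random:

ArrayPlot[CellularAutomaton[30, {{1}, 0}, 50]]

(Rule 30 itself is not reversible, though; the point of the rule used above is that it randomizes in the same way while also being reversible.)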

But here’s the thing: this happens both if one runs it forward in time, and backward:

ArrayPlot[
 CellularAutomaton[{10710, {2, {{0, 8, 0}, {4, 2, 1}}}, 1, 2}, 
  Take[Reverse[
    CellularAutomaton[{10710, {2, {{0, 8, 0}, {4, 2, 1}}}, 1, 
      2}, {{{1}, {1}}, 0}, 51]], 2], 101]]

The randomization is just a feature of the execution of the rule—forward or backward. At some moment we have a configuration that looks simple. But when we run it forward in time, it “randomizes”. And the same happens if we go backward in time.
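In fact one can check the reversibility directly. Here is a minimal sketch (mine, not from the original), relying on the same property the picture above uses: running the rule on the time-reversed final pair of rows retraces the evolution, so the simple single-cell state reappears at the end:

rule10710 = {10710, {2, {{0, 8, 0}, {4, 2, 1}}}, 1, 2};
fwd = CellularAutomaton[rule10710, {{{1}, {1}}, 0}, 30];
bwd = CellularAutomaton[rule10710, {Take[Reverse[fwd], 2], 0}, 30];
Total /@ Take[bwd, -2]  (* should give {1, 1}: each row again contains just one black cell *)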

But why is there this apparent randomization? The evolution of the cellular automaton is effectively performing a computation. And to recognize a pattern in its output we have to do a computation too. But the point is that as soon as the evolution of the cellular automaton is computationally irreducible, recognizing a pattern inevitably takes an irreducible amount of computational work. It’s as if the cellular automaton is “encrypting” its initial condition—and so we have to do lots of computational work (perhaps even exponentially more than the cellular automaton itself) to be able to “decrypt” it.

It’s not that it’s impossible to invert the final state of the cellular automaton and find that it evolved from a simple state. It’s just that to do so takes an irreducible amount of computational work. And if we as observers are bounded in our computational capabilities we eventually won’t be able to do it—so we won’t be able to recognize that the system evolved from a simple state.

The picture above shows that once we have a simple state it’ll tend to evolve to a randomized state—just like we typically see. But the picture also shows that we can in principle set up a complicated initial state that will evolve to produce the simple state. So why don’t we typically see this happening in everyday life? It’s basically again a story of limited computational capabilities. Assume we have some computational system for setting up initial states. Then we can readily imagine that it would take only a limited number of computational operations to set up a simple state. But setting up the complicated and seemingly random state we’d need in order to evolve to the simple state will take a lot more computational operations—and if we’re bounded in our computational capabilities we won’t be able to do it.

What we’ve seen here in a simple cellular automaton also happens with gas molecules—or idealized hard spheres. Say you start the molecules off in some special “simple” configuration, perhaps with all the molecules in the corner of a box. Then you let the system run, with molecules repeatedly colliding and so on. Looked at in a computational way, we can say that the process of evolution of the system is a computation—and we can expect that it will be a computationally irreducible one. And just like with the cellular automaton, any computationally bounded observer will inevitably see “Second-Law behavior”.

The traditional treatment of the Second Law talks a lot about entropy—which measures the number of possible configurations consistent with a measurement one makes on the system. (Needless to say, counting configurations is a lot easier in a fundamentally discrete system like a cellular automaton, than in standard real-number classical mechanics.) Well, if we measure the value of every single cell in a cellular automaton, there’s only one configuration consistent with our measurement—and given this measurement the whole past and future of the cellular automaton is determined, and we’ll always measure the same entropy for it.

But imagine instead that we can’t do such complete and precise measurements. Then there may be many configurations of the system consistent with the results we get. But the point is that if the actual configuration of the system is actually simple, computationally bounded measurements will readily be able to recognize this, and determine that there’s only one (or a few) configurations consistent with their results. But if the actual configuration is complicated, computationally bounded measurements won’t be able to determine which of many configurations one’s looking at. The result is that in terms of such measurements, the entropy of the system will be considered larger.
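As a rough illustration (my own sketch, not from the original): suppose we can only “measure” a cellular automaton evolution through the totals of blocks of 4 cells. Many distinct microscopic configurations are then consistent with what we measure, and counting them gives a crude “entropy” in bits for, say, the final row:

micro = CellularAutomaton[30, {{1}, 0}, 40];
coarse = Map[Total, Partition[#, 4]] & /@ micro;  (* block totals: all we get to "measure" *)
Log[2., Times @@ (Binomial[4, #] & /@ Last[coarse])]  (* bits of microscopic detail hidden from us *)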

In the typical treatment of statistical mechanics over the past century one usually talks about “coarse-grained” measurements, but it’s always been a bit unclear what constitutes a “valid” coarse graining. I think what we now understand about computational irreducibility finally clarifies this, and lets us say what’s really going on in the Second Law: entropy seems to increase because the irreducible computation done by a system can’t successfully be “decrypted” by a computationally bounded observer.

Even back in the 1860s James Maxwell realized that if you could have a “demon” who basically tweaked individual molecules to unrandomize a gas, then you wouldn’t see Second-Law behavior. And, yes, if the demon had sufficient computational capabilities you could make this work; the Second Law relies on the idea that no such computational capabilities are available.

And as soon as the Second Law is in effect, one can start “assuming that things are random”, or, more specifically, that at least in some aggregate sense, the behavior of a system will follow statistical averages. This assumption is critical in deriving standard continuum fluid behavior from underlying molecular dynamics. And it’s also critical in deriving the continuum form of space from our underlying discrete model—and for deriving things like special and general relativity.

In other words, the fact that a fluid—or space—seems like a continuum to us is a reflection of the boundedness of our computational capabilities. If we could apply as much computation as the underlying molecules in the gas—or the discrete elements in space—then we could recognize many details that would go beyond the continuum description. But with bounded computation, we just end up describing fluids—or space—in terms of aggregate continuum parameters.

We talk about mechanical work—that involves patterns of motion in molecules that we can readily recognize as organized—being useful. And we talk about “heat”—that involves patterns of motion in molecules that seem random to us—as being fundamentally less useful. But this is really just a reflection of our computational boundedness. There is all sorts of detailed information in the motions associated with heat; it’s just that we can’t decode them to make use of them.

Today when we describe a gas we’ll typically say that it’s characterized by temperature and pressure. But that misses all the detail associated with the motion of molecules. And I suspect that in time the coarseness of our current descriptions of things like gases will come to seem quite naive. There’ll be all sorts of other features and parameters that effectively correspond to different kinds of computations performed on the configuration of molecules.

People sometimes talk disparagingly about the possible “heat death of the universe”, in which all of the orderly “mechanical work” motion has degraded into “heat”. But I don’t think that’s the right characterization. Yes, with our current ways of looking at microscopic motions we might only be able to say that it’s “generic heat”. But actually there’ll be all this rich structure in there, if only we were making the right measurements, and doing the right computations.

Space Demons

If our models are going to reproduce what we currently know about physics, it’s got to be the case that in some large-scale limit, causal balls behave essentially like geodesic balls expanding at the speed of light. But this will only be an aggregate statement—that doesn’t, for example, talk about each individual relation in the spatial hypergraph.

Computational irreducibility implies that—just like with molecules in a gas—the configurations of the evolving spatial hypergraph will tend to appear seemingly random with respect to sufficiently bounded computations. And it’s important for us to use this in doing statistical averaging for our mathematical derivations.

But the question is: Can we “compute around” that seeming randomness? Perhaps at the edge of the causal cone there are lots of little space tunnels that transiently arise from the detailed underlying dynamics of the system. But will these just seem to arise “randomly”, or can we compute where they will be, so we can potentially make use of them?

In other words, can we have a kind of analog of Maxwell’s demon not for molecules in a gas, but for atoms of space: what we might call a “space demon”? And if we had such an entity, could it let us go faster than light?

Let’s look again at the case of gas molecules. Consider an idealized hard-sphere gas in a box and track the motion of one of the “molecules”:

CloudGet["https://wolfr.am/PYKieD46"]; GraphicsGrid[
 Partition[Rest[visualize2D[20, 2000, 10, 2, 200]], 5]]

The molecule bounces around having a sequence of collisions, and moves according to what seems to be a random walk. But now let’s imagine we have a “gas demon” who’s “riding on a molecule”. And every time its molecule collides with another one, let’s imagine that the demon can make a decision about whether to stay with the molecule it’s already on, or to jump to the other molecule in the collision.

And now let’s say the demon is trying to “compute its way” across the box, deciding by looking at the history of the system which molecule it should hitch a ride on at each collision. Yes, the demon will have to do lots of computation. But the result will be that it can get itself transported across the system much faster than if it just stuck with one molecule. In other words, by using computation, it can “beat randomness” (and diffusion).
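Here is a toy version of this (my own sketch, not from the original). A plain walker takes whichever ±1 step comes up; a “demon” walker is offered the same two candidate steps at each moment and picks whichever moves it further to the right. The plain walker spreads diffusively, while the demon drifts ballistically:

SeedRandom[1];
steps = RandomChoice[{-1, 1}, {1000, 2}];  (* two candidate steps at each moment *)
plain = Accumulate[steps[[All, 1]]];       (* no choice: always take the first candidate *)
demon = Accumulate[Max /@ steps];          (* choose the better of the two candidates *)
ListLinePlot[{plain, demon}, PlotLegends -> {"plain walker", "demon"}]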

If we think of the collisions between hard spheres as events, we can construct a causal graph of their causal relationships:

SeedRandom[1234]; hscg = 
 Graph[ResourceFunction["WolframPhysicsProjectStyleData"][
     "CausalGraph"]["Function"][genCausalGraph[20, 500, 10, 2]], 
  AspectRatio -> .9]

At each event there are two incoming causal edges and two outgoing ones, corresponding to the spheres involved in a particular collision. And we can think of what the demon is doing as having to choose at each node in the causal graph which outgoing edge to follow. Or, in other words, the demon is determining its path in the causal graph.
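One can count the demon’s options directly. Here is a small sketch (mine, reusing hscg from above): the number of distinct length-k itineraries starting from one event is just the number of directed paths of length k in the causal graph, and since each collision offers a binary choice it grows roughly like 2^k until paths start running off the end of the finite simulation:

a = AdjacencyMatrix[hscg];
Table[Total[MatrixPower[a, k][[1]]], {k, 8}]  (* number of paths of length k from the first event *)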

Just like for our models, we can construct a causal cone for the hard-sphere gas (here continuing for more steps)—and the path taken by the demon is restricted to not go outside this cone:

HighlightGraph[hscg, 
 Style[Subgraph[hscg, 
   VertexOutComponent[hscg, {SortBy[VertexList[hscg], Last][[25]]}]], 
  Red, Thick]]

But also like for our models, the relationship between positions in the causal ball obtained from this causal cone, and actual spatial positions, is in general complicated. At least if we were operating in an infinite region (as opposed to a finite box), the border of the causal ball in the hard-sphere gas would just be a circle. But the point is that there are always “tendrils” that stick out, and if there’s a finite box, it’s even more complicated:

With[{boxSize = 20}, Graphics[
  {{FaceForm[], EdgeForm[Black], 
    Rectangle[{-boxSize, -boxSize}, {boxSize, boxSize}]},
   MapIndexed[
    Style[Disk[#, rad], EdgeForm[GrayLevel[.2]], 
      If[MemberQ[mems, First[#2]], Lighter[Red, .2], Gray]] &, 
    trajectories2D[200, 500, 20][[All, 500]]]}]]

But the point is that if the demon can make a judicious choice of which “tendrils” to follow, it can move faster than the speed defined by the “average border” of the causal cone.

If our “hard-sphere gas” were made, for example, of idealized electronic turtles, each with a computer and sensors on board, it wouldn’t seem too difficult to have a “demon turtle”. Even if our “hard spheres” were the size of microorganisms, it wouldn’t seem surprising to have a “demon”. It’s harder to imagine for actual molecules or particles; there just doesn’t seem to be anywhere to “put the computation apparatus”. Though if we started thinking about cooperation among many different hard spheres then it begins to seem more plausible again. After all, perhaps we could set up a configuration of a group of hard spheres, whose evolution will do the computation we need.

OK, so what about the case of actual space in our models? In some ways it’s a more demanding situation: after all, every aspect of the internal structure of a space demon must—like everything else—be encoded in the structure of the spatial hypergraph.

There is much we don’t know yet. For example, if there are “transient space tunnels” formed, what regularities might they show? In a hard-sphere gas, especially in 2D, there are surprisingly long time correlations between spheres, associated with what amounts to collective “hydrodynamic” behavior. And we don’t know what similar phenomena might exist in the spatial hypergraphs in our models.

But then, of course, there is the question of how to actually construct “space demons” to take advantage of transient space tunnels. The Principle of Computational Equivalence has both good and bad news here. The bad news is that it implies that the evolution of the spatial hypergraph will show computational irreducibility—so it’ll take irreducible amounts of computational work to predict what it does. But the good news is that the dynamics of the hypergraph will be capable of universal computation, and can therefore in principle be “programmed” to do whatever computations can be done to “figure out what will happen”.

The key question is then whether there are sufficient “pockets of computational reducibility” associated with space tunnels that we’ll be able to successfully exploit. We know that in the continuum limit there’s plenty of computational reducibility: that’s why our models can reproduce mathematical theories like general relativity and quantum mechanics.

But space tunnels aren’t a phenomenon of the usual continuum limit; they’re something different. We don’t know what a “mathematical theory of space tunnels” would be like. Conceivably, insofar as ordinary continuum behavior can be thought of as related to the central limit theorem and Gaussian distributions, a “theory of space tunnels” could have something to do with extreme value distributions. But most likely the mathematics—if it exists, and if we can even call it that—will be much more alien.
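To illustrate the contrast (a quick sketch of mine, not from the original): sums of many independent random quantities approach a Gaussian, while maxima of the same quantities approach an extreme value distribution with a quite different, skewed shape:

sums = Table[Total[RandomReal[1, 100]], 10000];   (* sums of 100 random numbers: roughly Gaussian *)
maxima = Table[Max[RandomReal[1, 100]], 10000];   (* maxima of 100 random numbers: extreme value statistics *)
GraphicsRow[{Histogram[sums], Histogram[maxima]}]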

When we say that a gas can be characterized as having a certain temperature, we’re saying that we’re not going to describe anything about the specific motions of the molecules; we’re just going to say that they’re “random”, with some average speed. But as I mentioned above, in reality there are all sorts of detailed patterns and correlations in these motions. And while as a whole they will show computational irreducibility, it is inevitable that there will be pockets of computational reducibility too. We don’t know what they are—and perhaps if we did, we could even use some of them for technological purposes. (Right now, we pretty much only use the very organized motions of molecules that we call “mechanical work”.)

But now the challenge in creating a space demon is to find such pockets of reducibility in the underlying behavior of space. In a sense, much of the historical task of engineering has been to identify pockets of reducibility in our familiar physical world: circular motion, ferromagnetic alignment of spins, wave configurations of fields, etc. In any given case, we’ll never know how hard it’s going to be: the process of finding pockets of reducibility is itself a computationally irreducible process.

But let’s say we could construct a space demon. We don’t know what characteristics it would have. Would it let us create borders around a space tunnel that would allow some “standard material object” to pass through the tunnel? Or would it instead allow a space tunnel to be constructed that could only pass through some special kind of hypergraph structure—that we might even characterize (in a nod to science fiction) as a means of “subspace communication” (i.e. communication that’s making use of structures that lie “below” space as we usually experience it)?

Quantum Effects

Most of what I’ve said about causal graphs, etc. so far has basically been classical. I’ve assumed that there’s in a sense just one thread of history for the universe. But the full story in our models—and in physics—is more complicated. Instead of there being a single thread of history, there’s a whole multiway graph that includes all the possible choices for how updating events can happen.

And in general instead of just having an ordinary causal cone, one really has a multiway causal cone—that in effect has extent not only in physical space but also in branchial space. And just as we have talked about selecting reference frames in spacetime, we also need to talk about selecting quantum observation frames in branchtime. And just as reference frames in spacetime give us a way to make sense of how events are organized in spacetime, and how we would observe or measure them there, so similarly quantum observation frames give us a way to make sense of how events are organized in branchtime, and what we would infer about them from quantum measurements.

In what we’ve said so far about space tunnels, we’re basically always assuming there’s a single thread of history involved. But really we should be talking about multiway causal cones, and tunnels that have extent both in physical space and branchial space, or, in other words, multispace tunnels.

We might imagine space tunnels are always “just fluctuations”, and that they’d be different on every “branch of history”. But a key point about multiway systems—and about multispace—is that they imply that we can expect coherence not only in physical space but also in branchial space, just as a “wave packet” is bounded both in physical and branchial space.

In our models, “vacuum fluctuations” in quantum mechanics and in the structure of space are intimately connected; in the end they are both just facets of the multiway causal graph. In ordinary quantum field theory one is used to virtual particles which individually have propagators (typically like 1/(p² − m²)) that imply they can show “virtual” faster-than-light effects. But we also know—as technically implemented in the commutation relations for field operators—that in the structure of standard quantum field theory there can be no real correlations “outside the light cone”. In our models, there can also be no correlations outside the (multiway) causal cone. But the whole issue is how projections of that multiway causal cone map onto geodesic balls representing distance in space.
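As a small aside (my own sketch, using the standard textbook form rather than anything from the original): for a free scalar field of mass m, the Feynman propagator at spacelike separation r falls off like m BesselK[1, m r]/(4 π² r), nonzero but exponentially small at large r, whereas the field commutator vanishes identically there:

propagatorSpacelike[m_, r_] := m BesselK[1, m r]/(4 Pi^2 r)
LogPlot[propagatorSpacelike[1, r], {r, 0.1, 10}]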

So what does all this mean for space demons? That they actually need to be not just space demons, but multispace demons, operating not just in physical space, but also in branchial space, or in the space of quantum states. And, yes, this is yet more complicated, but it doesn’t in any obvious way change whether things are possible.

When we imagine a space demon identifying features of space that can form a space tunnel, we can expect that it’ll do this at a particular place in physical space. In other words, if we end up going faster than light, there’ll be a particular origination point in our physical space for our journey (or, in some science fiction terms, our “jump”). And it’s really no different for branchial space and multispace demons. A multispace tunnel will presumably have some location both in physical space and branchial space.

In the way we currently think about things, “going there” in branchial space basically means doing a certain quantum measurement—though causal invariance implies that in the end all quantum observers will agree about what happened (and e.g. that one successfully “went faster than light”).

It’s all quite complicated, and certainly far from completely worked out. And there’s another issue as well. The speed of light constrains maximum speeds in physical space. But in our models, there’s also the maximum entanglement speed, which constrains maximum speeds in branchial space. And just as we can imagine space tunnels providing ways to go “faster than c”, so also we can imagine multispace tunnels providing ways to go “faster than ζ”.

Is It Possible? Can We Make It Work?

OK, so what’s the bottom line? Is it in principle possible to go faster than light? And if so, how can we actually do it?

I’m pretty sure that, yes, in principle it’s possible. In fact, as soon as one views space as having an underlying structure, and not just being a mathematical manifold “all the way down”, it’s pretty much inevitable. But it still requires essentially “hacking” space, and “reverse engineering” its structure to find features like “space tunnels” that one can use.

How is all this consistent with relativity, and its assumption of the absoluteness of the speed of light? Well, it isn’t. The phenomena and possibilities I’m describing here are ones that occur in the “substrate” below where relativity operates. It’s as if our standard physics—with relativity, etc.—is part of the “high-level operating system” of the universe. But what we’re talking about doing here is creating hacks down at the “machine code” level.

Put another way: relativity is something that arises in our models as a large-scale limit, when one’s averaged out all the underlying details. But the whole point here is to somehow leverage and “line up” those underlying details, so they produce the effects we’re interested in. But when we look at the whole “bulk” universe, and the full large-scale limit, anything we might be able to do at the level of the details will seem infinitesimal—and won’t affect our overall conclusion that relativity is a feature of the general physics of the universe.

Now of course, even though something may in principle be possible, that doesn’t mean it can be done in practice. Maybe it’s fairly easy to go a tiny distance faster than light, but to scale up to anything substantial requires resources beyond what we—or even the universe—could ever muster. And, yes, as I discussed, that is a possibility. Because in a sense what we have to do is to “beat computational irreducibility” in the evolution of space. And in the abstract there is no way to tell how hard this might be.

Let’s say we have the general objective of “going faster than light”. There will be an immense (and probably infinite) number of detailed ways we could imagine achieving this. And in general there will be no upper bound on the amount of computation needed for any one of them. So if we ask “Will any of them work?”, that’ll be formally undecidable. If we find one that we can show works, great. But we could in principle have to go on testing things forever, never being sure that nothing can work.

And, yes, this means that even though we might know the final underlying rule for physics, we still might fundamentally never be sure whether it’s possible to go faster than light. We might have successfully “reduced physics to mathematics”, but then we still have all the issues of mathematics—like Gödel’s theorem—to contend with. And just as Gödel’s theorem tells us there’s no upper bound on the lengths of proofs we might need in arithmetic, so now we’re in a situation where there’s no upper bound on the “complexity of the process” that we might need in physics to establish whether it’s possible to go faster than light.

Still, just because something is in general undecidable, it doesn’t mean we won’t be able to figure it out. Maybe we’ll have to give up on transporting ordinary material faster than light, and we’ll only be dealing with some specially crafted form of information. But there’s no reason to think that, with an objective as broad as “somehow go faster than light”, we won’t be able to, in effect, find some pocket of computational reducibility that makes it possible for us to do it.

And the fact is that the history of engineering is full of cases where an initial glimmer of possibility was eventually turned into large-scale technological success. “Can one achieve heavier-than-air flight?” There were detailed hydrodynamic effects, and there were pieces of what later became control theory. And eventually there was an engineering construction that made it work.

It’s hard to predict the process of engineering innovation. We’ve known the basic physics around controlled nuclear fusion for more than half a century. But when will we actually make it work as an engineering reality? Right now the idea of hacking space to go faster than light seems far away from anything we could in practice do. But we have no idea how high—or low—the barrier actually is.

Might it require having our own little black hole? Or might it be something that just requires putting together things we already have in just the right way? Not long ago it was completely unclear that we could “beat the uncertainty principle” enough to measure gravitational waves. Or that we could build an atomic force microscope that could move individual atoms around. Or that we could form a physical Bose–Einstein condensate. But in cases like these it turned out that we already had the “raw materials” we needed; we just had to figure out what to do with them.

A few years ago, when I was trying to make up fictional science for the movie Arrival, I thought a little about how a present-day physicist might think about the mechanism for an interstellar spacecraft that showed up one day. It was before our current models, but I had already thought a lot about the potential discrete structure of spacetime. And the best fictional idea I came up with then about how to “access it” was through some kind of “gravitational laser”. Gravitons, like photons, are bosons that can in principle form quantum condensates. And at least at the level of a made-for-a-movie whiteboard I figured out a little of how this might work.

But from what we know now, there are other ideas. Perhaps the best analogy—at least for “communication” if not “travel”—is that one’s trying to get a signal “through a complex medium” as efficiently as possible. And that’s of course been the basic problem forever in communications systems based on electrical, electromagnetic or optical processes.

Often it’s been claimed that there’s some fundamental limit, say to transmission rates. But then an engineering solution will be found that overcomes it. And actually the typical mechanism used is a little like our demons. If one’s signal is going to be degraded by “noise”, figure out how to predict the noise, then “sculpt” the process of transmission around it. In 5G technology for example, there’s even an explicit concept of “pilot signals” that continually probe the local radio environment so that actual communication signals can be formed in just the right ways.

But, OK, let’s say there is a practical way to go faster than light, or at least to send signals faster than light. Then why aren’t we seeing lots of more-advanced-than-us extraterrestrial intelligences doing this all over the universe? Maybe we just have to figure out the right engineering trick and then we’ll immediately be able to tap into a universe-scale conversation. And while it’s fun to imagine just how wild the social network of the universe might get, I think there’s a fundamental problem here (even beyond the “what’s really the use case?”). Let’s say we can see processes that correspond to faster-than-light communication. Are they part of a “conversation”, “saying” something meaningful? Or are they just “physical processes” that are going on?

Well, of course, anything that happens in the universe is, essentially by definition, a “physical process”. So then we might start talking about whether what we’re seeing is an “intentionally created” physical process, or one that’s just “happening naturally”. But—as I’ve written extensively about elsewhere—it’s a slippery slope. And the Principle of Computational Equivalence basically tells us that in the end we’ll never be able to distinguish the “intelligent” from the “merely computational”, or, given our model of physics, the “merely physical”—at least unless what we’re seeing is aligned in detail with our particular human ways of thinking.

At the outset we might have imagined that going faster than light was an open-and-shut case, and that physics had basically proved that—despite a few seemingly pathological examples in general relativity—it isn’t possible. I hope what’s become clear here is that actually the opposite is true. In our models of physics, going faster than light is almost inevitably possible in principle. But to actually do it requires engineering that may be irreducibly difficult.

But maybe it’s like in 1687 when a then-new model of physics implied that artificial satellites might be possible. After 270 years of steady engineering progress, there they were. And so it may be with going faster than light. Our models now suggest it’s possible. But whether the engineering required can be done in ten, a hundred, a thousand, a million or a billion years we don’t know. But maybe at least there’s now a path to turn yet another “pure-science-fiction impossibility” into reality.


A Few Questions

My talk at NASA generated many questions. Here are a few answers.

What about warp bubbles and the Alcubierre metric?

Warp bubbles are a clever way to get something a bit like faster-than-light travel in ordinary general relativity. The basic idea is to set up a solution to Einstein’s equations in which space is “rapidly contracting” in front of a “bubble region”, and expanding behind it:

expansion = (σ Coth[R σ] (-Sech[σ (-R + Sqrt[(x[] - xs[t[]])^2 + y[]^2 + z[]^2])]^2 + 
        Sech[σ (R + Sqrt[(x[] - xs[t[]])^2 + y[]^2 + z[]^2])]^2) (x[] - xs[t[]]) xs'[t[]])/
    Sqrt[x[]^2 - 2 x[] xs[t[]] + xs[t[]]^2 + y[]^2 + z[]^2];

expansionF[x_, ρ_] = 
  expansion /. {σ -> 8, R -> 1} /. {xs -> Function[t, t]} /. 
   {t[] -> 0, x[] -> x, y[]^2 -> ρ^2 - z[]^2};

Plot3D[expansionF[x, ρ], {x, -2, 2}, {ρ, -2, 2}, PlotRange -> All, 
 MaxRecursion -> 5, Boxed -> False, Axes -> None, Mesh -> 30]

To maintain this configuration, one needs negative mass on each side of the bubble:

density = -((σ^2 Cosh[R σ]^4 Sech[σ (-R + Sqrt[x[]^2 - 2 x[] xs[t[]] + xs[t[]]^2 + y[]^2 + z[]^2])]^4 *
       Sech[σ (R + Sqrt[x[]^2 - 2 x[] xs[t[]] + xs[t[]]^2 + y[]^2 + z[]^2])]^4 *
       Sinh[2 σ Sqrt[x[]^2 - 2 x[] xs[t[]] + xs[t[]]^2 + y[]^2 + z[]^2]]^2 *
       (y[]^2 + z[]^2) xs'[t[]]^2)/
     (4 (x[]^2 - 2 x[] xs[t[]] + xs[t[]]^2 + y[]^2 + z[]^2)));

densityF[x_, ρ_] = 
  density /. {σ -> 8, R -> 1} /. {xs -> Function[t, t]} /. 
   {t[] -> 0, x[] -> x, y[]^2 -> ρ^2 - z[]^2}

-((16 ρ^2 Cosh[8]^4 Sech[8 (-1 + Sqrt[x^2 + ρ^2])]^4 Sech[8 (1 + Sqrt[x^2 + ρ^2])]^4 Sinh[16 Sqrt[x^2 + ρ^2]]^2)/(x^2 + ρ^2))

Plot3D[-densityF[x, ρ], {x, -2, 2}, {ρ, -2, 2}, PlotRange -> All, 
 MaxRecursion -> 5, Boxed -> False, Axes -> None, Mesh -> 30, PlotStyle -> LightYellow]

In a sense it’s like an asymmetric local analog of the expansion of the universe. Inside the bubble space is flat. But other parts of the universe are approaching or receding as a result of the contraction and expansion of space. And in fact this is happening so rapidly that (1) the bubble is effectively moving faster than light relative to the rest of the universe, and (2) there’s an event horizon around the bubble, so nothing can go in or out.

It’s rather easy to make a toy version of this within our models; here’s the corresponding causal graph:

Graph
&#10005
Graph[ResourceFunction["SubstitutionSystemCausalGraph"][
  ResourceFunction["SubstitutionSystemCausalEvolution"][{"xo" -> "ox",
     "Xo" -> "Xo", "oX" -> "oX"}, "xoxoxoxoxoxooXoXoXxoxoxoxoxoxo", 
   5], "CausalGraph" -> True, "ColorTable" -> (LightGray &)], 
 GraphLayout -> "LayeredDigraphEmbedding"]

“Reconstructions of space” will then show that “parts of space” can “slip past others”, “as fast as they want”—but without causal interaction. Our space demon / space tunnel setup is rather different: there are no horizons involved; the whole point is to trace causal connections, but then to see how these map onto space.

What about quantum teleportation?

In quantum teleportation, there’s some sense in which different quantum measurements seem to “communicate faster than light”. But there’s always a slower-than-light back channel that sets up the measurements. In our models, the whole phenomenon is fairly easy to see. It involves measurement inducing “communication” through causal connections in the multiway causal graph, but the point is that these are branchlike edges, not spacelike ones—so there’s no “travel through physical space”. (A whole different issue is limitations on quantum teleportation associated with the maximum entanglement speed ζ.)

Our Mission and the Opportunity of Artifacts from the Future


In preparing my keynote at our 31st annual technology conference, I tried to collect some of my thoughts about our long-term mission and how I view the opportunities it is creating…

What I’ve Spent My Life On

I’ve been fortunate to live at a time in history when there’s a transformational intellectual development: the rise of computation and the computational paradigm. And I’ve devoted my adult life to doing what I can to make computation and the computational method achieve their potential, both intellectually and in the world at large. I’ve alternated (about five times so far) between doing this with basic science and with practical technology, each time building on what I’ve been able to do before.

The basic science has shown me the immense power and potential of what’s out there in the computational universe: the capability of even simple programs to generate behavior of immense complexity, including, I now believe, the fundamental physics of our whole universe. But how can we humans harness all that power and potential? How do we use the computational universe to achieve things we want: to take our human objectives and automate achieving them?

I’ve now spent four decades in an effort to build a bridge between what’s possible with computation, and what we humans care about and think about. It’s a story of technology, but it’s also a story of big and deep ideas. And the result has been the creation of the first and only full-scale computational language—that we now call the Wolfram Language.

The goal of our computational language is to define a medium for expressing our thoughts in computational terms—whether they be about abstract things or real things in the actual world. We want a language that both helps us think in a new way, and lets us communicate with actual computers that can automate working out their consequences. It’s a powerful combination, not really like anything seen before in history.

When I began on this path a little more than forty years ago, I only understood a small part of what a full-scale computational language would give us, and just how far it would diverge from the aspirations of programming languages. But with every passing year—particularly as we develop our language ever further—I see yet more of what’s possible. Along the way it’s brought us Mathematica, Wolfram|Alpha, my A New Kind of Science and now our Physics Project. It’s delivered the tools for countless inventions and discoveries, as well as the education of several generations of students. And it’s become a unique part of the technology stack for some of the world’s largest companies.

And, yes, it’s nice to see that validation of the bold vision of computational language. But even after all these years we’re still only at the very beginning of what’s possible. Computation has the potential to change so much for so many people. For every field X there’s going to be a computational X, and it’s going to be dramatically more powerful, more accessible, and more general than anything that came before. We’re seeing a major watershed in intellectual history.

There was a precursor four hundred years ago—when mathematical notation for the first time provided a streamlined way to represent and think about mathematics, and led to algebra, calculus and the mathematical sciences and engineering we have today. But computation is much bigger than mathematics, with much more far-reaching consequences. It affects not just the “technical layer” of understanding the world, but the full spectrum of how we think about the world, what we can create in it and what can happen in it. And now, with our computational language, we have a medium—a notation—for humans and computers to together take advantage of this.

We’re at a moment of great potential. For the first time, we have broad access to the power of the computational paradigm. But just what can be done with this, and by whom? There’s been a trend for the front lines of thinking to become increasingly specialized and inaccessible. But rather like literacy half a millennium ago, computation and computational language provide the potential to open things up: to give us a framework in which pretty much anyone can partake in front-line thinking, now with the clarity and concreteness of computation, and with the practical assistance of computers.

The arrival of the computational paradigm—and computational language—is the single largest change in the content of education since the advent of public education a century or so ago. But whatever practical difficulty it may cause, I view it as a critical responsibility to educate future generations to be able to take advantage of the power of computation—and to make the rise of computation and everything it brings be what our time in history is most remembered for.

The Inexorable Future & Its Opportunities

In the history of ideas, some things are inexorable. And the rise of the computational paradigm is one of those things. I have seen it myself over the course of nearly half a century. From “computation is just for specialists”, to “computation is useful in lots of places”, to “everyone should know about computation”, to a dawning awareness that “computation is a way of thinking about the world”. But this is just a foretaste of what is to come.

Computation is an incredibly general and powerful concept—which now indeed appears to be fundamental to our whole universe—and it seems inevitable that in time computation will provide the framework for describing and thinking about pretty much everything. But how will this actually work? We already know: computational language is the key.

And there is an inexorability to this as well. In the early days of computing, one programmed directly in the machine code of the computer. But slowly programming languages developed that gave us more convenient ways to describe and organize what we wanted to tell computers to do. Over time the languages gradually got higher and higher level, abstracting further and further away from the details of the operations of the computer.

It’s a pretty big jump to go to our modern conception of computational language, but it’s an inevitable one. Unlike programming languages—which are about describing what computers should do—my concept with the Wolfram Language is to have a way to represent everything in computational terms, for both computers and humans.

Over the past 40 years I’ve gradually understood more and more about how to construct a computer language for everything, and gradually we’ve been covering more and more with the Wolfram Language. But the endpoint is clear: to have a symbolic, computational representation for everything we humans choose to describe and work with in the world.

Some parts of this vision were absorbed quickly after we first delivered them. Mathematica as a “system for doing mathematics by computer” took only a few years to sweep across theoretical science. But even our concept of notebooks (which I always considered quite straightforward) took a solid quarter of a century to be widely absorbed, and copied.

Part of my original motivation for building the Wolfram Language was to have a tool that I could use myself, and it has turned out to be vastly more powerful than I could ever have imagined. It’s always a pleasure to see what people do with the Wolfram Language. Whether they’re distinguished leaders in their fields, or young students, they somehow seem to have a superpower that they can apply.

Yes, at some point in the future, the whole concept of computational language—and what we’ve done with Wolfram Language—will be part of what everyone takes for granted. But even after all these years, pretty much whenever I demo what we can do, many people still seem to think it’s magic. It’s as if I’m bringing an artifact from the future.

For oneself there’s no question that it’s fun—and valuable—to have an artifact from the future to use. But I feel a strong responsibility to try to bring everyone to the future—and to let everyone take advantage of the power of the computational paradigm as soon as possible.

I used to think that it wouldn’t take too long for this to just happen. But I’m realizing that the timescales are much, much longer than I imagined. Our physics project, for example, I first conceptualized 25 years ago, and nearly 20 years ago millions of people were exposed to it. Yet had it not been for a fortunate coincidence a year or so ago, I think the project could easily have languished for 50 years.

What about the whole concept of computational language? Some parts of it are quickly absorbed. But the further we go, the longer it’s going to take for the full story to be absorbed, and at this point it seems we’re looking at timescales of at least 50 years and perhaps 100 or more.

I’ve always wanted to build the best engine for innovation that I can. And for the past 34 years that’s been our company—which I’ve worked hard to optimize to consistently develop and deliver the best technology we can. I’ve considered other models, but what we’ve built seems basically unique in its ability to consistently sustain highly innovative R&D over the course of decades.

Over the years, our company has become more and more of an outlier in the technology world. Yes, we’re a company. But our focus is not so much commercial as intellectual. And I view what we’re doing more as a mission than a business. We want to build the computational future, and we want to do that by creating the technology to make that possible.

By now we’ve built a tower that reaches into the distant future, and we’re energetically working to extend it even further. It’s wonderful to see our community of users enabled by what we’re building—and to see the things they’re able to do.

But so far it’s still a comparatively small number of people who can harness artifacts from the future to do magic today. At some level it’s a shame it isn’t more widespread. But of course, it creates some amazing opportunities.

Who will bring computational language to this or that field? Who will write the definitive book or do the definitive research that leverages computational language in some particular way? Who will have the pleasure of seeing all those epiphanies as, one by one, people learn what the computational paradigm can do? Who will really develop the large-scale communities and disciplines enabled by the computational paradigm?

It has been wonderful to plant the seeds to make all these things possible, and I personally look forward to continuing to push further into the computational future. But more than that, I hope to see an increasing number of other people take advantage of all the opportunities there are for bringing what now seem like artifacts from the future to the benefit of the world today.

Combinators: A Centennial View



Ultimate Symbolic Abstraction

Before Turing machines, before lambda calculus—even before Gödel’s theorem—there were combinators. They were the very first abstract examples ever to be constructed of what we now know as universal computation—and they were first presented on December 7, 1920. In an alternative version of history our whole computing infrastructure might have been built on them. But as it is, for a century, they have remained for the most part a kind of curiosity—and a pinnacle of abstraction, and obscurity.

It’s not hard to see why. In their original form from 1920, there were two basic combinators, s and k, which followed the simple replacement rules (now represented very cleanly in terms of patterns in the Wolfram Language):

s
&#10005
s[x_][y_][z_] -> x[z][y[z]]
k
&#10005
k[x_][y_] -> x
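
So, for example, a single application of each rule, using ordinary /. pattern replacement (a, b, c here are just arbitrary symbolic placeholders):

s[a][b][c] /. s[x_][y_][z_] -> x[z][y[z]]
(* a[c][b[c]] *)
k[a][b] /. k[x_][y_] -> x
(* a *)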

The idea was that any symbolic structure could be generated from some combination of s’s and k’s. As an example, consider a[b[a][c]]. We’re not saying what a, b and c are; they’re just symbolic objects. But given a, b and c how do we construct a[b[a][c]]? Well, we can do it with the s, k combinators.

Consider the (admittedly obscure) object

s
&#10005
s[s[k[s]][s[k[k]][s[k[s]][k]]]][s[k[s[s[k][k]]]][k]]

(sometimes instead written S(S(KS)(S(KK)(S(KS)K)))(S(K(S(SKK)))K)).

Now treat this like a function and apply it to a, b, c, forming s[s[k[s]][s[k[k]][s[k[s]][k]]]][s[k[s[s[k][k]]]][k]][a][b][c]. Then watch what happens when we repeatedly use the s, k combinator replacement rules:

CloudGet
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/\
Programs.wl"]; CombinatorEvolutionPlot[
 CombinatorFixedPointList[
  s[s[k[s]][s[k[k]][s[k[s]][k]]]][s[k[s[s[k][k]]]][k]][a][b][
   c]], "StatesDisplay"]

Or, a tiny bit less obscurely:

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/\
Programs.wl"]; Magnify[
 CombinatorEvolutionPlot[(CombinatorPlot[#, "FramedMatches"] & /@ 
    CombinatorFixedPointList[
     s[s[k[s]][s[k[k]][s[k[s]][k]]]][s[k[s[s[k][k]]]][k]][a][b][c]]), 
  "StatesDisplay"], .9]

After a number of steps, we get a[b[a][c]]! And the point is that whatever symbolic construction we want, we can always set up some combination of s’s and k’s that will eventually do it for us—and ultimately be computation universal. They’re equivalent to Turing machines, lambda calculus and all those other systems we know are universal. But they were discovered before any of these systems.

By the way, here’s the Wolfram Language way to get the result above (//. repeatedly applies the rules until nothing changes anymore):

s
&#10005
s[s[k[s]][s[k[k]][s[k[s]][k]]]][s[k[s[s[k][k]]]][k]][a][b][
  c] //. {s[x_][y_][z_] -> x[z][y[z]], k[x_][y_] -> x}

And, yes, it’s no accident that it’s extremely easy and natural to work with combinators in the Wolfram Language—because in fact combinators were part of the deep ancestry of the core design of the Wolfram Language.

For me, though, combinators also have another profound personal resonance. They’re examples of very simple computational systems that turn out (as we’ll see at length here) to show the same remarkable complexity of behavior that I’ve spent so many years studying across the computational universe.

A century ago—particularly without actual computers on which to do experiments—the conceptual framework that I’ve developed for thinking about the computational universe didn’t exist. But I’ve always thought that of all systems, combinators were perhaps the earliest great “near miss” to what I’ve ended up discovering in the computational universe.

Computing with Combinators

Let’s say we want to use combinators to do a computation on something. The first question is: how should we represent the “something”? Well, the obvious answer is: just use structures built out of combinators!

For example, let’s say we want to represent integers. Here’s an (at first bizarre-seeming) way to do that. Take s[k] and repeatedly apply s[s[k[s]][k]]. Then we’ll get a sequence of combinator expressions:

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorEvolutionPlot[
 Append[NestList[s[s[k[s]][k]], s[k], 
   5], \[VerticalEllipsis]], "StatesDisplay"]

On their own, these expressions are inert under the s and k rules. But take each one (say e) and form e[s][k]. Here’s what happens for example to the third case above when you then apply the s and k rules:

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/\
Programs.wl"]; CombinatorEvolutionPlot[
 CombinatorFixedPointList[
  Nest[s[s[k[s]][k]], s[k], 2][s][k]], "StatesDisplay"]

To get this in the Wolfram Language, we can use Nest, which nestedly applies functions:

Nest
&#10005
Nest[f, x, 4]

Then the final result above is obtained as:

Nest
&#10005
Nest[s[s[k[s]][k]], s[k], 2][s][k] //. {s[x_][y_][z_] -> x[z][y[z]], 
  k[x_][y_] -> x}

Here’s an example involving nesting 7 times:

Nest
&#10005
Nest[s[s[k[s]][k]], s[k], 7][s][k] //. {s[x_][y_][z_] -> x[z][y[z]], 
  k[x_][y_] -> x}

So this gives us a (perhaps seemingly obscure) way to represent an integer n. Just form:

Nest
&#10005
Nest[s[s[k[s]][k]], s[k], n]

This is a combinator representation of n, which we can “decode” by applying it to [s][k]. OK, so given two integers represented this way, how would we add them together? Well, there’s a combinator for that! And here it is:

s[k[s]][s[k[s[k[s]]]][s[k[k]]]]
&#10005
s[k[s]][s[k[s[k[s]]]][s[k[k]]]]

If we call this plus, then let’s compute plus[1][2][s][k], where 1 and 2 are represented by combinators:

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"];
plus = s[k[s]][s[k[s[k[s]]]][s[k[k]]]];
integer[n_] := Nest[s[s[k[s]][k]], s[k], n]
CombinatorEvolutionPlot[
 CombinatorFixedPointList[plus[integer[1]][integer[2]][s][k]], 
 "StatesDisplay", Spacings -> .85, 
 BaseStyle -> {GrayLevel[.4], FontSize -> 9.8}]

It takes a while, but there’s the result: 1 + 2 = 3.
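
As a quick cross-check, the same 1 + 2 computation can be done with //. alone, using nothing but the two rules above (the expected normal form is just the combinator representation of 3):

plus = s[k[s]][s[k[s[k[s]]]][s[k[k]]]];
integer[n_] := Nest[s[s[k[s]][k]], s[k], n]
plus[integer[1]][integer[2]][s][k] //. {s[x_][y_][z_] -> x[z][y[z]], k[x_][y_] -> x}
(* s[s[s[k]]], i.e. the representation of 3 *)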

Here’s 4 + 3, giving the result s[s[s[s[s[s[s[k]]]]]]] (i.e. 7), albeit after 51 steps:

CombinatorFixedPointList
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"];
plus = s[k[s]][s[k[s[k[s]]]][s[k[k]]]];
integer[n_] := Nest[s[s[k[s]][k]], s[k], n]
Magnify[Column[
  CombinatorFixedPointList[plus[integer[4]][integer[3]][s][k]]], .3]

What about doing multiplication? There’s a combinator for that too, and it’s actually rather simple:

s
&#10005
s[k[s]][k]

Here’s the computation for 3 × 2—giving 6 after 58 steps:

CombinatorFixedPointList
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"];
integer[n_] := Nest[s[s[k[s]][k]], s[k], n]
times = s[k[s]][k];
Magnify[Column[
  CombinatorFixedPointList[times[integer[3]][integer[2]][s][k]]], .3]
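
And again as a sanity check with //. (nothing here beyond the rules and definitions already given):

integer[n_] := Nest[s[s[k[s]][k]], s[k], n]
times = s[k[s]][k];
times[integer[3]][integer[2]][s][k] //. {s[x_][y_][z_] -> x[z][y[z]], k[x_][y_] -> x}
(* s[s[s[s[s[s[k]]]]]], i.e. the representation of 6 *)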

Here’s a combinator for power:

s
&#10005
s[k[s[s[k][k]]]][k]

And here’s the computation of 3^2 using it (which takes 116 steps):

CombinatorFixedPointList
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"];
power = s[k[s[s[k][k]]]][k];
integer[n_] := Nest[s[s[k[s]][k]], s[k], n]
Magnify[Column[
  CombinatorFixedPointList[power[integer[3]][integer[2]][s][k]], 
  BaseStyle -> {FontWeight -> "Fat"}], .12]

One might think this is a crazy way to compute things. But what’s important is that it works, and, by the way, the basic idea for it was invented in 1920.

And while it might seem complicated, it’s very elegant. All you need are s and k. Then you can construct everything from them: functions, data, whatever.

So far we’re using what’s essentially a unary representation of numbers. But we can set up combinators to handle binary numbers instead. Or, for example, we can set up combinators to do logic operations.

Imagine having k stand for true, and s[k] stand for false (so, like If[p,x,y], k[x][y] gives x while s[k][x][y] gives y). Then the minimal combinator for And is just

s[s][k]
&#10005
s[s][k]

and we can check this works by computing a truth table (TT, TF, FT, FF):

CombinatorFixedPointList
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorEvolutionPlot[
 Map[If[LeafCount[#] <= 2, #, Magnify[#, .8]] &, 
  CombinatorFixedPointList /@ 
   Apply[s[s][k][#1][#2] &] /@ 
    Tuples[{s[k], k}, 2], {2}], "StatesDisplay", Spacings -> 2]
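
One can also check the individual cases directly with //., with k standing for true and s[k] for false as above:

skRules = {s[x_][y_][z_] -> x[z][y[z]], k[x_][y_] -> x};
and = s[s][k];
{and[k][k], and[k][s[k]], and[s[k]][k], and[s[k]][s[k]]} //. skRules
(* {k, s[k], s[k], s[k]} *)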

A search gives the minimal combinator expressions for the 16 possible 2-input Boolean functions:

SKTruthTable
&#10005

And by combining these (or even just copies of the one for Nand) one can make combinators that compute any possible Boolean function. And in fact in general one can—at least in principle—represent any computation by “compiling” it into combinators.

Here’s a more elaborate example, from my book A New Kind of Science. This is a combinator that represents one step in the evolution of the rule 110 cellular automaton:

Style
&#10005

And, here from the book, are representations of repeatedly applying this combinator to compute—with great effort—three steps in the evolution of rule 110:

Rule 110

There’s a little further to go, involving fixed-point combinators, etc. But basically, since we know that rule 110 is computation universal, this shows that combinators also are.

A Hundred Years Later…

Now that a century has passed, what should we think of combinators? In some sense, they still might be the purest way to represent computation that we know. But they’re also very hard for us humans to understand.

Still, as computation and the computational paradigm advance, and become more familiar, it seems like on many fronts we’re moving ever closer to core ideas of combinators. And indeed the foundational symbolic structure of the Wolfram Language—and much of what I’ve personally built over the past 40 years—can ultimately be seen as deeply informed by ideas that first arose in combinators.

Computation may be the single most powerful unifying intellectual concept ever. But the actual engineering development of computers and computing has tended to keep different aspects of it apart. There’s data. There are data types. There’s code. There are functions. There are variables. There’s flow of control. And, yes, it may be convenient to keep these things apart in the traditional approach to the engineering of computer systems. But it doesn’t need to be that way. And combinators show us that actually there’s no need to have any of these distinctions: everything can be together, and can be made of the same, dynamic “computational stuff”.

It’s a very powerful idea. But in its raw form, it’s also very disorienting for us humans. Because to understand things, we tend to rely on having “fixed anchors” to which we can attach meaning. And in pure, ever-changing seas of s, k combinators—like the ones we saw above—we just don’t have these.

Still, there’s a compromise—and in a sense that’s exactly what’s made it possible for me to build the full-scale computational language that the Wolfram Language now is. The point is that if we’re going to be able to represent everything in the world computationally we need the kind of unity and flexibility that combinator-like constructs provide. But we don’t just want raw, simple combinators. We need to in effect pre-define lots of combinator-like constructs that have particular meanings related to what we’re representing in the world.

At a practical level, the crucial idea is to represent everything as a symbolic expression, and then to say that evaluating these expressions consists in repeatedly applying transformations to them. And, yes, symbolic expressions in the Wolfram Language are just like the expressions we’ve made out of combinators—except that instead of involving only s’s and k’s, they involve thousands of different symbolic constructs that we define to represent molecules, or cities or polynomials. But the key point is that—like with combinators—the things we’re dealing with are always structurally just nested applications of pure symbolic objects.

Something we immediately learn from combinators is that “data” is really no different from “code”; they can both be represented as symbolic expressions. And both can be the raw material for computation. We also learn that “data” doesn’t have to maintain any particular type or structure; not only its content, but also the way it is built up as a symbolic expression can be the dynamic output of a computation.

One might imagine that things like this would just be esoteric matters of principle. But what I’ve learned in building the Wolfram Language is that actually they’re natural and crucially important in having convenient ways to capture computationally how we humans think about things, and the way the world is.

From the early days of practical computing, there was an immediate instinct to imagine that programs should be set up as sequences of instructions saying for example “take a thing, then do this to it, then do that” and so on. The result would be a “procedural” program like:

x = f
&#10005
x = f[x]; x = g[x]; x = h[x]; x

But as the combinator approach suggests, there’s a conceptually much simpler way to write this in which one’s just successively applying functions, to make a “functional” program:

h
&#10005
h[g[f[x]]]

(In the Wolfram Language, this can also be written h@g@f@x or x//f//g//h.)

Given the notion that everything is a symbolic expression, one’s immediately led to have functions to operate on other functions, like

Nest
&#10005
Nest[f, x, 6]

or:

ReverseApplied
&#10005
ReverseApplied[f][a, b]

This idea of such “higher-order functions” is quintessentially combinator informed—and very elegant and powerful. And as the years go by we’re gradually managing to see how to make more and more aspects of it understandable and accessible in the Wolfram Language (think: Fold, MapThread, SubsetMap, FoldPair, …).
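
Just as a reminder of how a couple of these behave on purely symbolic arguments:

Fold[f, x, {a, b, c}]
(* f[f[f[x, a], b], c] *)
MapThread[f, {{a, b, c}, {x, y, z}}]
(* {f[a, x], f[b, y], f[c, z]} *)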

OK, but there’s one more thing combinators do—and it’s their most famous: they allow one to set things up so that one never needs to define variables or name things. In typical programming one might write things like:

With
&#10005
With[{x = 3}, 1 + x^2]
f
&#10005
f[x_] := 1 + x^2
Function
&#10005
Function[x, 1 + x^2]
x |-> 1 + x^2
&#10005
x |-> 1 + x^2

But in none of these cases does it matter what the actual name x is. The x is just a placeholder that’s standing for something one’s “passing around” in one’s code.

But why can’t one just “do the plumbing” of specifying how something should be passed around, without explicitly naming anything? In a sense a nested sequence of functions like f[g[x]] is doing a simple case of this; we’re not giving a name to the result of g[x]; we’re just feeding it as input to f in a “single pipe”. And by setting up something like Function[x, 1+x^2] we’re constructing a function that doesn’t have a name, but which we can still apply to things:

Function
&#10005
Function[x, 1 + x^2][4]

The Wolfram Language gives us an easy way to get rid of the x here too:

(1 + #^2) &
&#10005
(1 + #^2) &[4]

In a sense the # (“slot”) here acts a bit like a pronoun in a natural language: we’re saying that whatever we’re dealing with (which we’re not going to name), we want to find “one plus the square of it”.

OK, but so what about the general case? Well, that’s what combinators provide a way to do.

Consider an expression like:

f
&#10005
f[g[x][y]][y]

Imagine this was called q, and that we wanted q[x][y] to give f[g[x][y]][y]. Is there a way to define q without ever mentioning names of variables? Yes, here’s how to do it with s, k combinators:

CombinatorEvolutionPlot
&#10005
CombinatorEvolutionPlot[{SKCombinatorCompile[
   f[g[x][y]][y], {x, y}]}, "StatesDisplay", 
 "DisplayStyles" -> {s -> Style[s, Black, FontWeight -> "SemiBold"], 
   k -> Style[k, Black, FontWeight -> "SemiBold"], 
   g -> Style[g, Gray], f -> Style[f, Gray]}]

There’s no mention of x and y here; the combinator structure is just defining—without naming anything—how to “flow in” whatever one provides as “arguments”. Let’s watch it happen:

CombinatorEvolutionPlot
&#10005
CombinatorEvolutionPlot[
 CombinatorFixedPointList[
  s[s[k[s]][s[k[s[k[f]]]][g]]][k[s[k][k]]][x][y]], "StatesDisplay", 
 "DisplayStyles" -> {s -> Style[s, Black, FontWeight -> "SemiBold"], 
   k -> Style[k, Black, FontWeight -> "SemiBold"], 
   g -> Style[g, Gray], f -> Style[f, Gray], x -> Style[x, Gray], 
   y -> Style[y, Gray]}]

Yes, it seems utterly obscure. And try as I might over the years to find a usefully human-understandable “packaging” of this that we could build into the Wolfram Language, I have so far failed.

But it’s very interesting—and inspirational—that there’s even in principle a way to avoid all named variables. Yes, it’s often not a problem to use named variables in writing programs, and the names may even communicate useful information. But there are all sorts of tangles they can get one into.

It’s particularly bad when a name is somehow global, and assigning a value to it affects (potentially insidiously) everything one’s doing. But even if one keeps the scope of a name localized, there are still plenty of problems that can occur.

Consider for example:

Function
&#10005
Function[x, Function[y, 2 x + y]]

It’s two nested anonymous functions (AKA lambdas)—and here the x “gets” a, and y “gets” b:

Function
&#10005
Function[x, Function[y, 2 x + y]][a][b]

But what about this:

Function
&#10005
Function[x, Function[x, 2 x + x]]

The Wolfram Language conveniently colors things red to indicate that something bad is going on. We’ve got a clash of names, and we don’t know “which x” is supposed to refer to what.

It’s a pretty general problem; it happens even in natural language. If we write “Jane chased Bob. Jane ran fast.” it’s pretty clear what we’re saying. But “Jane chased Jane. Jane ran fast.” is already confused. In natural language, we avoid names with pronouns (which are basically the analog of # in the Wolfram Language). And because of the (traditional) gender setup in English “Jane chased Bob. She ran fast.” happens to work. But “The cat chased the mouse. It ran fast.” again doesn’t.

But combinators solve all this, by in effect giving a symbolic procedure to describe what reference goes where. And, yes, by now computers can easily follow this (at least if they deal with symbolic expressions, like in the Wolfram Language). But the passage of a century—and even our experience with computation—doesn’t seem to have made it much easier for us humans to follow it.

By the way, it’s worth mentioning one more “famous” feature of combinators—that actually had been independently invented before combinators—and that these days, rather ahistorically, usually goes by the name “currying”. It’s pretty common—say in the Wolfram Language—to have functions that naturally take multiple arguments. GeoDistance[a, b] or Plus[a, b, c] (or a+b+c) are examples. But in trying to uniformize as much as possible, combinators just make all “functions” nominally have only one argument.

To set up things that “really” have multiple arguments, one uses structures like f[x][y][z]. From the point of view of standard mathematics, this is very weird: one expects “functions” to just “take an argument and return a result”, and “map one space to another” (say real numbers to complex numbers).

But if one’s thinking “sufficiently symbolically” it’s fine. And in the Wolfram Language—with its fundamentally symbolic character (and distant ancestry in combinator concepts)—one can just as well make a definition like

f
&#10005
f[x_][y_] := x + y

as:

f
&#10005
f[x_, y_] := x + y
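
One small convenience of the curried form is that a partially applied function can be passed around as an operator. For instance, with the f[x_][y_] := x + y definition above:

f[10] /@ {1, 2, 3}
(* {11, 12, 13} *)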

Back in 1980—even though I don’t think I knew about combinators yet at that time—I actually tried the idea of having f[x][y] be equivalent to f[x,y] in my SMP system, a predecessor to the Wolfram Language. But it was a bit like forcing every verb to be intransitive—and there were many situations in which it was quite unnatural, and hard to understand.

Combinators in the Wild: Some Zoology

So far we’ve been talking about combinators that are set up to compute specific things that we want to compute. But what if we just pick possible combinators “from the wild”, say at random? What will they do?

In the past, that might not have seemed like a question that was worth asking. But I’ve now spent decades studying the abstract computational universe of simple programs—and building a whole “new kind of science” around the discoveries I’ve made about how they behave. And with that conceptual framework it now becomes very interesting to look at combinators “in the wild” and see how they behave.

So let’s begin at the beginning. The simplest s, k combinator expressions that can even potentially change under the combinator rules have to have size 3. There are a total of 16 expressions of this size:

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"];
						CombinatorEvolutionPlot[{#}, "StatesDisplay"] & /@ 
 EnumerateCombinators[3]

And none of them do anything interesting: they either don’t change at all, or, as in for example k[s][s], they immediately give a single symbol (here s).

But what about larger combinator expressions? The total number of possible combinator expressions of size n grows like

Table
&#10005
Table[2^n CatalanNumber[n - 1], {n, 10}]

or in general

2^n CatalanNumber
&#10005
2^n CatalanNumber[n - 1] == (2^n Binomial[2 n - 2, n - 1])/n
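
Concretely, evaluating this formula for sizes 1 through 8 gives counts that will show up again below:

Table[2^n CatalanNumber[n - 1], {n, 8}]
(* {2, 4, 16, 80, 448, 2688, 16896, 109824} *)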

or asymptotically:

Asymptotic
&#10005

At size 4, again nothing too interesting happens. Among all 80 possible expressions, the longest it takes to reach a fixed point is 3 steps, and that happens in 4 cases:

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorEvolutionPlot[
 CombinatorFixedPointList /@ {s[k][s][s], s[k][s][k], s[k][k][s], 
   s[k][k][k]}, "StatesDisplay", Spacings -> 2]

At size 5, the longest it takes to reach a fixed point is 4 steps, and that happens in 10 cases out of 448:

CloudGet
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Column[
 CombinatorEvolutionPlot[#, "StatesDisplay", ItemSize -> 10] & /@ 
  Partition[
   CombinatorFixedPointList /@ 
     Select[EnumerateCombinators[5], 
      Length[CombinatorFixedPointList[#]] == 4 &] /. {s -> 
      Style[s, Black, FontWeight -> "SemiBold"], 
     k -> Style[k, Black, FontWeight -> "SemiBold"], 
     a -> Style[a, Black], b -> Style[b, Black], 
     c -> Style[c, Black]}, 5], Spacings -> 1]

At size 6, there is a slightly broader distribution of “halting times”:

CloudGet
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Histogram[
 Length /@ CombinatorFixedPointList /@ EnumerateCombinators[6], {1}, 
 Frame -> True, ChartStyle -> $PlotStyles["Histogram", "ChartStyle"], 
 ImageSize -> 200]

The longest halting time is 7, achieved by:

CloudGet
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Magnify[
 CombinatorEvolutionPlot[
  CombinatorFixedPointList /@ 
   Select[EnumerateCombinators[6], 
    Length[CombinatorFixedPointList[#]] == 7 &], "StatesDisplay", 
  Spacings -> 2], .9]

Meanwhile, the largest expressions created are of size 10 (in the sense that they contain a total of 10 s’s or k’s):

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Magnify[
 CombinatorEvolutionPlot[
  PadRight[CombinatorFixedPointList /@ 
    Select[EnumerateCombinators[6], 
     LeafCount[CombinatorFixedPoint[#]] == 10 &], {Automatic, 
    Automatic}, ""], "StatesDisplay", Spacings -> 2], .75]

The distribution of final sizes is a little odd:

Histogram
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Histogram[
 LeafCount[CombinatorFixedPoint[#]] & /@ EnumerateCombinators[6], {1},
  Frame -> True, ChartStyle -> $PlotStyles["Histogram", "ChartStyle"],
  ImageSize -> 200]

For size n ≤ 5, there’s actually a gap with no final states of size n – 1 generated. But at size 6, out of 2688 expressions, there are just 12 that give size 5 (about 0.4%).

OK, so what’s going to happen if we go to size 7? Now there are 16,896 possible expressions. And there’s something new: two never stabilize (S(SS)SSSS and SSS(SS)SS):

{s
&#10005
{s[s[s]][s][s][s][s], s[s][s][s[s]][s][s]}
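
For reading these off, here is a tiny ad hoc helper (defined just for illustration here; it is not part of anything standard) that renders a bracketed s, k expression in the compact string notation shown in parentheses:

toSK[s] = "S"; toSK[k] = "K";
toSK[f_[x_]] := toSK[f] <> If[AtomQ[x], toSK[x], "(" <> toSK[x] <> ")"]
toSK /@ {s[s[s]][s][s][s][s], s[s][s][s[s]][s][s]}
(* {"S(SS)SSSS", "SSS(SS)SS"} *)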

After one step, the first one of these evolves to the second, but then this is what happens over the next few steps (we’ll see other visualizations of this later):

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorEvolutionPlot[
 CombinatorEvolveList[s[s][s][s[s]][s][s], 8], "StatesDisplay"]

The total size (i.e. LeafCount, or “number of s’s”) grows like:

LeafCount
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"];
					LeafCount /@ CombinatorEvolveList[s[s][s][s[s]][s][s], 30]

A log plot shows that after an initial transient the size grows roughly exponentially:

s7lengths
&#10005

And looking at successive ratios one sees some elaborate fine structure:

s7lengths
&#10005

What is this ultimately doing? With a little effort, one finds that the sizes have a length-83 transient, followed by sequences of values of length 23 + 2n, in which the second differences of successive sizes are given by:

Join
&#10005
Join[38 {0, 0, 0, 12, -17} 2^n + {0, 1, 0, -135, 189}, Table[0, n], 
 38 {0, 1, 0, 0, 1, -1, 0, 0, 0, 4} 2^n + {12, -13, 0, 6, -7, 1, 0, 1,
    0, -27}, Table[0, n + 2], 
 228 {0, 1, 0, 0, 1, -1} 2^n + 2 {6, -20, 0, 3, -17, 14}]

The final sequence of sizes is obtained by concatenating these blocks and computing Accumulate[Accumulate[list]]—giving an asymptotic size that appears to be of the form . So, yes, we can ultimately “figure out what’s going on” with this little size-7 combinator (and we’ll see some more details later). But it’s remarkable how complicated it is.
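
Here is a rough sketch of the construction just described. It is purely illustrative: the length-83 transient and the constants of the double accumulation are omitted, so it reproduces only the general shape of the size sequence, not its exact values:

block[n_] := Join[38 {0, 0, 0, 12, -17} 2^n + {0, 1, 0, -135, 189}, Table[0, n],
  38 {0, 1, 0, 0, 1, -1, 0, 0, 0, 4} 2^n + {12, -13, 0, 6, -7, 1, 0, 1, 0, -27},
  Table[0, n + 2], 228 {0, 1, 0, 0, 1, -1} 2^n + 2 {6, -20, 0, 3, -17, 14}]
ListStepPlot[Accumulate[Accumulate[Flatten[block /@ Range[0, 6]]]], Frame -> True]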

OK, but let’s go back and look at the other size-7 expressions. The halting time distribution (ignoring the 2 cases that don’t halt) basically falls off exponentially, but shows a couple of outliers:

allres = ResourceFunction
&#10005

The maximum finite halting time is 16 steps, achieved by s[s[s[s]]][s][s][s] (S(S(SS))SSS):

CombinatorEvolutionPlot
&#10005

And the distribution of final sizes is (with the maximum of 41 being achieved by the maximum-halting-time expression we’ve just seen):

allfix7
&#10005

OK, so what happens at size 8? There are 109,824 possible combinator expressions. And it’s fairly easy to find out that all but 76 of these go to fixed points within at most 50 steps (the longest survivor is s[s][s][s[s[s]]][k][k] (SSS(S(SS))KK), which halts after 44 steps):

allres8t50
&#10005

The final fixed points in these cases are mostly quite small; this is the distribution of their sizes:

allres8t50
&#10005

And here is a comparison between halting times and final sizes:

allres8t50
&#10005

The outlier for size is s[s][k][s[s[s]][s]][s] (SSK(S(SS)S)S), which evolves in 27 steps to a fixed expression of size 80 (along the way reaching an intermediate expression of size 86):

ListStepPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"];
					ListStepPlot[
 LeafCount /@ CombinatorEvolveList[s[s][k][s[s[s]][s]][s], 33], 
 Frame -> True, Joined -> True, Filling -> Axis, 
 FillingStyle -> $PlotStyles["ListPlot", "FillingStyleDark"], 
 PlotStyle -> $PlotStyles["ListPlot", "PlotStyle"], ImageSize -> 250]

Among combinator expressions that halt in less than 50 steps, the maximum intermediate expression size of 275 is achieved for s[s][s][s[s[s][k]]][k] (SSS(S(SSK))K) (which ultimately evolves to s[s[s[s][k]]][k] (S(S(SSK))K) after 26 steps):

ListStepPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"];
						ListStepPlot[
 LeafCount /@ CombinatorEvolveList[s[s][s][s[s[s][k]]][k], 33], 
 Frame -> True, Joined -> True, Filling -> Axis, 
 FillingStyle -> $PlotStyles["ListPlot", "FillingStyleDark"], 
 PlotStyle -> $PlotStyles["ListPlot", "PlotStyle"], ImageSize -> 250]

So what about size-8 expressions that don’t halt after 50 steps? There are altogether 76—with 46 of these being inequivalent (in the sense that they don’t quickly evolve to others in the set).

Here’s how these 46 expressions grow (at least until they reach size 10,000):

inonterms
&#10005

Some of these actually end up halting. In fact, s[s][s][s[s]][s][k[k]] (SSS(SS)S(KK)) halts after just 52 steps, with final result k[s[k][k[s[k][k]]]] (K(SK(K(SKK)))), having achieved a maximum expression size of 433:

ListStepPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"];
						ListStepPlot[
 LeafCount /@ CombinatorEvolveList[s[s][s][s[s]][s][k[k]], 60], 
 Frame -> True, Joined -> True, Filling -> Axis, 
 FillingStyle -> $PlotStyles["ListPlot", "FillingStyleDark"], 
 PlotStyle -> $PlotStyles["ListPlot", "PlotStyle"], ImageSize -> 250]

The next shortest halting time occurs for s[s][s][s[s[s]]][k][s] (SSS(S(SS))KS), which takes 89 steps to produce an expression of size 65:

ListStepPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"];
						ListStepPlot[
 LeafCount /@ CombinatorEvolveList[s[s][s][s[s[s]]][k][s], 95], 
 Frame -> True, Joined -> True, Filling -> Axis, 
 FillingStyle -> $PlotStyles["ListPlot", "FillingStyleDark"], 
 PlotStyle -> $PlotStyles["ListPlot", "PlotStyle"], ImageSize -> 250]

Then we have s[s][s][s[s[s]]][s][k] (SSS(S(SS))SK), which halts (giving the size-10 s[k][s[s[s[s[s[s]]][s]]][k]] (SK(S(S(S(S(SS))S))K)), but only after 325 steps:

ListStepPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"];
						ListStepPlot[
 LeafCount /@ CombinatorEvolveList[s[s][s][s[s[s]]][s][k], 350], 
 Frame -> True, Joined -> True, Filling -> Axis, 
 FillingStyle -> $PlotStyles["ListPlot", "FillingStyleDark"], 
 PlotStyle -> $PlotStyles["ListPlot", "PlotStyle"], ImageSize -> 250]

There’s also a still-larger case to be seen: s[s[s][s]][s][s[s]][k] (S(SSS)S(SS)K), which exhibits an interesting “IntegerExponent-like” nested pattern of growth, but finally halts after 1958 steps, having achieved a maximum intermediate expression size of 21,720 along the way:

ListStepPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"];
						ListStepPlot[
 LeafCount /@ CombinatorFixedPointList[s[s[s[s][s]]][s][s][k]], 
 Frame -> True, Joined -> True, Filling -> Axis, 
 FillingStyle -> $PlotStyles["ListPlot", "FillingStyleDark"], 
 PlotStyle -> $PlotStyles["ListPlot", "PlotStyle"], ImageSize -> 250]

What about the other expressions? s[s][s][s[s]][s][s[k]] shows very regular growth in size:

ListStepPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"];
						ListStepPlot[
 LeafCount /@ CombinatorEvolveList[s[s][s][s[s]][s][s[k]], 300], 
 Frame -> True, Joined -> True, Filling -> Axis, 
 FillingStyle -> $PlotStyles["ListPlot", "FillingStyleDark"], 
 PlotStyle -> $PlotStyles["ListPlot", "PlotStyle"], ImageSize -> 250]

In the other cases, there’s no such obvious regularity. But one can start to get a sense of what happens by plotting differences between sizes on successive steps:

inonterms
&#10005

There are some obvious cases of regularity here. Several show a regular pattern of linearly increasing differences, implying overall t2 growth in size:

pairGraphic
&#10005

singleGraphic
&#10005

Others show regular growth in differences, leading to t3/2 growth in size:

pairGraphic
&#10005

pairGraphic
&#10005

Others have pure exponential growth:

pairLogGraphic
&#10005

There are quite a few that have regular but below-exponential growth, much like the size-7 case s[s][s][s[s]][s][s] (SSS(SS)SS) with ~ growth:

pairLogGraphic
&#10005

All the cases we’ve just looked at only involve s. When we allow k as well, there’s for example s[s][s][s[s[s][s]]][k] (SSS(S(SSS))K)—which shows regular, essentially “stair-like” growth:

CombinatorFixedPointList
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; ListStepPlot[
 LeafCount /@ 
  CombinatorFixedPointList[s[s][s][s[s[s][s]]][k], 
   "MaxSize" -> 100000, "MaxSteps" -> 2000], Frame -> True, 
 Joined -> True, Filling -> Axis, 
 FillingStyle -> $PlotStyles["ListPlot", "FillingStyleDark"], 
 PlotStyle -> $PlotStyles["ListPlot", "PlotStyle"], ImageSize -> 330]

There’s also a case like s[s[s]][s][s[s]][s][k] (S(SS)S(SS)SK):

CombinatorFixedPointList
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; ListStepPlot[
 LeafCount /@ 
  CombinatorFixedPointList[s[s[s]][s][s[s]][s][k], "MaxSize" -> 50000,
    "MaxSteps" -> 4000], Frame -> True, Joined -> True, 
 Filling -> Axis, 
 FillingStyle -> $PlotStyles["ListPlot", "FillingStyleDark"], 
 PlotStyle -> $PlotStyles["ListPlot", "PlotStyle"], 
 AspectRatio -> 1/3, ImageSize -> 600]

On a small scale, this appears somewhat regular, but the larger-scale structure, as revealed by taking differences, doesn’t seem so regular (though it does have a certain “IntegerExponent-like” look):

CloudGet
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; ListStepPlot[
 Differences[
  LeafCount /@ 
   CombinatorFixedPointList[s[s[s]][s][s[s]][s][k], 
    "MaxSize" -> 50000, "MaxSteps" -> 4000]], Frame -> True, 
 Joined -> True, Filling -> Axis, 
 FillingStyle -> $PlotStyles["ListPlot", "FillingStyleLight"], 
 PlotStyle -> $PlotStyles["ListPlot", "PlotStyle"], 
 AspectRatio -> 1/5, PlotRange -> All, ImageSize -> 620]

It’s not clear what will happen in this case. The overall form of the behavior looks a bit similar to examples above that eventually terminate. Continuing for 50,000 steps, though, here’s what’s happened:

CombinatorEvolve
&#10005

And in fact it turns out that the size-difference peaks continue to get higher—having values of the form 6 (17 × 2^n + 1) and occurring at positions of the form 2 (9 × 2^(n+2) + n – 18).

Here’s another example: s[s][s][s[s]][s][k[s]] (SSS(SS)S(KS)). The overall growth in this case—at least for 200 steps—looks somewhat irregular:

CloudGet
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; ListStepPlot[
 LeafCount /@ CombinatorEvolveList[s[s][s][s[s]][s][k[s]], 200], 
 Frame -> True, Joined -> True, Filling -> Axis, 
 FillingStyle -> $PlotStyles["ListPlot", "FillingStyleDark"], 
 PlotStyle -> $PlotStyles["ListPlot", "PlotStyle"], 
 AspectRatio -> 1/3]

And taking differences reveals a fairly complex pattern of behavior:

CloudGet
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; ListStepPlot[
 Differences[
  LeafCount /@ CombinatorEvolveList[s[s][s][s[s]][s][k[s]], 200]], 
 Frame -> True, Joined -> True, Filling -> Axis, 
 FillingStyle -> $PlotStyles["ListPlot", "FillingStyleLight"], 
 PlotStyle -> $PlotStyles["ListPlot", "PlotStyle"], PlotRange -> All, 
 AspectRatio -> 1/3, ImageSize -> 570]

But after 1000 steps there appears to be some regularity to be seen:

SKCombinatorLeftmostOutermostLeafCounts
&#10005

And even after 2000 steps the regularity is more obvious:

SKCombinatorLeftmostOutermostLeafCounts
&#10005

There’s a long transient, but after that there are systematic peaks in the size difference, with the nth peak having height 16487 + 3320 n and occurring at step 14 n^2 + 59 n + 284. (And, yes, it’s pretty weird to see all these strange numbers cropping up.)

What happens if we look at size-10 combinator expressions? There’s a lot of repeating of behavior that we’ve seen with smaller expressions. But some new things do seem to happen.

After 1000 steps s[s][k][s[s][k][s[s]][s]][k] (SSK(SSK(SS)S)K) seems to be doing something quite complicated when one looks at its size differences:

SKCombinatorLeftmostOutermostLeafCounts
&#10005

But it turns out that this is just a transient, and after 1000 steps or so, the system settles into a pattern of continual growth similar to ones we’ve seen before:

SKCombinatorLeftmostOutermostLeafCounts
&#10005

Another example is s[s][k][s[s][k][s[s]][s]][s] (SSK(SSK(SS)S)S). After 2000 steps there seems to be some regularity, and some irregularity:

SKCombinatorLeftmostOutermostLeafCounts
&#10005

And basically this continues:

SKCombinatorLeftmostOutermostLeafCounts
&#10005

s[s][s][s[s[s[k]]]][s][s[k]] (SSS(S(S(SK)))S(SK)) is a fairly rare example of “nested-like” growth that continues forever (after a million steps, the size obtained is 597,871,806):

SKCombinatorLeftmostOutermostLeafCounts
&#10005

As a final example, consider s[s[s]][s][s][s][s[s][k[k]]] (S(SS)SSS(SS(KK))). Here’s what this does for the first 1000 steps:

SKCombinatorLeftmostOutermostLeafCounts
&#10005

It looks somewhat complicated, but seems to be growing slowly. But then around step 4750 it suddenly jumps up, quickly reaching size 51,462:

SKCombinatorLeftmostOutermostLeafCounts
&#10005

Keep going further, and there are more jumps:

SKCombinatorLeftmostOutermostLeafCounts
&#10005

After 100,000 steps there’s a definite pattern of jumps—but it’s not quite regular:

ListStepPlot
&#10005

So what’s going to happen? Mostly it seems to be maintaining a size of a few thousand or more. But then, after 218,703 steps, it dips down, to size 319. So, one might think, perhaps it’s going to “die out”. Keep going longer, and at step 34,339,093 it gets down to size 27, even though by step 36,536,622 it’s at size 105,723.

Keep going even longer, and one sees it dipping down in size again (here shown in a downsampled log plot):

down = Transpose
&#10005

But, then, suddenly, boom. At step 137,356,329 it stops, reaching a fixed point of size 39. And, yes, it’s totally remarkable that a tiny combinator expression like s[s[s]][s][s][s][s[s][k[k]]] (S(SS)SSS(SS(KK))) can do all this.

If one hasn’t seen it before, this kind of complexity would be quite shocking. But after spending so long exploring the computational universe, I’ve become used to it. And now I just view each new case I see as yet more evidence for my Principle of Computational Equivalence.

A central fact about s, k combinators is that they’re computation universal. And this tells us that whatever computation we want to do, it’ll always be possible to “write a combinator program”—i.e. to create a combinator expression—that’ll do it. And from this it follows that—just like with the halting problem for Turing machines—the problem of whether a combinator will halt is in general undecidable.

But the new thing we’re seeing here is that it’s difficult to figure out what will happen not just “in general” for complicated expressions set up to do particular computations but also for simple combinator expressions that one might “find in the wild”. But the Principle of Computational Equivalence tells us why this happens.

Because it says that even simple programs—and simple combinator expressions—can lead to computations that are as sophisticated as anything. And this means that their behavior can be computationally irreducible, so that the only way to find out what will happen is essentially just to run each step and see what happens. So then if one wants to know what will happen in an infinite time, one may have to do an effectively infinite amount of computation to find out.

Might there be another way to formulate our questions about the behavior of combinators? Ultimately we could use any computation universal system to represent what combinators do. But some formulations may connect more immediately with existing ideas—say mathematical ones. And for example I think it’s conceivable that the sequences of combinator sizes we’ve seen above could be obtained in a more “direct numerical way”, perhaps from something like nestedly recursive functions (I discovered this particular example in 2003):

f
&#10005
f[n_] := 3 f[n - f[n - 1]]
f
&#10005
f[n_ /; n < 1] = 1
nrf
&#10005
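For reference, here’s what that recursive definition gives for the first few values (this is just a direct transcription of the rules above, with memoization added so the recursion stays fast; it doesn’t change the values):

Clear[f];
f[n_ /; n < 1] = 1;
f[n_] := f[n] = 3 f[n - f[n - 1]]  (* memoized form of the definition above *)
Table[f[n], {n, 1, 20}]
(* {3, 3, 3, 9, 3, 9, 3, 9, 3, 9, 9, 9, 27, 3, 27, 3, 9, 9, 27, 3} *)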

Visualizing Combinators

One of the issues in studying combinators is that it’s so hard to visualize what they’re doing. It’s not like with cellular automata where one can make arrays of black and white cells and readily use our visual system to get an impression of what’s going on. Consider for example the combinator evolution:

CloudGet
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorEvolutionPlot[
 CombinatorEvolveList[s[s][k][s[s[s]][s]][s], 7], "StatesDisplay"]

In a cellular automaton the rule would be operating on neighboring elements, and so there’d be locality to everything that’s happening. But here the combinator rules are effectively moving whole chunks around at a time, so it’s really hard to visually trace what’s happening.

But even before we get to this issue, can we make the mass of brackets and letters in something like

CombinatorEvolveList
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorEvolveList[s[s][k][s[s[s]][s]][s], 6] // Last

easier to read? For example, do we really need all those brackets? In the Wolfram Language, instead of writing

a
&#10005
a[b[c[d[e]]]]

we can equivalently write

a@b@c@d@e
&#10005
a@b@c@d@e

thereby avoiding brackets.
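A quick way to check the equivalence is to look at the FullForm, since @ groups to the right:

FullForm[a @ b @ c @ d @ e]
(* a[b[c[d[e]]]] *)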

But using @ doesn’t avoid all grouping indications. For example, to represent

a
&#10005
a[b][c][d][e]

with @ we’d have to write:

(((a@b)@c)@d)@e
&#10005
(((a@b)@c)@d)@e

In our combinator expression above, we had 24 pairs of brackets. By using @, we can reduce this to 10:

CombinatorPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorPlot[
 s[s[s[s]][k[s[s[s]][s]][s]]][
  k[s[s[s]][s]][s][
   s[s[s]][k[s[s[s]][s]][s]]]], "CharactersRightAssociative", 
 "ApplicationGlyph" -> 
  Style["\[NegativeVeryThinSpace]\[NegativeVeryThinSpace]@\
\[NegativeVeryThinSpace]", 11], "UseCombinatorGlyphs" -> None]

And we don’t really need to show the @, so we can make this smaller:

CombinatorPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/\
Programs.wl"];
CombinatorPlot[
 s[s[s[s]][k[s[s[s]][s]][s]]][
  k[s[s[s]][s]][s][
   s[s[s]][k[s[s[s]][s]][s]]]], "CharactersRightAssociative", 
 "ApplicationGlyph" -> "\[NegativeVeryThinSpace]", 
 "UseCombinatorGlyphs" -> None]

When combinators were first introduced a century ago, the focus was on “multi-argument-function-like” expressions such as a[b][c] (as appear in the rules for s and k), rather than on “nested-function-like” expressions such as a[b[c]]. So instead of thinking of function application as “right associative”—so that a[b[c]] can be written without parentheses as a@b@c—people instead thought of function application as left associative—so that a[b][c] could be written without parentheses. (Confusingly, people often used @ as the symbol for this left-associative function application.)

As it’s turned out, the f[g[x]] form is much more common in practice than f[g][x], and in 30+ years there hasn’t been much of a call for a notation for left-associative function application in the Wolfram Language. But in celebration of the centenary of combinators, we’ve decided to introduce Application (entered as \[Application]) to represent left-associative function application.

So this means that a[b][c][d][e] can now be written

FunctionToApplication
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; FunctionToApplication[a[b][c][d][e]]

without parentheses. Of course, now a[b[c[d[e]]]] needs parentheses:

FunctionToApplication
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; FunctionToApplication[a[b[c[d[e]]]]]

In this notation the rules for s and k can be written without brackets as:

FunctionToApplication
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; FunctionToApplication[{s[x_][y_][z_] -> x[z][y[z]], 
  k[x_][y_] -> x}]

Our combinator expression above becomes

FunctionToApplication
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; FunctionToApplication[
 s[s[s[s]][k[s[s[s]][s]][s]]][
  k[s[s[s]][s]][s][s[s[s]][k[s[s[s]][s]][s]]]]]

or without the function application character

CombinatorPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorPlot[
 s[s[s[s]][k[s[s[s]][s]][s]]][
  k[s[s[s]][s]][s][
   s[s[s]][k[s[s[s]][s]][s]]]], "CharactersLeftAssociative", 
 "ApplicationGlyph" -> "\[NegativeVeryThinSpace]", 
 "UseCombinatorGlyphs" -> None]

which now involves 13 pairs of parentheses.

Needless to say, if you consider all possible combinator expressions, left and right associativity on average do just the same in terms of parenthesis counts: for size-n combinator expressions both need the same number of pairs on average; the number of cases needing k pairs is

Binomial
&#10005
Binomial[n - 1, k - 1] Binomial[n, k - 1]/k

(the “Catalan triangle”). (Without associativity, we’re dealing with our standard representation of combinator expressions, which always requires n – 1 pairs of brackets.)

By the way, the number of “right-associative” parenthesis pairs is just the number of subparts of the combinator expression that match _[_][_], while for left-associative parenthesis pairs it’s the number that match _[_[_]]. (The number of brackets in the no-associativity case is the number of matches of _[_].)
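Here are those counts for the 24-bracket combinator expression above (the same Count patterns are used in the plot below); they reproduce the 10 “right-associative” pairs, the 13 “left-associative” pairs and the 24 brackets quoted in this section:

expr = s[s[s[s]][k[s[s[s]][s]][s]]][k[s[s[s]][s]][s][s[s[s]][k[s[s[s]][s]][s]]]];
{Count[expr, _[_][_], {0, Infinity}, Heads -> True],
 Count[expr, _[_[_]], {0, Infinity}, Heads -> True],
 Count[expr, _[_], {0, Infinity}, Heads -> True]}
(* {10, 13, 24} *)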

If we look at the parenthesis/bracket count in the evolution of the smallest nonterminating combinator expression from above s[s][s][s[s]][s][s] (otherwise known as SSS(SS)SS) we find:

ListStepPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/\
Programs.wl"];
ListStepPlot[
 Callout[#[[1]] /@ CombinatorEvolveList[s[s][s][s[s]][s][s], 50], #[[
     2]]] & /@ {{LeafCount[#] &, 
    "none"}, {Count[#, _[_][_], {0, Infinity}, Heads -> True] &, 
    "left"}, {Count[#, _[_[_]], {0, Infinity}, Heads -> True] &, 
    "right"}}, Center, Frame -> True, Joined -> True]

Or in other words, in this case, left associativity leads on average to about 62% of the number of parentheses of right associativity. We’ll look at this in more detail later, but for growing combinator expressions, it’ll almost always turn out to be the case that left associativity is the “parenthesis-avoidance winner”.

But even with our “best parenthesis avoidance” it’s still very hard to see what’s going on from the textual form:

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Magnify[
 Grid[{{Column[{CombinatorEvolutionPlot[
       CombinatorPlot[#, "CharactersLeftAssociative", 
          "ApplicationGlyph" -> "\[NegativeVeryThinSpace]", 
          "UseCombinatorGlyphs" -> None] & /@ 
        CombinatorEvolveList[s[s][s][s[s]][s][s], 7], 
       "StatesDisplay"], Text[Style["left", Italic, 12]]}, 
     Dividers -> Center, FrameStyle -> LightGray, 
     Alignment -> {-1 -> Center}], 
    Column[{CombinatorEvolutionPlot[
       CombinatorPlot[#, "CharactersRightAssociative", 
          "ApplicationGlyph" -> "\[NegativeVeryThinSpace]", 
          "UseCombinatorGlyphs" -> None] & /@ 
        CombinatorEvolveList[s[s][s][s[s]][s][s], 7], 
       "StatesDisplay"], Text[Style["right", Italic, 12]]}, 
     Dividers -> Center, FrameStyle -> LightGray, 
     Alignment -> {-1 -> Center}], 
    Column[{CombinatorEvolutionPlot[
       CombinatorEvolveList[s[s][s][s[s]][s][s], 7], "StatesDisplay"],
       Text[Style["none", Italic, 12]]}, Dividers -> Center, 
     FrameStyle -> LightGray, Alignment -> {-1 -> Center}]}}, 
  Dividers -> {{{Directive[Thick, Gray]}}, {False}}, 
  Spacings -> 2], .89]

So what about getting rid of parentheses altogether? Well, we can always use so-called Polish (or Łukasiewicz) “prefix” notation—in which we write f[x] as fx and f[g[x]] as fgx. And in this case our combinator expression from above becomes:

CombinatorPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorPlot[
 s[s[s[s]][k[s[s[s]][s]][s]]][
  k[s[s[s]][s]][s][
   s[s[s]][k[s[s[s]][s]][s]]]], "CharactersPolishNotation", 
 "UseCombinatorGlyphs" -> None]

Alternatively—like a traditional HP calculator—we can use reverse Polish “postfix” notation, in which f[x] is written as xf and f[g[x]] as xgf:

CombinatorPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorPlot[
 s[s[s[s]][k[s[s[s]][s]][s]]][
  k[s[s[s]][s]][s][
   s[s[s]][k[s[s[s]][s]][s]]]], "CharactersReversePolishNotation", 
 "UseCombinatorGlyphs" -> None]

The total number of application symbols (•) is always equal to the number of pairs of brackets in our standard “non-associative” functional form:

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Magnify[
 Grid[{{Column[{CombinatorEvolutionPlot[
       Row[Flatten[# //. x_[y_] -> {\[Bullet], x, y}]] & /@ 
        CombinatorEvolveList[s[s][s][s[s]][s][s], 7], 
       "StatesDisplay"], Text[Style["Polish", Italic, 12]]}, 
     Dividers -> Center, FrameStyle -> LightGray, 
     Alignment -> {-1 -> Center}], 
    Column[{CombinatorEvolutionPlot[
       Row[Flatten[# //. x_[y_] -> {x, y, \[Bullet]}]] & /@ 
        CombinatorEvolveList[s[s][s][s[s]][s][s], 7], 
       "StatesDisplay"], Text[Style["reverse Polish", Italic, 12]]}, 
     Dividers -> Center, FrameStyle -> LightGray, 
     Alignment -> {-1 -> Center}]}}, 
  Dividers -> {{{Directive[Thick, Gray]}}, {False}}, Spacings -> 2],
  1]

What if we look at this on a larger scale, “cellular automaton style”, with s shown as one color of cell and the application symbol • as another? Here’s the not-very-enlightening result:

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; GraphicsRow[{Labeled[
   CombinatorEvolutionPlot[
    CombinatorEvolveList[s[s][s][s[s]][s][s], 7], 
    "ArrayPlotPolishNotation",  Mesh -> True, ImageSize -> 300, 
    MeshStyle -> GrayLevel[0, .18]], 
   Text[Style["Polish", Italic, 12]]], 
  Labeled[CombinatorEvolutionPlot[
    CombinatorEvolveList[s[s][s][s[s]][s][s], 7], 
    "ArrayPlotReversePolishNotation", Mesh -> True, ImageSize -> 300, 
    MeshStyle -> GrayLevel[0, .18]], 
   Text[Style["reverse Polish", Italic, 12]]]}, ImageSize -> 640]

Running for 50 steps, and fixing the aspect ratio, we get (for the Polish case):

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorEvolutionPlot[
 CombinatorEvolveList[s[s][s][s[s]][s][s], 
  50], "ArrayPlotPolishNotation", ImageSize -> 630, 
 AspectRatio -> 1/2]

We can make the same kinds of pictures from our bracket representation too. We just take a string like s[s][s][s[s]][s][s] and render each successive character as a cell of some color. (It’s particularly easy if we’ve only got one basic combinator—say s—because then we only need colors for the opening and closing brackets.) We can also make “cellular automaton–style” pictures from parenthesis representations like SSS(SS)SS. Again, all we do is render each successive character as a cell of some color.
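Here’s a minimal sketch of that kind of rendering (it isn’t the code used for the pictures in this piece, just the basic idea): take the bracket string for each step, map s and the two kinds of brackets to three colors, and stack the rows:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"];
rows = Characters[ToString[#, InputForm]] & /@ 
   CombinatorEvolveList[s[s][s][s[s]][s][s], 20];
ArrayPlot[PadRight[rows /. {"s" -> 1, "[" -> 2, "]" -> 3}], 
 ColorRules -> {0 -> White, 1 -> Red, 2 -> LightGray, 3 -> GrayLevel[0.4]}]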

The results essentially always tend to look much like the reverse Polish case above. Occasionally, though, they reveal at least something about the “innards of the computation”. Like here’s the terminating combinator expression s[s][s][s[s[s]]][k][s] from above, rendered in right-associative form:

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorEvolutionPlot[
 CombinatorFixedPointList[
  s[s][s][s[s[s]]][k][s]], "ArrayPlotRightAssociative", 
 AspectRatio -> .6, "IncludeUpdateHighlighting" -> False, 
 ImageSize -> 530]

Pictures like this in a sense convert all combinator expressions to sequences. But combinator expressions are in fact hierarchical structures, formed by nested invocations of symbolic “functions”. One way to represent the hierarchical structure of

s
&#10005
s[s][s][s[s]][s][
  s] /. {s -> Style[s, Black, FontWeight -> "SemiBold"]}

is through a hierarchy of nested boxes:

MapAll
&#10005
MapAll[# /. {a_[b_] -> Framed[Row[{a, " ", b}]], 
    a_Symbol -> Framed[a]} &, s[s][s][s[s]][s][s], Heads -> True]

We can color each box by its depth in the expression:
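Here is one minimal way to do that (a sketch, not the code used for the original picture): recursively frame the head and the argument of each application, tinting each frame according to its depth:

depthColor[d_] := ColorData["Rainbow"][Min[d, 8]/8];
boxify[e_, d_ : 0] := 
 If[AtomQ[e], Framed[e, Background -> depthColor[d]], 
  Framed[Row[{boxify[Head[e], d + 1], " ", boxify[First[e], d + 1]}], 
   Background -> depthColor[d]]]
boxify[s[s][s][s[s]][s][s]]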

But now to represent the expression all we really need to do is show each basic combinator in a color representing its depth. And doing this, we can visualize the terminating combinator evolution above as:

FramedCombinatorGraphic
&#10005

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Legended[
 CombinatorEvolutionPlot[
  CombinatorFixedPointList[s[s][s][s[s[s]]][k][s]], 
  "ExpressionDepthPlot", FrameTicks -> {True, True, False, False}], 
 BarLegend[{"Rainbow", {0, 18}}, LegendMarkerSize -> 125]]

We can also render this in 3D (with the height being the “depth” in the expression):

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorEvolutionPlot[
 CombinatorFixedPointList[
  s[s][s][s[s[s]]][k][s]], "ExpressionDepthPlot", 
 "IncludeDepthAxis" -> True, BoxRatios -> {1, 1, .1}, 
 PlotRange -> {Automatic, Automatic, {.001, 18}}, 
 ClippingStyle -> Directive[GrayLevel[.5], Opacity[.4]], 
 BoundaryStyle -> None, Boxed -> False, Axes -> False]

To test out visualizations like these, let’s look (as above) at all the size-8 combinator expressions with distinct evolutions that don’t terminate within 50 steps. Here’s the “depth map” for each case:

inonterms
&#10005

In these pictures we’re drawing a cell for each element in the “stringified version” of the combinator expression at each step, then coloring it by depth. But given a particular combinator expression, one can consider other ways to indicate the depth of each element. Here are a few possibilities, shown for step 8 in the evolution of s[s][s][s[s]][s][s] (SSS(SS)SS) (note that the first of these is essentially the “indentation level” that might be used if each s, k were “pretty printed” on a separate line):

SKStep
&#10005

MatchedBracketsPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
MatchedBracketsPlot[CombinatorEvolve[s[s][s][s[s]][s][s], 8], 
   Appearance -> #, AspectRatio -> 1/3, 
   "BracketCharacters" -> {"[", "]"}, 
   ColorRules -> {s -> RGBColor[
      0.8823529411764706, 0.29411764705882354`, 0.2980392156862745]}, 
   ImageSize -> 300, PlotStyle -> AbsolutePointSize[4], 
   "IncludeTextForm" -> False] & /@ {"Mountain", "Vee", "Bush", 
  "Tree"}

And this is what one gets on a series of steps:

MatchedBracketsPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Table[
 Labeled[GraphicsRow[
   MatchedBracketsPlot[CombinatorEvolve[s[s][s][s[s]][s][s], t], 
      Appearance -> #, "IncludeTextForm" -> False, AspectRatio -> 1/3,
       PlotStyle -> AbsolutePointSize[0]] & /@ {"Mountain", "Vee", 
     "Bush", "Tree"}, ImageSize -> 600], Text[Style[t, Gray]], 
  Left], {t, 10, 25, 5}]

But in a sense a more direct visualization of combinator expressions is as trees, as for example in:

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Grid[{CombinatorEvolutionPlot[{#}, "StatesDisplay"] , 
    CombinatorExpressionGraph[#, "MatchHighlighting" -> False, 
     VertexSize -> {"Scaled", 0.08}, ImageSize -> {Automatic, 100}, 
     AspectRatio -> 1/2]} & /@ {s[s[s]], s[s][s], 
   s[s][s][s[s]][s][s]}, Frame -> All, FrameStyle -> LightGray]
CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Grid[{{CombinatorEvolutionPlot[{s[
       s[s[s]][k[s[s[s]][s]][s]]][
      k[s[s[s]][s]][s][s[s[s]][k[s[s[s]][s]][s]]]]}, 
    "StatesDisplay"]}, {CombinatorExpressionGraph[
    s[s[s[s]][k[s[s[s]][s]][s]]][
     k[s[s[s]][s]][s][s[s[s]][k[s[s[s]][s]][s]]]], 
    AspectRatio -> 1/2, "MatchHighlighting" -> False, 
    ImageSize -> 450]}}, Frame -> All, FrameStyle -> LightGray, 
 Alignment -> Center]

Note that these trees can be somewhat simplified by treating them as left or right “associative”, and essentially pulling left or right leaves into the “branch nodes”.

But using the original trees, we can ask for example what the trees for the expressions produced by the evolution of s[s][s][s[s]][s][s] (SSS(SS)SS) are. Here are the results for the first 15 steps:

CombinatorExpressionGraph
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
CombinatorExpressionGraph[#, "UpdateHighlighting" -> {}, 
   "MatchHighlighting" -> False, 
   "EvaluationScheme" -> {"Leftmost", "Outermost", 1}, 
   "ShowVertexLabels" -> False, ImageSize -> {Automatic, 80}] & /@ 
 CombinatorEvolveList[s[s][s][s[s]][s][s], 15]

In a different rendering, these become:

TreePlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
TreePlot[CombinatorExpressionGraph[#, "UpdateHighlighting" -> {}, 
    "EvaluationScheme" -> {"Leftmost", "Outermost", 1}, 
    "ShowVertexLabels" -> False, ImageSize -> {Automatic, 60}], 
   Center] & /@ CombinatorEvolveList[s[s][s][s[s]][s][s], 15]

OK, so these are representations of the combinator expressions on successive steps. But where are the rules being applied at each step? As we’ll discuss in much more detail in the next section, in the way we’ve done things so far we’re always doing just one update at each step. Here’s an example of where the updates are happening in a particular case:

CloudGet
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorEvolutionPlot[
 CombinatorPlot[#, "FramedMatches"] & /@ 
  NestList[CombinatorStep, s[s][s][s[s[s]]][k][s], 
   7], "StatesDisplay"]

Continuing longer we get (note that some lines have wrapped in this display):

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Magnify[
 CombinatorEvolutionPlot[
  CombinatorPlot[#, "FramedMatches"] & /@ 
   NestList[CombinatorStep, s[s][s][s[s[s]]][k][s], 20], 
  "StatesDisplay"], .4]

A feature of the way we’re writing out combinator expressions is that the “input” to any combinator rule always corresponds to a contiguous span within the expression as we display it. So when we show the total size of combinator expressions on each step in an evolution, we can display which part is getting rewritten:

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorEvolutionPlot[
 CombinatorFixedPointList[s[s][s][s[s[s]]][k][s]], "SizeAndMatches", 
 ImageSize -> 500]

Notice that, as expected, application of the S rule tends to increase size, while the K rule decreases it.
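That’s easy to see from the rules themselves: one S step duplicates its last argument z, so the size changes by LeafCount[z] - 1 (never a decrease), while one K step discards y, so the size changes by -(LeafCount[y] + 1). For instance, with single-symbol arguments:

(* S duplicates z: no change when z is a single combinator, growth otherwise *)
LeafCount[x[z][y[z]]] - LeafCount[s[x][y][z]]
(* 0 *)

(* K throws y away, so the size always drops *)
LeafCount[x] - LeafCount[k[x][y]]
(* -2 *)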

Here is the distribution of rule applications for all the examples we showed above:

inonterms
&#10005

We can combine multiple forms of visualization by including depths:

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorEvolutionPlot[
 CombinatorFixedPointList[s[s][s][s[s[s]]][k][s], 
  "MaxSize" -> 100], "DepthAndUpdatePlot"]
CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorEvolutionPlot[
 CombinatorFixedPointList[
  s[s][s][s[s[s]]][k][s]], "DepthAndUpdatePlot", 
 "SpanThickness" -> .6]

We can also do the same in 3D:

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorEvolutionPlot[
 CombinatorFixedPointList[s[s][s][s[s[s]]][k][s]], "DepthCuboidPlot", 
 Axes -> True]

So what about the underlying trees? Here are the S, K combinator rules in terms of trees:

CombinatorExpressionGraph
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Column[
 Map[Row[{#[[1]], Spacer[15], 
     Style["\[LongRightArrow]", FontSize -> 18], 
     Spacer[3], #[[2]]} ] &, 
  Map[CombinatorExpressionGraph[#, VertexSize -> .3, 
     "UpdateHighlighting" -> {}, "MatchHighlighting" -> True, 
     ImageSize -> Small] &, {{s[x][y][z], x[z][y[z]]}, {k[x][y], 
     x}}, {2}]], Spacings -> 2]

And here are the updates for the first few steps of the evolution of s[s][s][s[s[s]]][k][s] (SSS(S(SS))KS):

CombinatorExpressionGraph
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
CombinatorExpressionGraph[#, "UpdateHighlighting" -> {"Subtrees"}, 
   "EvaluationScheme" -> {"Leftmost", "Outermost", 1}, 
   "MatchHighlighting" -> False, "ShowVertexLabels" -> False, 
   ImageSize -> {Automatic, 100}] & /@ 
 CombinatorEvolveList[s[s][s][s[s[s]]][k][s], 12]

In these pictures we are effectively at each step highlighting the “first” subtree matching s[_][_][_] or k[_][_]. To get a sense of the whole evolution, we can also simply count the number of subtrees with a given general structure (like _[_][_] or _[_[_]]) that occur at a given step (see also below):

CombinatorFixedPointList
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; ListStepPlot[
 Function[p, 
   Callout[Count[#, p, {0, Infinity}, Heads -> True] & /@ 
     CombinatorFixedPointList[s[s][s][s[s[s]]][k][s]], 
    Quiet[CombinatorExpressionGraph[p, ImageSize -> 30]], 
    Frame -> 
     True]] /@ {_, _[_], _[_][_], _[_[_]], _[_[_[_]]], _[_][_][_]}, \
Center, Joined -> True, Frame -> True, ImageSize -> 530]

One more indication of the behavior of combinators comes from looking at tree depths. In addition to the total depth (i.e. Wolfram Language Depth) of the combinator tree, one can also look at the depth at which update events happen (here with the total size shown underneath):

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
Function[{list}, 
  GraphicsColumn[{CombinatorEvolutionPlot[list, "UpdateDepthPlot"], 
    ListStepPlot[LeafCount /@ list, Center, AspectRatio -> 1/4, 
     Frame -> True, 
     PlotStyle -> $PlotStyles["ListPlot", "PlotStyle"]]}]][
 CombinatorFixedPointList[s[s][s][s[s[s]]][k][s]]]

Here are the depth profiles for the rules shown above:

inonterms
&#10005

Not surprisingly, total depth tends to increase when growth continues. But it is notable that—except when termination is near at hand—it seems like (at least with our current updating scheme) updates tend to be made to “high-level” (i.e. low-depth) parts of the expression tree.

When we write out a combinator expression like the size-33

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorEvolutionPlot[{CombinatorEvolve[
   s[s][s][s[s]][s][s], 9]}, "StatesDisplay"]

or show it as a tree

CombinatorExpressionGraph
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorExpressionGraph[
 CombinatorEvolve[s[s][s][s[s]][s][s], 9], "UpdateHighlighting" -> {},
  "ShowVertexLabels" -> False]

we’re in a sense being very wasteful, because we’re repeating the same subexpressions many times. In fact, in this particular expression, there are 65 subexpressions altogether—but only 11 distinct ones.
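Those counts are easy to reproduce (using the CombinatorEvolve helper loaded above):

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"];
With[{expr = CombinatorEvolve[s[s][s][s[s]][s][s], 9]},
 {Length[Level[expr, {0, Infinity}, Heads -> True]], 
  Length[Union[Level[expr, {0, Infinity}, Heads -> True]]]}]
(* {65, 11} *)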

So how can we represent a combinator expression making as much use as possible of the commonality of these subexpressions? Well, instead of using a tree for the combinator expression, we can use a directed acyclic graph (DAG) in which we start from a node representing the whole expression, and then show how it breaks down into shared subexpressions, with each shared subexpression represented by a node.

To see how this works, let’s consider first the trivial case of f[x]. We can represent this as a tree—in which the root represents the whole expression f[x], and has one connection to the head f, and another to the argument x:

ToDAG
&#10005

The expression f[g[x]] is still a tree:

ToDAG
&#10005

But in f[f[x]] there is a “shared subexpression” (which in this case is just f), and the graph is no longer a tree:

ToDAG
&#10005

For f[x][f[x][f[x]]], f[x] is a shared subexpression:

ToDAG
&#10005

For s[s][s][s[s]][s][s] things get a bit more complicated:

ToDAG
&#10005

For the size-33 expression above, the DAG representation is

ToDAG
&#10005

where the nodes correspond to the 11 distinct subexpressions of the whole expression that appears at the root.
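A graph like this can be built directly from the distinct subexpressions. Here’s a minimal sketch of the idea (it isn’t the CombinatorToDAG function used for these pictures, and the layout will differ): take each distinct subexpression as a node, and connect every compound node to its head and to its argument:

toDAG[expr_] := Graph[
  DeleteDuplicates@Flatten[
    If[AtomQ[#], {}, {# -> Head[#], # -> First[#]}] & /@ 
     Union[Level[expr, {0, Infinity}, Heads -> True]]], 
  GraphLayout -> "LayeredDigraphEmbedding", VertexLabels -> Automatic]

Clear[f, x];  (* keep f and x inert for this small example *)
toDAG[f[x][f[x][f[x]]]]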

So what does combinator evolution look like in terms of DAGs? Here are the first 15 steps in the evolution of s[s][s][s[s]][s][s]:

ToDAG
&#10005

And here are some later steps:

ParallelTable
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/\
Programs.wl"];
ParallelTable[
 Labeled[Graph[
   CombinatorToDAG[CombinatorEvolve[s[s][s][s[s]][s][s], t]], 
   GraphLayout -> "LayeredDigraphEmbedding", AspectRatio -> 1/2], 
  Text[Style[t, Gray, 12]]], {t, 50, 150, 50}]

Sharing all common subexpressions is in a sense a maximally reduced way to specify a combinator expression. And even when the total size of the expressions is growing roughly exponentially, the number of distinct subexpressions may grow only linearly—here roughly like 1.24 t:

CombinatorEvolveList
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; ListStepPlot[
 Length[Union[Level[#, {0, Infinity}, Heads -> True]]] & /@ 
  CombinatorEvolveList[s[s][s][s[s]][s][s], 100], Frame -> True, 
 Filling -> Axis, 
 FillingStyle -> $PlotStyles["ListPlot", "FillingStyleLight"], 
 PlotStyle -> $PlotStyles["ListPlot", "PlotStyle"], 
 AspectRatio -> 1/4, ImageSize -> 600]

Looking at successive differences suggests a fairly simple pattern:

ListStepPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; ListStepPlot[
 Differences[
  Length[Union[Level[#, {0, Infinity}, Heads -> True]]] & /@ 
   CombinatorEvolveList[s[s][s][s[s]][s][s], 300]], Frame -> True, 
 Filling -> Axis, 
 FillingStyle -> $PlotStyles["ListPlot", "FillingStyleLight"], 
 PlotStyle -> $PlotStyles["ListPlot", "PlotStyle"], 
 AspectRatio -> 1/6, ImageSize -> 600]

Here are the DAG representations of the result of 50 steps in the evolution of the 46 “growing size-7” combinator expressions above:

inonterms
&#10005

It’s notable that some of these show considerable complexity, while others have a rather simple structure.

Updating Schemes and Multiway Systems

The world of combinators as we’ve discussed it so far may seem complicated. But we’ve actually so far been consistently making a big simplification. And it has to do with how the combinator rules are applied.

Consider the combinator expression:

s
&#10005
s[s[s][s][s[s][k[k][s]][s]]][s][s][s[k[s][k]][k][s]]

There are 6 places (some overlapping) at which s[_][_][_] or k[_][_] matches some subpart of this expression:

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorEvolutionPlot[{CombinatorPlot[
   s[s[s][s][s[s][k[k][s]][s]]][s][s][s[k[s][k]][k][s]], 
   "FramedMatches", 
   "EvaluationScheme" -> {"Leftmost", "Outermost"}]}, "StatesDisplay"]

One can see the same thing in the tree form of the expression (the matches are indicated at the roots of their subtrees):

CombinatorExpressionGraph
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorExpressionGraph[
 s[s[s][s][s[s][k[k][s]][s]]][s][s][s[k[s][k]][k][s]], 
 "UpdateHighlighting" -> {}, "MatchHighlighting" -> True, 
 AspectRatio -> .8, ImageSize -> 410]

But now the question is: if one’s applying combinator rules, which of these matches should one use?

What we’ve done so far is to follow a particular strategy—usually called leftmost outermost—which can be thought of as looking at the combinator expression as we normally write it out with brackets etc. and applying the first match we encounter in a left-to-right scan, or in this case:

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorEvolutionPlot[{CombinatorPlot[
   s[s[s][s][s[s][k[k][s]][s]]][s][s][s[k[s][k]][k][s]], 
   "FramedMatches", 
   "IncludeBackgroundFraming" -> True]}, "StatesDisplay"]

In the Wolfram Language we can find the positions of the matches just using:

expr = s
&#10005
expr = s[s[s][s][s[s][k[k][s]][s]]][s][s][s[k[s][k]][k][s]]
pos = Position
&#10005
pos = Position[expr, s[_][_][_] | k[_][_]]

This shows—as above—where these matches are in the expression:

expr = s
&#10005
expr = s[s[s][s][s[s][k[k][s]][s]]][s][s][s[k[s][k]][k][s]];
pos = Position
&#10005
pos = Position[expr, s[_][_][_] | k[_][_]];
MapAt
MapAt
&#10005
MapAt[Framed, expr, pos]

Here are the matches, in the order provided by Position:

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; With[{expr = 
   s[s[s][s][s[s][k[k][s]][s]]][s][s][s[k[s][k]][k][s]]}, 
 Grid[{Text[#], 
     CombinatorEvolutionPlot[{CombinatorPlot[
        expr, {"FramedPositions", {#}}, 
        "IncludeArgumentFraming" -> False]}, "StatesDisplay"]} & /@ 
   Position[expr, s[_][_][_] | k[_][_]], Frame -> All]]

The leftmost-outermost match here is the one with position {0}.

In general the series of indices that specifies the position of a subexpression says whether, to reach that subexpression, one should go left or right at each level as one descends the expression tree. An index 0 says to go to the “head”, i.e. the f in f[x], or the f[a][b] in f[a][b][c]; an index 1 says to go to the “first argument”, i.e. the x in f[x], or the c in f[a][b][c]. The length of the list of indices gives the depth of the corresponding subexpression.
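For example, with a generic head f (cleared so it stays inert):

Clear[f];
Extract[f[a][b][c], {0}]        (* the head f[a][b] *)
Extract[f[a][b][c], {1}]        (* the argument c *)
Extract[f[a][b][c], {0, 0, 1}]  (* descend head, head, then argument: a *)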

We’ll talk in the next section about how leftmost outermost—and other schemes—are defined in terms of indices. But here the thing to notice is that in our example here Position doesn’t give us part {0} first; instead it gives us {0,0,0,1,1,0,1}:

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorEvolutionPlot[{CombinatorPlot[
   s[s[s][s][s[s][k[k][s]][s]]][s][s][
    s[k[s][k]][k][s]], {"FramedPositions", {{0, 0, 0, 1, 1, 0, 1}}}, 
   "IncludeArgumentFraming" -> True, 
   "IncludeBackgroundFraming" -> True]}, "StatesDisplay"]

And what’s happening is that Position is doing a depth-first traversal of the expression tree to look for matches, so it first descends all the way down the left-hand tree branches—and since it finds a match there, that’s what it returns. In the taxonomy we’ll discuss in the next section, this corresponds to a leftmost-innermost scheme, though here we’ll refer to it as “depth first”.

Now consider the example of s[s][s][k[s][s]]. Here is what it does first with the leftmost-outermost strategy we’ve been using so far, and second with the new strategy:

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Grid[{{Column[{CombinatorEvolutionPlot[
      CombinatorPlot[#, "FramedMatches"] & /@ 
       CombinatorFixedPointList[s[s][s][k[s][s]]], "StatesDisplay"], 
     Text[Style["standard (leftmost outermost)", Italic, 12]]}, 
    Dividers -> Center, FrameStyle -> Gray, 
    Alignment -> {-1 -> Center}], 
   Column[{CombinatorEvolutionPlot[
      CombinatorPlot[#, "FramedMatches", 
         "EvaluationScheme" -> {"Innermost", "Leftmost", 1}] & /@ 
       CombinatorFixedPointList[
        s[s][s][k[s][s]], {"Innermost", "Leftmost", 1}], 
      "StatesDisplay"], 
     Text[Style["depth-first (leftmost innermost)", Italic, 12]]}, 
    Dividers -> Center, FrameStyle -> Gray, 
    Alignment -> {-1 -> Center}]}}, 
 Dividers -> {{{Directive[Thick, LightGray]}}, {False}}, 
 Alignment -> Top]

There are two important things to notice. First, that in both cases the final result is the same. And second, that the steps taken—and the total number required to get to the final result—are different in the two cases.

Let’s consider a larger example: s[s][s][s[s[s]]][k][s] (SSS(S(SS))KS). With our standard strategy we saw above that the evolution of this expression terminates after 89 steps, giving an expression of size 65. With the depth-first strategy the evolution still terminates with the same expression of size 65, but now it takes only 29 steps:

CombinatorFixedPointList
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; (Labeled[
    ListStepPlot[
     LeafCount /@ 
      CombinatorFixedPointList[s[s][s][s[s[s]]][k][s], First[#]], 
     PlotRange -> All, Frame -> True, Filling -> Axis, 
     FillingStyle -> $PlotStyles["ListPlot", "FillingStyleDark"], 
     PlotStyle -> $PlotStyles["ListPlot", "PlotStyle"], 
     ImageSize -> 300], 
    Text[Style[ToLowerCase[ #[[2]]], Italic, 
      12]]]) & /@ {{{"Leftmost", "Outermost"}, 
   "standard (leftmost outermost)"}, {{"Leftmost", "Innermost"}, 
   "depth first (leftmost innermost)"}}

It’s an important feature of combinator expression evolution that when it terminates—whatever strategy one’s used—the result must always be the same. (This “confluence” property—that we’ll discuss more later—is closely related to the concept of causal invariance in our models of physics.)

What happens when the evolution doesn’t terminate? Let’s consider the simplest non-terminating case we found above: s[s][s][s[s]][s][s] (SSS(SS)SS). Here’s how the sizes increase with the two strategies we’ve discussed:

CombinatorEvolveList
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; (Labeled[
    ListStepPlot[
     LeafCount /@ 
      CombinatorEvolveList[s[s][s][s[s]][s][s], 60, First@#], 
     PlotRange -> All, Frame -> True, Filling -> Axis, 
     FillingStyle -> $PlotStyles["ListPlot", "FillingStyleDark"], 
     PlotStyle -> $PlotStyles["ListPlot", "PlotStyle"], 
     ImageSize -> 300, ScalingFunctions -> "Log"], 
    Text[Style[ToLowerCase[ #[[2]]], Italic, 
      12]]]) & /@ {{{"Leftmost", "Outermost"}, 
   "standard (leftmost outermost)"}, {{"Leftmost", "Innermost"}, 
   "depth first (leftmost innermost)"}}

The difference is more obvious if we plot the ratios of sizes on successive steps:

CombinatorEvolveList
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; (Labeled[
    ListStepPlot[
     Ratios[LeafCount /@ 
       CombinatorEvolveList[s[s][s][s[s]][s][s], 150, First@#]], 
     PlotRange -> All, Frame -> True, Filling -> Axis, 
     FillingStyle -> $PlotStyles["ListPlot", "FillingStyleLight"], 
     PlotStyle -> $PlotStyles["ListPlot", "PlotStyle"], 
     ImageSize -> 300, ScalingFunctions -> "Log"], 
    Text[Style[ToLowerCase[ #[[2]]], Italic, 
      12]]]) & /@ {{{"Leftmost", "Outermost"}, 
   "standard (leftmost outermost)"}, {{"Leftmost", "Innermost"}, 
   "depth first (leftmost innermost)"}}

In both these pairs of pictures, we can see that the two strategies start off producing the same results, but soon diverge.

OK, so we’ve looked at two particular strategies for picking which updates to do. But is there a general way to explore all possibilities? It turns out that there is—and it’s to use multiway systems, of exactly the kind that are also important in our Physics Project.

The idea is to make a multiway graph in which there’s an edge to represent each possible update that can be performed from each possible “state” (i.e. combinator expression). Here’s what this looks like for the example of s[s][s][k[s][s]] (SSS(KSS)) above:

skrules = {s
&#10005

Here’s what we get if we include all the “updating events”:

skrules
&#10005

Now each possible sequence of updating events corresponds to a path in the multiway graph. The two particular strategies we used above correspond to these paths:

skrules
&#10005

We see that even at the first step here, there are two possible ways to go. But in addition to branching, there is also merging, and indeed whichever branch one takes, it’s inevitable that one will end up at the same final state—in effect the unique “result” of applying the combinator rules.

Here’s a slightly more complicated case, where there starts out being a unique path, but then after 4 steps, there’s a branch, but after a few more steps, everything converges again to a unique final result:

skrules
&#10005

For combinator expressions of size 4, there’s never any branching in the multiway graph. At size 5 the multiway graphs that occur are:

skrules
&#10005

At size 6 the 2688 possible combinator expressions yield the following multiway graphs, with the one shown above being basically as complicated as it gets:

skrules
&#10005

At size 7, much more starts being able to happen. There are rather regular structures like:

skrules
&#10005

As well as cases like:

skrules
&#10005

This can be summarized by giving just the size of each intermediate expression, here showing the path defined by our standard leftmost-outermost updating strategy:

MWCombinatorGraph
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Graph[
 MWCombinatorGraph[s[s][k][s[s]][k][k], 15, CombinatorEvolveList], 
 AspectRatio -> 1]

By comparison, here is the path defined by the depth-first strategy above:

MWCombinatorGraph
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Graph[
 MWCombinatorGraph[s[s][k][s[s]][k][k], 15, 
  CombinatorEvolveList[#1, #2, {"Leftmost", "Innermost"}] &], 
 AspectRatio -> 1]

s[s][s][s[s[k]]][k] (SSS(S(SK))K) is a case where leftmost-outermost evaluation avoids longer paths and larger intermediate expressions

MWCombinatorGraph
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Graph[
 MWCombinatorGraph[s[s][s][s[s[k]]][k], 15, "LeftmostOutermost"], 
 AspectRatio -> 1.2]

while depth-first evaluation takes more steps:

MWCombinatorGraph
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Graph[
 MWCombinatorGraph[s[s][s][s[s[k]]][k], 15, 
  CombinatorEvolveList[#1, #2, {"Leftmost", "Innermost"}] &], 
 AspectRatio -> 1]

s[s[s]][s][s[s]][s] (S(SS)S(SS)S) gives a larger but more uniform multiway graph (s[s[s[s]]][s][s][s] evolves directly to s[s[s]][s][s[s]][s]):

MWCombinatorGraph
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Graph[
 MWCombinatorGraph[s[s[s]][s][s[s]][s], 15, "LeftmostOutermost"], 
 AspectRatio -> 1.2, ImageSize -> 480]

Depth-first evaluation gives a slightly shorter path:

MWCombinatorGraph
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Graph[
 MWCombinatorGraph[s[s[s]][s][s[s]][s], 15, 
  CombinatorEvolveList[#1, #2, {"Leftmost", "Innermost"}] &], 
 AspectRatio -> 1.2, ImageSize -> 450]

Among size-7 expressions, the largest finite multiway graph (with 94 nodes) is for s[s[s[s]]][s][s][k] (S(S(SS))SSK):

MWCombinatorGraphMinimal
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Graph[
 MWCombinatorGraphMinimal[s[s[s[s]]][s][s][k], 18], 
 AspectRatio -> 1.2]

Depending on the path, this can take between 10 and 18 steps to reach its final state:

MWCombinatorGraphMinimal
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Histogram[
 Length /@ FindPath[
   MWCombinatorGraphMinimal[s[s[s[s]]][s][s][k], 18], 
   "s[s[s[s]]][s][s][k]", 
   "s[k[s[k][s[s[s]][k]]]][k[s[k][s[s[s]][k]]]]", Infinity, All], 
 ChartStyle -> $PlotStyles["Histogram", "ChartStyle"], Frame -> True, 
 FrameTicks -> {True, False}]

Our standard leftmost-outermost strategy takes 12 steps; the depth first takes 13 steps:

MWCombinatorGraphMinimal
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
Labeled[Graph[
    MWCombinatorGraphMinimal[s[s[s[s]]][s][s][k], 15, First[#]], 
    AspectRatio -> 1.2, ImageSize -> {Automatic, 300}], 
   Text[Style[#[[2]], Italic, 12]]] & /@ {{CombinatorEvolveList, 
   "standard"}, {CombinatorEvolveList[#1, #2, {"Leftmost", 
      "Innermost"}] &, "depth-first"}}

But among size-7 combinator expressions there are basically two that do not lead to finite multiway systems: s[s[s]][s][s][s][k] (S(SS)SSSK) (which evolves immediately to s[s][s][s[s]][s][k]) and s[s[s]][s][s][s][s] (S(SS)SSSS) (which evolves immediately to s[s][s][s[s]][s][s]).

Let’s consider s[s[s]][s][s][s][k]. For 8 steps there’s a unique path of evolution. But at step 9, the evolution branches

skrules
&#10005

as a result of there being two distinct possible updating events:

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorEvolutionPlot[{CombinatorPlot[
   Last[CombinatorEvolveList[s[s[s]][s][s][s][k], 8]], 
   "FramedMatches", 
   "EvaluationScheme" -> {"Leftmost", "Outermost"}]}, "StatesDisplay"]

Continuing for 14 steps we get a fairly complex multiway system:

MWCombinatorGraphMinimal
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Graph[
 MWCombinatorGraphMinimal[s[s[s]][s][s][s][k], 14], 
 AspectRatio -> 1.2]

But this isn’t “finished”; the nodes circled in red correspond to expressions that are not fixed points, and will evolve further. So what happens with particular evaluation orders?

Here are the results for our two updating schemes:

MWCombinatorGraphMinimal
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
Labeled[Graph[
    MWCombinatorGraphMinimal[s[s[s]][s][s][s][k], 14, First[#]], 
    AspectRatio -> 1.2, ImageSize -> {Automatic, 300}], 
   Text[Style[Last[#], Italic, 12]]] & /@ {{CombinatorEvolveList, 
   "standard"}, {CombinatorEvolveList[#1, #2, {"Leftmost", 
      "Innermost"}] &, "depth-first"}}

Something important is visible here: the leftmost-outermost path leads (in 12 steps) to a fixed-point node, while the depth-first path goes to a node that will evolve further. In other words, at least as far as we can see in this multiway graph, leftmost-outermost evaluation terminates while depth first does not.

There is just a single fixed point visible (s[k]), but there are many “unfinished paths”. What will happen with these? Let’s look at depth-first evaluation. Even though it hasn’t terminated after 14 steps, it does so after 29 steps—yielding the same final result s[k]:

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorEvolutionPlot[
 CombinatorFixedPointList[
  s[s][s][s[s[s]]][k][s], {"Leftmost", "Innermost", 
   1}], "SizeAndMatches", 
 "EvaluationScheme" -> {"Leftmost", "Innermost", 1}, PlotRange -> All]

And indeed it turns out to be a general result (known since the 1940s) that if a combinator evolution path is going to terminate, it must terminate in a unique fixed point, but it’s also possible that the path won’t terminate at all.

Here’s what happens after 17 steps. We see more and more paths leading to the fixed point, but we also see an increasing number of “unfinished paths” being generated:

MWCombinatorGraphMinimal
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Graph[
 MWCombinatorGraphMinimal[s[s[s]][s][s][s][k], 17, 
  CombinatorEvolveList[#1, #2] &, "PathThickness" -> 3.5], 
 AspectRatio -> 1.2]

Let’s now come back to the other case we mentioned above: s[s[s]][s][s][s][s] (S(SS)SSSS). For 12 steps the evolution is unique:

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Magnify[
 CombinatorEvolutionPlot[
  CombinatorEvolveList[s[s[s]][s][s][s][s], 12], "StatesDisplay"], .7]

But at that step there are two possible updating events:

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Magnify[
 CombinatorEvolutionPlot[{CombinatorPlot[
    Last[CombinatorEvolveList[s[s[s]][s][s][s][s], 12]], 
    "FramedMatches", 
    "EvaluationScheme" -> {"Leftmost", "Outermost"}]}, 
  "StatesDisplay"], .7] 

And from there on out, there’s rapid growth in the multiway graph:

MWCombinatorGraphMinimal
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Graph[
 MWCombinatorGraphMinimal[s[s[s]][s][s][s][s], 18], 
 AspectRatio -> 1.2]

And what’s important here is that there are no fixed points: there is no possible evaluation strategy that leads to a fixed point. And what we’re seeing here is an example of a general result: if there is a fixed point in a combinator evolution, then leftmost-outermost evaluation will always find it.

In a sense, leftmost-outermost evaluation is the “most conservative” evaluation strategy, with the least propensity for ending up with “runaway evolution”. Its “conservatism” is on display if one compares growth from it and from depth-first evaluation in this case:

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
Labeled[CombinatorEvolutionPlot[
    CombinatorEvolveList[s[s[s]][s][s][s][s], 80, Append[#, 1]], 
    "SizeAndMatches", "EvaluationScheme" -> Append[#, 1], 
    PlotRange -> All, ImageSize -> 300], 
   Text[Row[Style[#, Italic, 12] & /@ (ToLowerCase /@ #), 
     Spacer[1]]]] & /@ {{"Leftmost", "Outermost"}, {"Leftmost", 
   "Innermost"}}

Looking at the multiway graph—as well as others—a notable feature is the presence of “long necks”: for many steps every evaluation strategy leads to the same sequence of expressions, and there is just one possible match at each step.

But how long can this go on? For size 8 and below it’s always limited (the longest “neck” at size 7 is for s[s[s]][s][s][s][s] and is of length 13; at size 8 the neck gets no longer, with length 13 again reached for s[s[s[s]][s][s][s][s]] and k[s[s[s]][s][s][s][s]]). But at size 9 there are four cases (three distinct) for which growth continues forever, but is always unique:

{s[s[s[s]]][s[s[s][s]]][s], s[s[s[s]]][s[s[s]]][s[s]],
 s[s[s]][s][s[s[s][s]][s]], s[s[s]][k][s[s[s][s]][s]]}
&#10005
{s[s[s[s]]][s[s[s][s]]][s], s[s[s[s]]][s[s[s]]][s[s]], 
 s[s[s]][s][s[s[s][s]][s]], s[s[s]][k][s[s[s][s]][s]]}

And as one might expect, all these show rather regular patterns of growth:

CombinatorEvolutionPlot
&#10005

The second differences are given in the first and third cases by repeats of (for successive n):

Join
&#10005
Join[{0, 0, 1} , Table[0, n], {7, 0, 0, 1, 0, 3 (2^(n + 2) - 3)}]

In the second they are given by repeats of

Join
&#10005
Join[Table[0, n], {2}]

and in the final case by repeats of

Join
&#10005
Join[{0, 1} , 
 Table[0, n], {-3 2^(n + 3) + 18, 3 2^(n + 3) - 11, 0, 1, 
  0, -3 2^(n + 3) + 2, 9 2^(n + 2) - 11}]

The Question of Evaluation Order

As a computational language designer, it’s an issue I’ve been chasing for 40 years: what’s the best way to define the order in which one evaluates (i.e. computes) things? The good news is that in a well-designed language (like the Wolfram Language!) it fundamentally doesn’t matter, at least much of the time. But in thinking about combinators—and the way they evolve—evaluation order suddenly becomes a central issue. And in fact it’s also a central issue in our new model of physics—where it corresponds to the choice of reference frame, for relativity, quantum mechanics and beyond.

Let’s talk first about evaluation order as it shows up in the symbolic structure of the Wolfram Language. Imagine you’re doing this computation:

Length
&#10005
Length[Join[{a, b}, {c, d, e}]]

The result is unsurprising. But what’s actually going on here? Well, first you’re computing Join[...]:

Join
&#10005
Join[{a, b}, {c, d, e}]

Then you’re taking the result, and providing it as argument to Length, which then does its job, and gives the result 5. And in general in the Wolfram Language, if you’re computing f[g[x]] what’ll happen is that x will be evaluated first, followed by g[x], and finally f[g[x]]. (Actually, the head f in f[x] is the very first thing evaluated, and in f[x, y] one evaluates f, then x, then y and then f[x, y].)

And usually this is exactly what one wants, and what people implicitly expect. But there are cases where it isn’t. For example, let’s say you’ve defined x = 1 (i.e. Set[x,1]). Now you want to say x = 2 (Set[x,2]). If the x were evaluated first, you’d get Set[1,2], which doesn’t make any sense. Instead, you want Set to “hold its first argument”, and “consume it” without first evaluating it. And in the Wolfram Language this happens automatically because Set has attribute HoldFirst.
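One can check that directly:

Attributes[Set]
(* {HoldFirst, Protected, SequenceHold} *)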

How is this relevant to combinators? Well, basically, the standard evaluation order used by the Wolfram Language is like the depth-first (leftmost-innermost) scheme we described above, while what happens when functions have Hold attributes is like the leftmost-outermost scheme.

But, OK, so if we have something like f[a[x],y] we usually first evaluate a[x], then use the result to compute f[a[x],y]. And that’s pretty easy to understand if a[x], say, immediately evaluates to something like 4 that doesn’t itself need to be evaluated. But what happens when in f[a[x],y], a[x] evaluates to b[x] which then evaluates to c[x] and so on? Do you do the complete chain of “subevaluations” before you “come back up” to evaluate y, and f[...]?

What’s the analog of this for combinators? Basically it’s whether when you do an update based on a particular match in a combinator expression, you then just keep on “updating the update”, or whether instead you go on and find the next match in the expression before doing anything with the result of the update. The “updating the update” scheme is basically what we’ve called our depth-first scheme, and it’s essentially what the Wolfram Language does in its automatic evaluation process.

Imagine we give the combinator rules as Wolfram Language assignments:

s[x_][y_][z_] := x[z][y[z]]
&#10005
s[x_][y_][z_] := x[z][y[z]]
k[x_][y_] := x
&#10005
k[x_][y_] := x

Then—by virtue of the standard evaluation process in the Wolfram Language—every time we enter a combinator expression these rules will automatically be repeatedly applied, until a fixed point is reached:

s[s][s][s[s[s]]][k][s]
&#10005
s[s][s][s[s[s]]][k][s]

What exactly is happening “inside” here? If we trace it in a simpler case, we can see that there is repeated evaluation, with a depth-first (AKA leftmost-innermost) scheme for deciding what to evaluate:

Dataset
&#10005
Dataset[Trace[s[k[k][k]][s][s]]]

Of course, given the assignment above for s, if one enters a combinator expression—like s[s][s][s[s]][s][s]]—whose evaluation doesn’t terminate, there’ll be trouble, much as if we define x = x + 1 (or x = {x}) and ask for x. Back when I was first doing language design people often told me that issues like this meant that a language that used automatic infinite evaluation “just couldn’t work”. But 40+ years later I think I can say with confidence that “programming with infinite evaluation, assuming fixed points” works just great in practice—and in rare cases where there isn’t going to be a fixed point one has to do something more careful anyway.

In the Wolfram Language, being more careful means explicitly applying rules oneself, rather than just letting the evaluation happen automatically. Let’s say we clear our assignments for s and k:

Clear
&#10005
Clear[s, k]

Now no transformations associated with s and k will automatically be made:

s
&#10005
s[s][s][s[s[s]]][k][s]

But by using /. (ReplaceAll) we can ask that the s, k transformation rules be applied once:

s
&#10005
s[s][s][s[s[s]]][k][s] /. {s[x_][y_][z_] -> x[z][y[z]], 
  k[x_][y_] -> x}

With FixedPointList we can go on applying the rule until we reach a fixed point:

FixedPointList
&#10005
FixedPointList[# /. {s[x_][y_][z_] -> x[z][y[z]], k[x_][y_] -> x} &, 
 s[s][s][s[s[s]]][k][s]]

It takes 26 steps—which is different from the 89 steps for our leftmost-outermost evaluation, or the 29 steps for leftmost-innermost (depth-first) evaluation. And, yes, the difference is the result of /. in effect applying rules on the basis of a different scheme than the ones we’ve considered so far.

But, OK, so how can we parametrize possible schemes? Let’s go back to the combinator expression from the beginning of the previous section:

s
&#10005
Clear[s,k]; 
					s[s[s][s][s[s][k[k][s]][s]]][s][s][s[k[s][k]][k][s]]

Here are the positions of possible matches in this expression:

Position
&#10005
Position[s[s[s][s][s[s][k[k][s]][s]]][s][s][s[k[s][k]][k][s]], 
 s[_][_][_] | k[_][_]]

An evaluation scheme must define a way to say which of these matches to actually do at each step. In general we can apply pretty much any algorithm to determine this. But a convenient approach is to think about sorting the list of positions by particular criteria, and then for example using the first k positions in the result.
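For instance, one very simple scheme along these lines is just to Sort the match positions found above (canonical order puts shorter, i.e. more outer, positions first, breaking ties to the left) and keep the first k of them, say k = 2:

expr = s[s[s][s][s[s][k[k][s]][s]]][s][s][s[k[s][k]][k][s]];
Take[Sort[Position[expr, s[_][_][_] | k[_][_]]], 2]
(* {{0}, {1}} *)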

Given a list of positions, there are two obvious potential types of sorting criteria to use: ones based on the lengths of the position specifications, and ones based on their contents. For example, we might choose (as Sort by default does) to sort shorter position specifications first:

Sort
&#10005
Sort[{{0, 0, 0, 1, 1, 0, 1}, {0, 0, 0, 1, 1}, {0, 0, 0, 1}, {0}, {1, 
   0, 0, 1}, {1}}]

But what do the shorter position specifications correspond to? They’re the more “outer” parts of the combinator expression, higher on the tree. And when we say we’re using an “outermost” evaluation scheme, what we mean is that we’re considering matches higher on the tree first.

Given two position specifications of the same length, we then need a way to compare these. An obvious one is lexicographic—with 0 sorted before 1. And this corresponds to taking f before x in f[x], or taking the leftmost object first.

We have to decide whether to sort first by length and then by content, or the other way around. But if we enumerate all choices, here’s what we get:

allschemes
&#10005

And here’s where the first match with each scheme occurs in the expression tree:

allschemes
&#10005

So what happens if we use these schemes in our combinator evolution? Here’s the result for the terminating example s[s][s][s[s[s]]][k][s] above, always keeping only the first match with a given sorting criterion, and at each step showing where the matches were applied:

allschemes
&#10005

Here now are the results if we allow the first up to 2 matches from each sorted list to be applied:

allschemes
&#10005

Here are the results for leftmost outermost, allowing up to between 1 and 8 updates at each step:

CloudGet
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Table[
 CombinatorEvolutionPlot[
  CombinatorFixedPointList[
   s[s][s][s[s[s]]][k][s], {"Leftmost", "Outermost", k}], 
  "SizeAndMatches", 
  "EvaluationScheme" -> {"Leftmost", "Outermost", k}, 
  PlotRange -> All, ImageSize -> 150], {k, 8}]

And here’s a table of the “time to reach the fixed point” with different evaluation schemes, allowing different numbers of updates at each step:

allschemes
&#10005

Not too surprisingly, the time to reach the fixed point always decreases when the number of updates that can be done at each step increases.

For the somewhat simpler terminating example s[s[s[s]]][s][s][s] (S(S(SS))SSS) we can explicitly look at the updates on the trees at each step for each of the different schemes:

allschemes
&#10005

OK, so what about a combinator expression that does not terminate? What will these different evaluation schemes do? Here are the results for s[s[s]][s][s][s][s] (S(SS)SSSS) over the course of 50 steps, in each case using only one match at each step:

allschemes
&#10005

And here is what happens if we allow successively more matches (selected in leftmost-outermost order) to be used at each step:

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Table[
 CombinatorEvolutionPlot[
  CombinatorEvolveList[s[s[s]][s][s][s][s], 
   50, {"Leftmost", "Outermost", k}], "SizeAndMatches", 
  "EvaluationScheme" -> {"Leftmost", "Outermost", k}, 
  PlotRange -> All, ImageSize -> 150], {k, 4}]

Not surprisingly, the more matches allowed, the faster the growth in size (and, yes, looking at pictures like this suggests studying a kind of “continuum limit” or “mean field theory” for combinator evolution):

CombinatorEvolveList
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; ListStepPlot[
 Table[Callout[
   LeafCount /@ 
    CombinatorEvolveList[s[s[s]][s][s][s][s], 
     50, {"Leftmost", "Outermost", n}], n], {n, 10}], 
 ScalingFunctions -> "Log", Frame -> True]

It’s interesting to look at the ratios of sizes on successive steps for different updating schemes (still for s[s[s]][s][s][s][s]). Some schemes lead to much more “obviously simple” long-term behavior than others:

In fact, just changing the number of allowed matches (here for leftmost outermost) can have similar effects:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Table[
 ListStepPlot[
  Ratios[LeafCount /@ 
    CombinatorEvolveList[s[s[s]][s][s][s][s], 
     100, {"Leftmost", "Outermost", k}]], PlotRange -> All, 
  ImageSize -> 150, Frame -> True, Filling -> Axis, 
  FillingStyle -> $PlotStyles["ListPlot", "FillingStyleLight"], 
  PlotStyle -> $PlotStyles["ListPlot", "PlotStyle"]], {k, 4}]

What about for other combinator expressions? Different updating schemes can lead to quite different behavior. Here’s s[s[s]][s][s[s[s]]][k] (S(SS)S(S(SS))K):

And here’s s[s[s]][s][s][s][s[k]] (S(SS)SSS(SK))—which for some updating schemes gives purely periodic behavior (something which can’t happen without a k in the original combinator expression):

It’s worth noting that—at least when there are k’s involved—different updating schemes can even change whether the evaluation of a particular combinator expression ever terminates. This doesn’t happen below size 8. But at size 8, here’s what happens for example with s[s][s][s[s]][s][s][k] (SSS(SS)SSK):

For some updating schemes it reaches a fixed point (always just s[k]), but for others it gives unbounded growth. The innermost schemes are the worst in terms of “missing fixed points”: they miss them for 16 size-8 combinator expressions. But (as we mentioned earlier) leftmost outermost has the important feature that it’ll never miss a fixed point if one exists—though sometimes at the cost of taking an overly ponderous route to the fixed point.

So if one’s applying combinator-like transformation rules in practice, what’s the best scheme to use? The Wolfram Language /. (ReplaceAll) operation in effect uses a leftmost-outermost scheme—but with an important wrinkle: instead of using just one match, it uses as many non-overlapping matches as possible.
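As a quick, concrete illustration, here are the combinator rules written as ordinary Wolfram Language rules (the same rules used throughout), together with a tiny example of /. picking up two non-overlapping matches in a single pass, and of //. repeating such passes until nothing changes:

rules = {s[x_][y_][z_] -> x[z][y[z]], k[x_][y_] -> x};

k[a][b][k[c][d]] /. rules      (* both K redexes are rewritten in one pass, giving a[c] *)

s[k][k][s[k][k][a]] //. rules  (* //. repeats /. to a fixed point; S[K][K] acts like the identity, so this gives a *)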

Consider again the combinator expression:

s[s[s][s][s[s][k[k][s]][s]]][s][s][s[k[s][k]][k][s]]

In leftmost-outermost order the possible matches here are:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Keys[
 CombinatorMatches[
  s[s[s][s][s[s][k[k][s]][s]]][s][s][s[k[s][k]][k][s]], {"Leftmost", 
   "Outermost"}]]

But the point is that the match at position {0} overlaps the match at position {0,0,0,1} (i.e. it is a tree ancestor of it). And in general the possible match positions form a partially ordered set, here:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"];
						ReverseGraph[
 RelationGraph[
  ListStrictPrefixQ, {{0, 0, 0, 1, 1, 0, 1}, {0, 0, 0, 1, 1}, {0, 0, 
    0, 1}, {0}, {1, 0, 0, 1}, {1}}, VertexLabels -> Automatic]]
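(ListStrictPrefixQ here comes from the Programs.wl file loaded above; presumably it just tests whether one position list is a proper prefix of another, something like this assumed definition:)

listStrictPrefixQ[a_List, b_List] := Length[a] < Length[b] && Take[b, Length[a]] === a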

One possibility is always to use matches at the “bottom” of the partial order—or in other words, the very innermost matches. These matches inevitably can’t overlap, so they can always be done in parallel, yielding a “parallel innermost” evaluation scheme that is potentially faster (though it runs the risk of not finding a fixed point at all).
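Here is a minimal sketch of what one such “parallel innermost” step could look like (hypothetical helper names, not the notebook’s own functions): keep only the matches that have no other match anywhere beneath them, then rewrite all of them at once:

rules = {s[x_][y_][z_] -> x[z][y[z]], k[x_][y_] -> x};

matchPositions[expr_] := Position[expr, s[_][_][_] | k[_][_]]

innermostPositions[expr_] := With[{pos = matchPositions[expr]},
  Select[pos, Function[p,
    NoneTrue[DeleteCases[pos, p], Length[#] > Length[p] && Take[#, Length[p]] === p &]]]]

parallelInnermostStep[expr_] := ReplacePart[expr,
  (# -> Replace[Extract[expr, #], rules] &) /@ innermostPositions[expr]]

parallelInnermostStep[s[s[s]][s][k][s]]   (* one step: s[s][k][s[k]][s] *)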

What /. does is effectively to use (in leftmost order) all the matches that appear at the “top” of the partial order. And the result is again typically faster overall updating. In the s[s][s][s[s]][s][s][k] example above, repeatedly applying /. (which is what //. does) finds the fixed point in 23 steps, while it takes ordinary one-replacement-at-a-time leftmost-outermost updating 30 steps—and parallel innermost doesn’t terminate in this case:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
Labeled[ListStepPlot[#[[1]], PlotRange -> All, ImageSize -> 150, 
    Frame -> True, Filling -> Axis, 
    FillingStyle -> $PlotStyles["ListPlot", "FillingStyleDark"], 
    PlotStyle -> $PlotStyles["ListPlot", "PlotStyle"]], 
   Text[Style[#[[2]], Italic, 12]]] & /@ {{LeafCount /@ 
    NestList[# /. {s[x_][y_][z_] -> x[z][y[z]], k[x_][y_] -> x} &, 
     s[s][s][s[s]][s][s][k], 35], 
   "Wolfram Language /."}, {LeafCount /@ 
    CombinatorEvolveList[s[s][s][s[s]][s][s][k], 35], 
   "leftmost outermost"}}

For s[s][s][s[s[s]]][k][s] (SSS(S(SS))KS) parallel innermost does terminate, getting a result in 27 steps compared to 26 for /.—but with somewhat smaller intermediate expressions:

For a case in which there isn’t a fixed point, however, /. will often lead to more rapid growth. For example, with s[s[s]][s][s][s][s] (S(SS)SSSS) it basically gives pure exponential 2^(t/2) growth (and eventually so does parallel innermost):

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
Labeled[ListStepPlot[#[[1]], PlotRange -> All, ImageSize -> 150, 
    Frame -> True, Filling -> Axis, 
    FillingStyle -> $PlotStyles["ListPlot", "FillingStyleDark"], 
    PlotStyle -> $PlotStyles["ListPlot", "PlotStyle"]], 
   Text[Style[#[[2]], Italic, 12]]] & /@ {{LeafCount /@ 
    NestList[# /. {s[x_][y_][z_] -> x[z][y[z]], k[x_][y_] -> x} &, 
     s[s[s]][s][s][s][s], 35], 
   "Wolfram Language /."}, {LeafCount /@ 
    CombinatorEvolveList[s[s[s]][s][s][s][s], 35], 
   "leftmost outermost"}, {LeafCount /@ 
    NestList[
     CombinatorStep[Automatic, #, {"Parallel", "Innermost"}] &, 
     s[s[s]][s][s][s][s], 35], "parallel innermost"}}

In A New Kind of Science I gave a bunch of results for combinators with /. updating, finding much of the same kind of behavior for “combinators in the wild” as we’ve seen here.

But, OK, so we’ve got the updating scheme of /. (and its repeated version //.), and we’ve got the updating scheme for automatic evaluation (with and without functions with “hold” attributes). But are there other updating schemes that might also be useful, and if so, how might we parametrize them?

I’ve wondered about this since I was first designing SMP—the forerunner to Mathematica and the Wolfram Language—more than 40 years ago. One place where the issue comes up is in automatic evaluation of recursively defined functions. Say one has a factorial function defined by:

f[1] = 1; f[n_] := n f[n - 1]

What will happen if one asks for f[0]? With the most obvious depth-first evaluation scheme, one will evaluate f[-1], f[-2], etc. forever, never noticing that everything is eventually going to be multiplied by 0, and so the result will be 0. If instead of automatic evaluation one was using //. all would be well—because it’s using a different evaluation order:

Clear[f]; f[0] //. f[n_] -> n f[n - 1]

Let’s consider instead the recursive definition of Fibonacci numbers (to make this more obviously “combinator like” we could for example use Construct instead of Plus):

f[1] = f[2] = 1; f[n_] := f[n - 1] + f[n - 2]

If you ask for f[7] you’re essentially going to be evaluating this tree:

But the question is: how do you do it? The most obvious approach amounts to doing a depth-first scan of the tree—and doing about ϕ^n computations. But if you were to repeatedly use /. instead, you’d be doing more of a breadth-first scan, and it’d take more like O(n^2) computations:

FixedPointList[# /. {f[1] -> 1, f[2] -> 1, 
    f[n_] -> f[n - 1] + f[n - 2]} &, f[7]]
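For comparison, here is a minimal (and purely illustrative) way to count how many times the general recursive rule actually fires under ordinary depth-first evaluation, using a fresh symbol g and a hypothetical counter:

Clear[g]; calls = 0;
g[1] = g[2] = 1;
g[n_] := (calls++; g[n - 1] + g[n - 2]);
{g[20], calls}   (* the number of rule firings grows roughly like GoldenRatio^n *)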

But how can one parametrize these different kinds of behavior? From our modern perspective in the Wolfram Physics Project, it’s like picking different foliations—or different reference frames—in what amount to causal graphs that describe the dependence of one result on others. In relativity, there are some standard reference frames—like inertial frames parametrized by velocity. But in general it’s not easy to “describe reasonable reference frames”, and we’re typically reduced to just talking about named metrics (Schwarzschild, Kerr, …), much like here we’re talking about “named updating orders” (“leftmost innermost”, “outermost rightmost”, …).

But back in 1980 I did have an idea for at least a partial parametrization of evaluation orders. Here it is from section 3.1 of the SMP documentation:

SMP documentation

What I called a “projection” then is what we’d call a function now; a “filter” is what we’d now call an argument. But basically what this is saying is that usually the arguments of a function are evaluated (or “simplified” in SMP parlance) before the function itself is evaluated. (Though note the ahead-of-its-time escape clause about “future parallel-processing implementations” which might evaluate arguments asynchronously.)

But here’s the funky part: functions in SMP also had Smp and Rec properties (roughly, modern “attributes”) that determined how recursive evaluation would be done. And in a first approximation, the concept was that Smp would choose between innermost and outermost, but then in the innermost case, Rec would say how many levels to go before “going outermost” again.

And, yes, nobody (including me) seems to have really understood how to use these things. Perhaps there’s a natural and easy-to-understand way to parametrize evaluation order (beyond the /. vs. automatic evaluation vs. hold attributes mechanism in Wolfram Language), but I’ve never found it. And it’s not encouraging here to see all the complexity associated with different updating schemes for combinators.

By the way, it’s worth mentioning that there is always a way to completely specify evaluation order: just do something like procedural programming, where every “statement” is effectively numbered, and there can be explicit Goto’s that say what statement to execute next. But in practice this quickly gets extremely fiddly and fragile—and one of the great values of functional programming is that it streamlines things by having “execution order” just implicitly determined by the order in which functions get evaluated (yes, with things like Throw and Catch also available).

And as soon as one’s determining “execution order” by function evaluation order, things are immediately much more extensible: without having to specify anything else, there’s automatically a definition of what to do, for example, when one gets a piece of input with more complex structure. If one thinks about it, there are lots of complex issues about when to recurse through different parts of an expression versus when to recurse through reevaluation. But the good news is that at least the way the Wolfram Language is designed, things in practice normally “just work” and one doesn’t have to think about them.

Combinator evaluation is one exception, where, as we have seen, the details of evaluation order can have important effects. And presumably this dependence is in fact connected to why it’s so hard to understand how combinators work. But studying combinator evaluation once again inspires one (or at least me) to try to find convenient parametrizations for evaluation order—perhaps now using ideas and intuition from physics.

The World of the S Combinator

In the definitions of the combinators s and k

{s[x_][y_][z_] -> x[z][y[z]], k[x_][y_] -> x}

S is basically the one that “builds things up”, while K is the one that “cuts things down”. And historically, in creating and proving things with combinators, it was important to have the balance of both S and K. But what we’ve seen above makes it pretty clear that S alone can already do some pretty complicated things.

So it’s interesting to consider the minimal case of combinators formed solely from s. For size n (i.e. LeafCount equal to n), there are

CatalanNumber[n - 1] == Binomial[2 n - 2, n - 1]/n

(growing roughly like 4^n for large n) possible such combinators, each of which can be characterized simply in terms of the sequence of bracket openings and closings it involves.
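These S-only expressions are easy to enumerate explicitly; the Groupings idiom used later in this piece builds all the possible binary “application trees” over n copies of s:

Groupings[Table[s, 4], Construct -> 2]

Length[Groupings[Table[s, 8], Construct -> 2]] == CatalanNumber[7]   (* 429 expressions of size 8 *)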

Some of these combinators terminate in a limited time, but above size 7 there are ones that do not:

And already there’s something weird: the fraction of nonterminating combinator expressions steadily increases with size, then precipitously drops, then starts climbing again:

But let’s look first at the combinator expressions whose evaluation does terminate. And, by the way, when we’re dealing with S alone, there’s no possibility of some evaluation schemes terminating and others not: they either all terminate, or none do. (This result was established in the 1930s from the fact that the S combinator—unlike K—in effect “conserves variables”, making it an example of the so-called λI calculus.)

With leftmost-outermost evaluation, here are the halting time distributions, showing roughly exponential falloff with gradual broadening:

And here are the (leftmost-outermost) “champions”—the combinator expressions that survive longest (with leftmost-outermost evaluation) before terminating:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Text[
 Grid[Prepend[
   Append[#, CombinatorTraditionalForm[Last[#]]] & /@ {{2, 0, 
      s[s]}, {3, 0, s[s][s]}, {4, 1, s[s][s][s]}, {5, 2, 
      s[s][s][s][s]}, {6, 4, s[s][s][s][s][s]}, {7, 15, 
      s[s[s[s]]][s][s][s]}, {8, 15, s[s[s[s[s]]][s][s][s]]}, {9, 86, 
      s[s[s]][s[s]][s[s]][s][s]}, {10, 1109, 
      s[s[s][s]][s[s]][s][s][s][s]}, {11, 1109, 
      s[s[s[s][s]][s[s]][s][s][s][s]]}, {12, 1444, 
      s[s[s]][s[s]][s[s][s][s][s][s]][s]}, {13, 6317, 
      s[s[s]][s[s]][s[s][s][s][s][s][s]][s]}, {14, 23679, 
      s[s[s]][s[s]][s[s][s][s][s][s][s][s]][s]}, {15, 131245, 
      s[s[s]][s[s]][s[s][s][s][s][s][s][s][s]][s]}, {16, 454708, 
      s[s[s]][s[s]][s[s][s][s][s][s][s][s][s][s]][s]} }, 
   Style[#, Italic] & /@ {"size", "max steps", "expression", ""}], 
  Frame -> All, 
  Background -> {{GrayLevel[0.9]}, {GrayLevel[0.9]}, None}]]

The survival (AKA halting) times grow roughly exponentially with size—and notably much slower than what we saw in the SK case above:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
						ListStepPlot[
 Transpose[{Range[4, 16], {1, 2, 4, 15, 15, 86, 1109, 1109, 1444, 
    6317, 23679, 131245, 454708}}], Center, ScalingFunctions -> "Log",
  AspectRatio -> .4, Frame -> True, Filling -> Axis, 
 FillingStyle -> $PlotStyles["ListPlot", "FillingStyleLight"], 
 PlotStyle -> $PlotStyles["ListPlot", "PlotStyle"], ImageSize -> 310]

How do the champions actually behave? Here’s what happens for a sequence of sizes:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
CombinatorEvolutionPlot[CombinatorFixedPointList[#], "SizeAndMatches",
    ImageSize -> 250,
   Epilog ->
    Text[Style[First[#], 
      Directive[FontSize -> 12, GrayLevel[0.25], 
       FontFamily -> "Source Sans Pro"]], 
     Scaled[{.25, 1}], {1.5, 1.4}]] & /@ {{"size 8", 
   s[s[s[s[s]]][s][s][s]]}, {"size 9", 
   s[s[s]][s[s]][s[s]][s][s]}, {"size 10", 
   s[s[s][s]][s[s]][s][s][s][s]}, {"size 11", 
   s[s[s[s][s]][s[s]][s][s][s][s]]}}

There’s progressive increase in size, and then splat: the evolution terminates. Looking at the detailed behavior (here for size 9 with a “right-associative rendering”) shows that what’s going on is quite systematic:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorEvolutionPlot[
 CombinatorFixedPointList[
  s[s[s]][s[s]][s[s]][s][s]], "ArrayPlotRightAssociative", 
 AspectRatio -> .3, "IncludeUpdateHighlighting" -> False]

The differences again reflect the systematic character of the behavior:

And it seems that what’s basically happening is that the combinator is acting as a kind of digital counter that’s going through an exponential number of steps—and ultimately building a very regular tree structure:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorExpressionGraph[
 CombinatorFixedPoint[s[s[s]][s[s]][s[s][s][s][s][s]][s]], 
 AspectRatio -> .25, "ShowVertexLabels" -> False, VertexSize -> Large]

By the way, even though the final state is the same, the evolution is quite different with different evaluation schemes. And for example our “leftmost-outermost champions” actually terminate much faster with depth-first evaluation:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"];
						CombinatorEvolutionPlot[
   CombinatorFixedPointList[#, {"Leftmost", "Innermost", 1}], 
   "SizeAndMatches", 
   "EvaluationScheme" -> {"Leftmost", "Innermost", 1}, 
   ImageSize -> 190, PlotRange -> All,
   Epilog ->
    Text[Style[First[#], 
      Directive[FontSize -> 12, GrayLevel[0.25], 
       FontFamily -> "Source Sans Pro"]], 
     Scaled[{.25, 1}], {1, 1.4}]] & /@ {{"size 8", 
   s[s[s[s[s]]][s][s][s]]}, {"size 9", 
   s[s[s]][s[s]][s[s]][s][s]}, {"size 10", 
   s[s[s][s]][s[s]][s][s][s][s]}}

Needless to say, there can be different depth-first (AKA leftmost-innermost) champions, although—somewhat surprisingly—some turn out to be the same (but not sizes 8, 12, 13):

We can get a sense of what happens with all possible evaluation schemes if we look at the multiway graph. Here is the result for the size-8-leftmost-outermost champion s[s[s[s]]][s][s][s]:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
Function[sch, 
  Labeled[Graph[
    MWCombinatorGraphMinimal[s[s[s[s]]][s][s][s], 15, 
     CombinatorEvolveList[#1, #2, sch] &, NodeSizeMultiplier -> .4], 
    AspectRatio -> 1, ImageSize -> 250], 
   Text[Style[Row[ToLowerCase /@ sch, Spacer[1]], Italic, 
     12]]]] /@ {{"Leftmost", "Outermost"}, {"Leftmost", "Innermost"}}

The number of expressions at successive levels in the multiway graph starts off growing roughly exponentially, but after 12 steps it rapidly drops—eventually yielding a finite graph with 74 nodes (leftmost outermost is the “slowest” evaluation scheme—taking the maximum 15 steps possible):

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; ListStepPlot[
 Length /@ 
  ResourceFunction["MultiwayCombinator"][{s[x_][y_][z_] -> x[z][y[z]],
     k[x_][y_] -> x}, s[s[s[s]]][s][s][s], 16], Center, Frame -> True,
  AspectRatio -> .4, Frame -> True, Filling -> Axis, 
 FillingStyle -> $PlotStyles["ListPlot", "FillingStyleLight"], 
 PlotStyle -> $PlotStyles["ListPlot", "PlotStyle"], ImageSize -> 280]

Even for the size-9 champion the full multiway graph is too large to construct explicitly. After 15 steps the number of nodes has reached 6598, and seems to still be growing rapidly—even though after at most 86 steps all “dangling ends” must have resolved, and the system must reach its fixed point:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Graph[
 MWCombinatorGraphMinimal[s[s[s]][s[s]][s[s]][s][s], 12, 
  NodeSizeMultiplier -> 1.5], AspectRatio -> 1]

What happens with s combinator expressions that do not terminate? We already saw above some examples of the kind of growth in size one observes (say with leftmost-outermost evaluation). Here are examples with roughly exponential behavior, with differences between successive steps shown on a log scale:

And here are examples of differences shown on a linear scale:

Sometimes there are fairly long transients, but what’s notable is that among all the 8629 infinite-growth combinator expressions up to size 11 there are none whose evolution seems to show long-term irregularity in overall size. Of course, something like rule 30 also doesn’t show irregularity in overall size; one has to look “inside” to see complex behavior—and difficulties of visualization make that hard to systematically do in the case of combinators.

But looking at the pictures above there seem to be a “limited number of ways” that combinator expressions grow without bound. Sometimes it’s rather straightforward to see how the infinite growth happens. Here’s a particularly “pure play” example: the size-9 case s[s[s[s]]][s[s[s]]][s[s]] (S(S(SS))(S(SS))(SS)) which evolves the same way with all evaluation schemes (in the pictures, the root of the match at each step is highlighted):

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
CombinatorExpressionGraph[#, "ShowVertexLabels" -> False, 
   "UpdateHighlighting" -> {"Leftmost", "Outermost", 1}, 
   "MatchHighlighting" -> True, ImageSize -> {Automatic, 120}] & /@ 
 CombinatorEvolveList[s[s[s[s]]][s[s[s]]][s[s]], 6]

Looking at the subtree “below” each match we see

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
CombinatorExpressionGraph[#, "ShowVertexLabels" -> False, 
   "MatchHighlighting" -> True, 
   ImageSize -> {Automatic, 60}] & /@ (First[
     Extract[#, {First[Keys[CombinatorMatches[#]]]}]] & /@ 
   CombinatorEvolveList[s[s[s[s]]][s[s[s]]][s[s]], 20])

and it is clear that there is a definite progression which will keep going forever, leading to infinite growth.

But if one looks at the corresponding sequence of subtrees for a case like the smallest infinite-growth combinator expression s[s][s][s[s]][s][s] (SSS(SS)SS), it’s less immediately obvious what’s going on:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
CombinatorExpressionGraph[#, "ShowVertexLabels" -> False, 
   "UpdateHighlighting" -> "Nodes", 
   "EvaluationScheme" -> {"Leftmost", "Outermost", 1}, 
   ImageSize -> {Automatic, 60}] & /@ (First[
     Extract[#, {First[Keys[CombinatorMatches[#]]]}]] & /@ 
   CombinatorEvolveList[s[s][s][s[s]][s][s], 20])

But there’s a rather remarkable result from the end of the 1990s that gives one a way to “evaluate” combinator expressions, and tell whether they’ll lead to infinite growth—and in particular to be able to say directly from an initial combinator expression whether it’ll continue evolving forever, or will reach a fixed point.

One starts by writing a combinator expression like s[s[s[s]]][s[s[s]]][s[s]] (S(S(SS))(S(SS))(SS)) in an explicitly “functional” form:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"];
					FunctionToApplication[s[s[s[s]]][s[s[s]]][s[s]]] /. Application -> f

Then one imagines f[x,y] as being a function with explicit (say, integer) values. One replaces s by some explicit value (say an integer), then defines values for f[1,1], f[1,2], etc.

As a first example, let’s say that we take s = 1 and f[x_,y_]=x+y. Then we can “evaluate” the combinator expression above as

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; SCombinatorAutomatonTreeGeneral[
 s[s[s[s]]][s[s[s]]][s[s]], Application[x_, y_] -> x + y, 1, 
 VertexSize -> .6]

and in this case the value at the root just counts the total size (i.e. LeafCount).
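Here is a minimal way to check that directly (a sketch, not the SCombinatorAutomatonTreeGeneral function used above): repeatedly replace every application x[y] by x + y, then set s to 1, and compare with LeafCount:

treeValue[expr_] := (expr //. x_[y_] :> x + y) /. s -> 1

{treeValue[s[s[s[s]]][s[s[s]]][s[s]]], LeafCount[s[s[s[s]]][s[s[s]]][s[s]]]}   (* both give 9 *)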

But by changing f one can probe other aspects of the combinator expression tree. And what was discovered in 2000 is that there’s a complete way to test for infinite growth by setting up 39 possible values, and making f[x,y] be a particular (“tree automaton”) “multiplication table” for these values:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
gridColors[x_] := 
 Blend[{Hue[0.1, 0.89, 0.984], Hue[0.16, 0.51, 0.984], Hue[
   0.04768041237113402, 0, 0.984]}, x]
Grid[MapIndexed[
  If[#2[[1]] === 1 || #2[[2]] === 1,  
    Item[Style[#1, 9, Bold, GrayLevel[.35]], 
     Background -> GrayLevel[.9]], 
    If[#1 == 38, 
     Item["", Background -> RGBColor[0.984, 0.43, 0.208], 
      FrameStyle -> Darker[RGBColor[0.984, 0.43, 0.208], .2]], 
     Item[Style[#1, 9, GrayLevel[0, .6]], 
      Background -> gridColors[(38 - #1)/38] , 
      FrameStyle -> Darker[RGBColor[0.984, 0.43, 0.208], .2]]]] &, 
  Prepend[MapIndexed[Flatten[Prepend[#1, (#2 - 1)]] &, 
    Table[i\[Application]j /. maintablesw, {i, 0, 38}, {j, 0, 38}]], 
   Flatten[Prepend[Range[0, 38], "\[Application]"]]], {2}], 
 Spacings -> {.25, 0}, ItemSize -> {1, 1}, Frame -> All, 
 FrameStyle -> GrayLevel[.6], BaseStyle -> "Text"]

Bright red (value 38) represents the presence of an infinite growth seed—and once one exists, f makes it propagate up to the root of the tree. And with this setup, if we replace s by the value 0, the combinator expression above can be “evaluated” as:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; SCombinatorAutomatonTree[s[s[s[s]]][s[s[s]]][s[s]], 
 VertexSize -> .5]

At successive steps in the evolution we get:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
SCombinatorAutomatonTree[#, VertexSize -> .8, 
   ImageSize -> {Automatic, 140}] & /@ 
 CombinatorEvolveList[s[s[s[s]]][s[s[s]]][s[s]], 5]

Or after 8 steps:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; SCombinatorAutomatonTree[
 CombinatorEvolve[s[s[s[s]]][s[s[s]]][s[s]], 8], VertexSize -> .8]

The “lowest 38” is always at the top of the subtree where the match occurs, serving as a “witness” of the fact that this subtree is an infinite growth seed.

Here are some sample size-7 combinator expressions, showing how the two that lead to infinite growth are identified:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
Labeled[SCombinatorAutomatonTree[#, VertexSize -> .5, 
    ImageSize -> {Automatic, 200}], 
   Style[Text[#], 12]] & /@ {s[s][s][s][s][s][s], s[s[s][s][s][s]][s],
   s[s[s]][s][s][s][s], s[s][s][s[s]][s][s], s[s[s[s]][s[s]]][s], 
  s[s][s[s[s][s]]][s], s[s[s[s]]][s[s[s]]]}

If we were dealing with combinator expressions involving both S and K we know that it’s in general undecidable whether a particular expression will halt. So what does it mean that there’s a decidable way to determine whether an expression involving only S halts?

One might assume it’s a sign that S alone is somehow computationally trivial. But there’s more to this issue. In the past, it has often been thought that a “computation” must involve starting with some initial (“input”) state, then ending up at a fixed point corresponding to a final result. But that’s certainly not how modern computing in practice works. The computer and its operating system do not completely stop when a particular computation is finished. Instead, the computer keeps running, but the user is given a signal to come and look at something that provides the output for the computation.

There’s nothing fundamentally different about how computation universality works in a setup like this; it’s just a “deployment” issue. And indeed the simplest possible examples of universality in cellular automata and Turing machines have been proved this way.

So how might this work for S combinator expressions? Basically any sophisticated computation has to live on top of an infinite combinator growth process. Or, put another way, the computation has to exist as some kind of “transient” of potentially unbounded length, that in effect “modulates” the infinite growth “carrier”.

One would set up a program by picking an appropriate combinator expression from the infinite collection that lead to infinite growth. Then the evolution of the combinator expression would “run” the program. And one would use some computationally bounded process (perhaps a bounded version of a tree automaton) to identify when the result of the computation is ready—and one would “read it out” by using some computationally bounded “decoder”.

My experience in the computational universe—as captured in the Principle of Computational Equivalence—is that once the behavior of a system is not “obviously simple”, the system will be capable of sophisticated computation, and in particular will be computation universal. The S combinator is a strange and marginal case. At least in the ways we have looked at it here, its behavior is not “obviously simple”. But we have not quite managed to identify things like the kind of seemingly random behavior that occurs in a system like rule 30, that are a hallmark of sophisticated computation, and probably computation universality.

There are really two basic possibilities. Either the S combinator alone is capable of sophisticated computation, and there is, for example, computational irreducibility in determining the outcome of a long s combinator evolution. Or the S combinator is fundamentally computationally reducible—and there is some approach (and maybe some new direction in mathematics) that “cracks it open”, and allows one to readily predict everything that an S combinator expression will do.

I’m not sure which way it’s going to go—although my almost-uniform experience over the last four decades has been that when I think some system is “too simple” to “do anything interesting” or show sophisticated computation, it eventually proves me wrong, often in bizarre and unexpected ways. (In the case of the S combinator, a possibility—like I found for example in register machines—is that sophisticated computation might first reveal itself in very subtle effects, like seemingly random off-by-one patterns.)

But whatever happens, it’s amazing that 100 years after the invention of the S combinator there are still such big mysteries about it. In his original paper, Moses Schönfinkel expressed his surprise that something as simple as S and K were sufficient to achieve what we would now call universal computation. And it will be truly remarkable if in fact one can go even further, and S alone is sufficient: a minimal example of universal computation hiding in plain sight for a hundred years.

(By the way, in addition to ordinary “deterministic” combinator evolution with a particular evaluation scheme, one can also consider the “nondeterministic” case corresponding to all possible paths in the multiway graph. And in that case there’s a question of categorizing infinite graphs obtained by nonterminating S combinator expressions—perhaps in terms of transfinite numbers.)

Causal Graphs and the Physicalization of Combinators

Not long ago one wouldn’t have had any reason to think that ideas from physics would relate to combinators. But our Wolfram Physics Project has changed that. And in fact it looks as if methods and intuition from our Physics Project—and the connections they make to things like relativity—may give some interesting new insights into combinators, and may in fact make their operation a little less mysterious.

In our Physics Project we imagine that the universe consists of a very large number of abstract elements (“atoms of space”) connected by relations—as represented by a hypergraph. The behavior of the universe—and the progression of time—is then associated with repeated rewriting of this hypergraph according to a certain set of (presumably local) rules.

It’s certainly not the same as the way combinators work, but there are definite similarities. In combinators, the basic “data structure” is not a hypergraph, but a binary tree. But combinator expressions evolve by repeated rewriting of this tree according to rules that are local on the tree.

There’s a kind of intermediate case that we’ve often used as a toy model for aspects of physics (particularly quantum mechanics): string substitution systems. A combinator expression can be written out “linearly” (say as s[s][s][s[s[s]]][k][s]), but really it’s tree-structured and hierarchical. In a string substitution system, however, one just has plain strings, consisting of sequences of characters, without any hierarchy. The system then evolves by repeatedly rewriting the string by applying some local string substitution rule.

For example, one could have a rule like {"A" -> "BBB", "BB" -> "A"}. And just like with combinators, given a particular string—like "BBA"—there are different possible choices about where to apply the rule. And—again like with combinators—we can construct a multiway graph to represent all possible sequences of rewritings:

Graph[ResourceFunction["MultiwaySystem"][{"A" -> "BBB", "BB" -> "A"}, 
  "BBA", 5, "StatesGraph"], AspectRatio -> 1]

And again as with combinators we can define a particular “evaluation order” that determines which of the possible updates to the string to apply at each step—and that defines a path through the multiway graph.
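At the first step, for instance, the possible updates to choose between are just the two possible single rewrites of "BBA":

StringReplaceList["BBA", {"A" -> "BBB", "BB" -> "A"}]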

For strings there aren’t really the same notions of “innermost” and “outermost”, but there are “leftmost” and “rightmost”. Leftmost updating in this case would give the evolution history

NestList[StringReplace[#, {"A" -> "BBB", "BB" -> "A"}, 
   1] &, "BBA", 10]

which corresponds to the path:

With[{g = 
   Graph[ResourceFunction["MultiwaySystem"][{"A" -> "BBB", 
      "BB" -> "A"}, "BBA", 5, "StatesGraph"], AspectRatio -> 1]}, 
 HighlightGraph[g, 
  Style[Subgraph[g, 
    NestList[StringReplace[#, {"A" -> "BBB", "BB" -> "A"}, 1] &, 
     "BBA", 10]], Thick, RGBColor[0.984, 0.43, 0.208]]]]

Here’s the underlying evolution corresponding to that path, with the updating events indicated in yellow:

ResourceFunction["SubstitutionSystemCausalPlot"][
 ResourceFunction["SubstitutionSystemCausalEvolution"][{"A" -> "BBB", 
   "BB" -> "A"}, "BBA", 8, "First"], "CellLabels" -> True, 
 "ColorTable" -> {Hue[0.6296304159168616, 0.13, 0.9400000000000001], 
   Hue[0.6296304159168616, 0.07257971950090639, 0.9725480985324374, 
    1.]}, ImageSize -> 120]

But now we can start tracing the “causal dependence” of one event on another. What characters need to have been produced as “output” from a preceding event in order to provide “input” to a new event? Let’s look at a case where we have a few more events going on:

ResourceFunction["SubstitutionSystemCausalPlot"][
 BlockRandom[SeedRandom[33242]; 
  ResourceFunction["SubstitutionSystemCausalEvolution"][{"A" -> "BBB",
     "BB" -> "A"}, "BBA", 8, {"Random", 3}]], "CellLabels" -> True, 
 "ColorTable" -> {Hue[0.6296304159168616, 0.13, 0.9400000000000001], 
   Hue[0.6296304159168616, 0.07257971950090639, 0.9725480985324374, 
    1.]}, ImageSize -> 180]

But now we can draw a causal graph that shows causal relationships between events, i.e. which events have to have happened in order to enable subsequent events:

With[{gr = 
   ResourceFunction["SubstitutionSystemCausalPlot"][
    BlockRandom[SeedRandom[33242]; 
     ResourceFunction[
       "SubstitutionSystemCausalEvolution"][{"A" -> "BBB", 
       "BB" -> "A"}, "BBA", 8, {"Random", 3}]], "CausalGraph" -> True,
     "CausalGraphStyle" -> Directive[Thick, Red], 
    "ColorTable" -> {Hue[
      0.6296304159168616, 0.13, 0.9400000000000001], Hue[
      0.6296304159168616, 0.07257971950090639, 0.9725480985324374, 
       1.]}, ImageSize -> 180]},
 Prepend[gr[[2 ;; -1]],
  Replace[gr[[1]], Arrow[__] -> {}, Infinity]~Join~
   {RGBColor[0.984, 0.43, 0.208], Thickness[0.01], 
    Cases[gr, Arrow[__], Infinity]}]
 ]

And at a physics level, if we’re an observer embedded in the system, operating according to the rules of the system, all we can ultimately “observe” is the “disembodied” causal graph, where the nodes are events, and the edges represent the causal relationships between these events:

ResourceFunction["SubstitutionSystemCausalGraph"][{"A" -> "BBB", 
  "BB" -> "A"}, "BBA", 5]

So how does this relate to combinators? Well, we can also create causal graphs for those—to get a different view of “what’s going on” during combinator evolution.

There is significant subtlety in exactly how “causal dependence” should be defined for combinator systems (when is a copied subtree “different”?, etc.). Here I’ll use a straightforward definition that’ll give us an indication of how causal relationships in combinators work, but that’s going to require further refinement to fit in with other definitions we want.

Imagine we just write out combinator expressions in a linear way. Then here’s a combinator evolution:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Magnify[
 CombinatorEvolutionPlot[
  CombinatorPlot[#, "FramedMatches", 
     "EvaluationScheme" -> {"Leftmost", "Innermost", 1}] & /@ 
   CombinatorEvolveList[s[s][s][s[s[s][k]]][k], 
    36, {"Leftmost", "Innermost", 1}], "StatesDisplay"], .65]

To understand causal relationships we need to trace “what gets rewritten to what”—and which previous rewriting events a given rewriting event “takes its input from”. It’s helpful to look at the rewriting process above in terms of trees:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
CombinatorExpressionGraph[#, 
   "UpdateHighlighting" -> {"Nodes", "Subtrees"}, 
   "ShowVertexLabels" -> False, 
   "EvaluationScheme" -> {"Leftmost", "Innermost", 1}, 
   ImageSize -> {Automatic, 50}] & /@ 
 CombinatorEvolveList[s[s][s][s[s[s][k]]][k], 
  36, {"Leftmost", "Innermost", 1}]

Going back to a textual representation, we can show the evolution in terms of “states”, and the “events” that connect them. Then we can trace (in orange) what the causal relationships between the events are:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
ResourceFunction[
  "MultiwayCombinator"][{s[x_][y_][z_] :> 
    x[z][y[z]]} -> (EvaluationOrderTake[#, {"Leftmost", "Outermost", 
      3}] &), s[s][s][s[s[s][k]]][k], 6, "EvolutionCausalGraph"]

Continuing this for a few more steps we get:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
ResourceFunction[
  "MultiwayCombinator"][{s[x_][y_][z_] :> 
    x[z][y[z]]} -> (EvaluationOrderTake[#, {"Leftmost", "Outermost", 
      3}] &), s[s][s][s[s[s][k]]][
  k], 20, "EvolutionCausalGraphStructure"]

Now keeping only the causal graph, and continuing until the combinator evolution terminates, we get:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorCausalGraph[
 s[s][s][s[s[s][k]]][k], 50, {"Leftmost", "Innermost", 1}, 
 AspectRatio -> 3]

It’s interesting to compare this with a plot that summarizes the succession of rewriting events:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorEvolutionPlot[
 CombinatorFixedPointList[
  s[s][s][s[s[s][k]]][k], {"Leftmost", "Innermost", 
   1}], "SizeAndMatches", 
 "EvaluationScheme" -> {"Leftmost", "Innermost", 1}, ImageSize -> 190,
  PlotRange -> All]

So what are we actually seeing in the causal graph? Basically it’s showing us what “threads of evaluation” occur in the system. When there are different parts of the combinator expression that are in effect getting updated independently, we see multiple causal edges running in parallel. But when there’s a synchronized evaluation that affects the whole system, we just see a single thread—a single causal edge.

The causal graph is in a sense giving us a summary of the structure of the combinator evolution, with many details stripped out. And even when the size of the combinator expression grows rapidly, the causal graph can still stay quite simple. So, for example, the growing combinator s[s][s][s[s]][s][s] has a causal graph that forms a linear chain with simple “side loops” that get systematically further apart:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorCausalGraph[
 s[s][s][s[s]][s][s], 40, {"Leftmost", "Outermost", 1}, 
 AspectRatio -> 3]

Sometimes it seems that the growth dies out because different parts of the combinator system become causally disconnected from each other:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; {CombinatorCausalGraph[s[s[s]][s[s]][s[s]][s][s], 200, 
  AspectRatio -> 3], 
 Labeled[CombinatorEvolutionPlot[CombinatorEvolveList[#, 100], 
     "SizeAndMatches", ImageSize -> 190, PlotRange -> All], 
    Style[Text[#], 12]] &[s[s[s]][s[s]][s[s]][s][s]]}

Here are a few other examples:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; ParallelMap[{CombinatorCausalGraph[#, 30, 
    AspectRatio -> 3], 
   Labeled[CombinatorEvolutionPlot[CombinatorEvolveList[#, 50], 
     "SizeAndMatches", ImageSize -> 190, PlotRange -> All], 
    Style[Text[#], 12]]} &, {s[s][s][s[s[s][s]]][k], 
  s[s][s][s[s[s[k]]]][s][s[k]], s[s][s][s[s]][s][k[s]], 
  s[s][s][s[s]][s][s[k]]}]

But do such causal graphs depend on the evaluation scheme used? This turns out to be a subtle question that depends sensitively on definitions of identity for abstract expressions and their subexpressions.

The first thing to say is that combinators are confluent, in the sense that different evaluation schemes—even if they take different paths—must always give the same final result whenever the evolution of a combinator expression terminates. And closely related to this is the fact that in the multiway graph for a combinator system, any branching must be accompanied by subsequent merging.

For both string and hypergraph rewriting rules, the presence of these properties is associated with another important property that we call causal invariance. And causal invariance is precisely the property that causal graphs produced by different updating orders must always be isomorphic. (And in our model of physics, this is what leads to relativistic invariance, general covariance, objective measurement in quantum mechanics, etc.)

So is the same thing true for combinators? It’s complicated. Both string and hypergraph rewriting systems have an important simplifying feature: when you update something in them, it’s reasonable to think of the thing you update as being “fully consumed” by the updating event, with a “completely new thing” being created as a result of the event.

But with combinators that’s not such a reasonable picture. Because when there’s an updating event, say for s[x][y][z], x can be a giant subtree that you end up “just copying”, without, in a sense, “consuming” and “reconstituting”. In the case of strings and hypergraphs, there’s a clear distinction between elements of the system that are “involved in an update”, and ones that aren’t. But in a combinator system, it’s not so obvious whether nodes buried deep in a subtree that’s “just copied” should be considered “involved” or not.

There’s a complicated interplay with definitions used in constructing multiway graphs. Consider a string rewriting system. Start from a particular state and then apply rewriting rules in all possible ways:

LayeredGraphPlot[
 ResourceFunction["MultiwaySystem"][{"A" -> "AB", "BB" -> "A"}, "A", 
  5, "EvolutionGraphUnmerged"], AspectRatio -> .4]

Absent anything else, this will just generate a tree of results. But the crucial idea behind multiway graphs is that when states are identical, they should be merged, in this case giving:

Graph[ResourceFunction["MultiwaySystem"][{"A" -> "AB", "BB" -> "A"}, 
  "A", 5, "StatesGraph"], AspectRatio -> .4]

For strings it’s very obvious what “being identical” means. For hypergraphs, the natural definition is hypergraph isomorphism. What about for combinators? Is it pure tree isomorphism, or should one take into account the “provenance” of subtrees?

(There are also questions like whether one should define the nodes in the multiway graph in terms of “instantaneous states” at all, or whether instead they should be based on “causal graphs so far”, as obtained with particular event histories.)

These are subtle issues, but it seems pretty clear that with appropriate definitions combinators will show causal invariance, so that (appropriately defined) causal graphs will be independent of evaluation scheme.

By the way, in addition to constructing causal graphs for particular evolution histories, one can also construct multiway causal graphs representing all possible causal relationships both within and between different branches of history. This shows the multiway graph for the (terminating) evolution of s[s][s][s[s[k]]][k], annotated with causal edges:

ResourceFunction["MultiwayCombinator"][{s[x_][y_][z_] -> x[z][y[z]], 
  k[x_][y_] -> x}, 
 s[s][s][s[s[k]]][k], 15, "EvolutionCausalGraphStructure", 
 GraphLayout -> "LayeredDigraphEmbedding", AspectRatio -> 2]

And here’s the multiway causal graph alone in this case:

ResourceFunction["MultiwayCombinator"][{s[x_][y_][z_] -> x[z][y[z]], 
  k[x_][y_] -> x}, s[s][s][s[s[k]]][k], 15, "CausalGraphStructure", 
 AspectRatio -> 1]

(And, yes, the definitions don’t all quite line up here, so the individual instances of causal graphs that can be extracted here aren’t all the same, as causal invariance would imply.)

The multiway causal graph for s[s[s]][s][s][s][s] shows a veritable explosion of causal edges:

ResourceFunction["MultiwayCombinator"][{s[x_][y_][z_] -> x[z][y[z]], 
  k[x_][y_] -> x}, s[s[s]][s][s][s][s], 17, "CausalGraphStructure", 
 AspectRatio -> 1, GraphLayout -> "LayeredDigraphEmbedding"]

In our model of physics, the causal graph can be thought of as a representation of the structure of spacetime. Events that follow from each other are “timelike separated”. Events that can be arranged so that none are timelike separated can be considered to form a “spacelike slice” (or a “surface of simultaneity”), and to be spacelike separated. (Different foliations of the causal graph correspond to different “reference frames” and identify different sets of events as being in the same spacelike slice.)

When we’re dealing with multiway systems it’s also possible for events to be associated with different “threads of history”—and so to be branchlike separated. But in combinator systems, there’s yet another form of separation between events that’s possible—that we can call “treelike separation”.

Consider these two pairs of updating events:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; {CombinatorExpressionGraph[
  x[s[x][y][z]][y[s[x][y][z]]], "MatchHighlighting" -> True], 
 CombinatorExpressionGraph[s[x][y][s[x][y][z]], 
  "MatchHighlighting" -> True]}

In the first case, the events are effectively “spacelike separated”. They are connected by being in the same combinator expression, but they somehow appear at “distinct places”. But what about the second case? Again the two events are connected by being in the same combinator expression. But now they’re not really “at distinct places”; they’re just “at distinct scales” in the tree.

One feature of hypergraph rewriting systems is that in large-scale limits the hypergraphs they produce can behave like continuous manifolds that potentially represent physical space, with hypergraph distances approximating geometric distances. In combinator systems there is almost inevitably a kind of nested structure that may perhaps be reminiscent of scale-invariant critical phenomena and ideas like scale relativity. But I haven’t yet seen combinator systems whose limiting behavior produces something like finite-dimensional “manifold-like” space.

It’s common to see “event horizons” in combinator causal graphs, in which different parts of the combinator system effectively become causally disconnected. When combinators reach fixed points, it’s as if “time is ending”—much as it does in spacelike singularities in spacetime. But there are no doubt new “treelike” limiting phenomena in combinator systems, that may perhaps be reflected in properties of hyperbolic spaces.

One important feature of both string and hypergraph rewriting systems is that their rules are generally assumed to be somehow local, so that the future effect of any given element must lie within a certain “cone of influence”. Or, in other words, there’s a light cone which defines the maximum spacelike separation of events that can be causally connected when they have a certain timelike separation. In our model of physics, there’s also an “entanglement cone” that defines maximum branchlike separation between events.

But what about in combinator systems? The rules aren’t really “spatially local”, but they are “tree local”. And so they have a limited “tree cone” of influence, associated with a “maximum treelike speed”—or, in a sense, a maximum speed of scale change.

Rewriting systems based on strings, hypergraphs and combinator expressions all have different simplifying and complexifying features. The relation between underlying elements (“characters arranged in sequence”) is simplest for strings. The notion of what counts as the same element is simplest for hypergraphs. But the relation between the “identities of elements” is probably simplest for combinator expressions.

Recall that we can always represent a combinator expression by a DAG in which we “build up from atoms”, sharing common subexpressions all the way up:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Graph[CombinatorToDAG[s[s[s]][s][s[s]][s]], 
 VertexLabels -> Placed[Automatic, Automatic, ToString]]

But what does combinator evolution look like in this representation? Let’s start from the extremely simple case of k[x][y], which in one step becomes just x. Here’s how we can represent this evolution process in DAGs:

 CloudGet[
    "https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.\
wl"]; 
Graph[#, VertexLabels -> Placed[Automatic, Automatic, ToString], 
  GraphLayout -> "LayeredDigraphEmbedding"] & /@ 
 SKDAGList[k[x][y], 1]

The dotted line in the second DAG indicates an update event, which in this case transforms k[x][y] to the “atom” x.

Now let’s consider s[x][y][z]. Once again there’s a dotted line that signifies the evolution:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
Graph[#, VertexLabels -> Placed[Automatic, Automatic, ToString], 
   GraphLayout -> "LayeredDigraphEmbedding"] & /@ 
 SKDAGList[s[x][y][z], 1]

Now let’s add an extra wrinkle: consider not k[x][y] but s[k[x][y]]. The outer s doesn’t really do anything here. But it still has to be accounted for, in the sense that it has to be “wrapped back around” the x that comes from k[x][y] → x. We can represent that “rewrapping” process by a “tree pullback pseudoevent”, indicated by the dotted line:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
Graph[#, VertexLabels -> Placed[Automatic, Automatic, ToString], 
   GraphLayout -> "LayeredDigraphEmbedding"] & /@ 
 SKDAGList[s[k[x][y]], 1]

If a given event happens deep inside a tree, there’ll be a whole sequence of “pullback pseudoevents” that “reconstitute the tree”.

Things get quite complicated pretty quickly. Here’s the (leftmost-outermost) evolution of s[s[s]][s][k][s] to its fixed point in terms of DAGs:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; SKDAGList[s[s[s]][s][k][s], 5]

Or with labels:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Graph[Last[SKDAGList[s[s[s]][s][k][s], 5]], 
 VertexLabels -> Placed[Automatic, Automatic, ToString]]

One notable feature is that this final DAG in a sense encodes the complete history of the evolution—in a “maximally shared” way. And from this DAG we can construct a causal graph—whose nodes are derived from the edges in the DAG representing update events and pseudoevents. It’s not clear how to do this in the most consistent way—particularly when it comes to handling pseudoevents. But here’s one possible version of a causal graph for the evolution of s[s[s]][s][k][s] to its fixed point—with the yellow nodes representing events, and the gray ones pseudoevents:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Graph[DAGCausalGraph[s[s[s]][s][k][s], 5], 
 VertexSize -> .3, AspectRatio -> .6]

Combinator Expressions as Dynamical Systems

Start with all possible combinator expressions of a certain size, say involving only s. Some are immediately fixed points. But some only evolve to fixed points. So how are the possible fixed points distributed in the set of all possible combinator expressions?

For size 6 there are 42 possible combinator expressions, and all evolve to fixed points—but only 27 distinct ones. Here are results for several combinator sizes:
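(Those size-6 numbers can be checked directly; here is a sketch using Groupings to enumerate the expressions and the CombinatorFixedPoint function from the Programs.wl file used throughout:)

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"];
With[{exprs = Groupings[Table[s, 6], Construct -> 2]},
 {Length[exprs], Length[Union[CombinatorFixedPoint /@ exprs]]}]   (* per the counts above: {42, 27} *)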

As the size of the combinator expression goes up, the fraction of distinct fixed points seems to systematically go down:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; ListStepPlot[
 Table[N[Length[Union[fps[[n]]]]/Length[fps[[n]]]], {n, Length[fps]}],
  Filling -> Axis, 
 FillingStyle -> $PlotStyles["ListPlot", "FillingStyleLight"], 
 Frame -> True]

And what this shows is that combinator evolution is in a sense a “contractive” process: starting from all possible expressions, there’s only a certain “attractor” of expressions that survives. Here’s a “state transition graph” for initial expressions of size 9 computed with leftmost-outermost evaluation (we’ll see a more general version in the next section):

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/\
Programs.wl"]; 
Graph[
 With[{n = 9}, 
  ResourceFunction[
    "ParallelMapMonitored"][# -> CombinatorFixedPoint[#] &, 
   Complement[Groupings[Table[s, n], Construct -> 2], 
    CloudImport[
     StringTemplate[
       "https://www.wolframcloud.com/obj/sw-blog/Combinators/Data/S-\
NT1e4-``.wxf"][n]]]]]]

This shows the prevalence of different fixed-point sizes as a function of the size of the initial expression:

What about the cases that don’t reach fixed points? Can we somehow identify different equivalence classes of infinite combinator evolutions (perhaps analogously to the way we can identify different transfinite numbers)? In general we can look at similarities between the multiway systems that are generated, since these are always independent of updating scheme (see the next section).

But something else we can do for both finite and infinite evolutions is to consider the set of subexpressions common to different steps in the evolution—or across different evolutions. Here’s a plot of the number of copies of the ultimately most frequent subexpressions at successive steps in the (leftmost-outermost) evolution of s[s][s][s[s]][s][s] (SSS(SS)SS):

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; ListStepPlot[
 With[{evol = CombinatorEvolveList[s[s][s][s[s]][s][s], 35]}, 
  Function[c, 
    Callout[Count[#, c, {0, Infinity}, Heads -> True] & /@ evol, 
     CombinatorExpressionGraph[c, 
      "ShowVertexLabels" -> False]]] /@ {s, s[s], s[s[s]], s[s[s]][s],
     s[s[s[s]][s]], s[s[s[s]][s]][s], s[s[s]][s[s[s[s]][s]][s]], 
    s[s[s[s]][s[s[s[s]][s]][s]]], 
    s[s[s]][s[s[s[s]][s]][s]][s[s[s[s]][s[s[s[s]][s]][s]]]], 
    s[s[s]][s[s[s[s]][s]][s]][s[s[s[s]][s[s[s[s]][s]][s]]]][
     s[s[s[s]][s[s[s[s]][s]][s]]]]}], Frame -> True, ImageSize -> 800]

The largest subexpression shown here has size 29. And as the picture makes clear, most subexpressions do not appear with substantial frequency; it’s only a thin set that does.

Looking at the evolution of all possible combinator expressions up to size 8, one sees gradual “freezing out” of certain subexpressions (basically as a result of their involvement in halting), and continued growth of others:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"];
						ListStepPlot[
 With[{evols = 
    Transpose[
     CombinatorEvolveList[#, 35] & /@ 
      Flatten[Table[
        Groupings[Table[s, n], Construct -> 2], {n, 8}]]]}, 
  Function[cb, 
    Callout[Count[#, cb, {1, Infinity}, Heads -> True] & /@ evols, 
     StandardForm[Text[cb]]]] /@ {s[s[s][s]], s[s[s[s]]], s[s][s], 
    s[s[s]][s], s[s[s[s]][s]], s[s][s[s]], s, s[s[s]], s[s[s[s][s]]], 
    s[s]}], ScalingFunctions -> "Log", Frame -> True, 
 ImageSize -> 800]

In an attempt to make contact with traditional dynamical systems theory it’s interesting to try to map combinator expressions to numbers. A straightforward way to do this (particularly when one’s only dealing with expressions involving s) is to use Polish notation, which represents

s
&#10005
s[s[s]][s[s[s[s]][s]][s]][s[s[s[s]][s[s[s[s]][s]][s]]]][
 s[s[s[s]][s[s[s[s]][s]][s]]]]

as

CombinatorPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorPlot[
 s[s[s]][s[s[s[s]][s]][s]][s[s[s[s]][s[s[s[s]][s]][s]]]][
  s[s[s[s]][s[s[s[s]][s]][s]]]], "CharactersPolishNotation", 
 "UseCombinatorGlyphs" -> None]

or the binary number

Row
&#10005
Row[Flatten[
   s[s[s]][s[s[s[s]][s]][s]][s[s[s[s]][s[s[s[s]][s]][s]]]][
     s[s[s[s]][s[s[s[s]][s]][s]]]] //. 
    x_[y_] -> {\[Bullet], x, y}] /. {\[Bullet] -> 1, s -> 0}]

i.e., in decimal:

FromDigits
&#10005
FromDigits[
 Flatten[s[s[s]][s[s[s[s]][s]][s]][s[s[s[s]][s[s[s[s]][s]][s]]]][
     s[s[s[s]][s[s[s[s]][s]][s]]]] //. 
    x_[y_] -> {\[Bullet], x, y}] /. {\[Bullet] -> 1, s -> 0}, 2]
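
The cbToNumber helper used in the plots below is presumably defined in the notebook’s Programs.wl; here is a guess at a minimal version of it (an assumption, simply packaging up the Polish-notation encoding just shown):

(* sketch: write an s-only combinator expression in Polish notation, with 1 for
   application and 0 for s, and read the digits as a base-2 number *)
cbToNumber[s] = 0;
cbToNumber[expr_] := 
 FromDigits[Flatten[expr //. a_[b_] :> {1, a, b}] /. s -> 0, 2]

cbToNumber[s[s[s]]]  (* Polish form "•s•ss", i.e. 10100 in base 2, i.e. 20 *)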

Represented in terms of numbers like this, we can plot all subexpressions which arise in the evolution of s[s][s][s[s]][s][s] (SSS(SS)SS):

cbToNumber
&#10005

Making a combined picture for all combinator expressions up to size 8, one gets:

CloudGet
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; ListPlot[
 Union[Catenate[
   ResourceFunction["ParallelMapMonitored"][
    Function[expr, 
     Catenate[
      MapIndexed[(Function[u, {First[#2 - 1], u}] /@ 
          DeleteCases[
           cbToNumber /@ Level[#, {0, Infinity}, Heads -> True], 
           0]) &, CombinatorEvolveList[expr, 50]]]], 
    Flatten[Table[Groupings[Table[s, n], Construct -> 2], {n, 8}]]]]],
  ScalingFunctions -> "Log", Frame -> True]

There’s definitely some structure: one’s not just visiting every possible subexpression. But quite what the limiting form of this might be is not clear.

Another type of question to ask is what the effect of a small change in a combinator expression is on its evolution. The result will inevitably be somewhat subtle—because there is both spacelike and treelike propagation of effects in the evolution.

As one example, though, consider evolving s[s][s][s[s]][s][s] (SSS(SS)SS) for 20 steps (to get an expression of size 301). Now look at the effect of changing a single s in this expression to s[s], and then evolving the result. Here are the sizes of the expressions that are generated:

SKCombinatorLeftmostOutermostLeafCounts
&#10005
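
The plotting code for this isn’t included above, but here is a sketch of how the experiment might be set up, assuming the CombinatorEvolveList helper from Programs.wl behaves as its name suggests (which s gets changed, and the 20 follow-on steps, are arbitrary choices here):

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"];
(* sketch: evolve for 20 steps, change one s to s[s], then compare the sizes of
   the two subsequent evolutions *)
With[{base = Last[CombinatorEvolveList[s[s][s][s[s]][s][s], 20]]},
 With[{perturbed = ReplacePart[base, First[Position[base, s]] -> s[s]]},
  ListStepPlot[
   Map[LeafCount, CombinatorEvolveList[#, 20]] & /@ {base, perturbed}, 
   Frame -> True]]]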

Equality and Theorem Proving for Combinators

How do you tell if two combinator expressions are equal? It depends what you mean by “equal”. The simplest definition—that we’ve implicitly used in constructing multiway graphs—is that expressions are equal only if they’re syntactically exactly the same (say they’re both s[k][s[s]]).

But what about a more semantic definition, that takes into account the fact that one combinator expression can be transformed to another by the combinator rules? The obvious thing to say is that combinator expressions should be considered equal if they can somehow be transformed by the rules into expressions that are syntactically the same.

And so long as the combinators evolve to fixed points this is in principle straightforward to tell. Like here are four syntactically different combinator expressions that all evolve to the same fixed point, and so in a semantic sense can be considered equal:

LeafCount
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorEvolutionPlot[
 Transpose[
  Map[If[LeafCount[#] <= 2, #, Magnify[#, .8]] &, 
   Transpose[
    PadRight[
     CombinatorFixedPointList[#] & /@ {s[s][k][k][s[k]], 
       s[s[s]][s][k][k], s[s][s][k][s][k], s[k[s]][k[k]][k]}, 
     Automatic, ""]], {2}]], "StatesDisplay", Spacings -> 2]

One can think of the fixed point as representing a canonical form to which combinator expressions that are equal can be transformed. One can also think of the steps in the evolution as corresponding to steps in a proof of equality.
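
In code, for terminating expressions this notion of equality is just a comparison of canonical forms. Here is a minimal sketch using the CombinatorFixedPoint helper from Programs.wl (it only makes sense when both evolutions terminate):

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"];
(* sketch: two terminating combinator expressions are "semantically equal" exactly
   when they evolve to the same fixed point *)
fixedPointEqualQ[a_, b_] := CombinatorFixedPoint[a] === CombinatorFixedPoint[b]

fixedPointEqualQ[s[s][k][k][s[k]], s[k[s]][k[k]][k]]  (* two of the four expressions above *)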

But there’s already an issue—that’s associated with the fundamental fact that combinators are computation universal. Because in general there’s no upper bound on how many steps it can take for the evolution of a combinator expression to halt (and no general a priori way to even tell if it’ll halt at all). So that means that there’s also no upper bound on the “length of proof” needed to show by explicit computation that two combinators are equal. Yes, it might only take 12 steps to show that this is yet another combinator equal to s[k]:

CloudGet
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorEvolutionPlot[
 CombinatorFixedPointList[s[s[s]][s][s][s][k]], "StatesDisplay"]

But it could also take 31 steps (and involve an intermediate expression of size 65):

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Magnify[
 CombinatorEvolutionPlot[
  CombinatorFixedPointList[s[s[s]][s][s][s][s][k]], 
  "StatesDisplay"], .8]

We know that if we use leftmost-outermost evaluation, then any combinator expression that has a fixed point will eventually evolve to it (even though we can’t in general know how long it will take). But what about combinator expressions that don’t have fixed points? How can we tell if they’re “equal” according to our definition?

Basically we have to be able to tell if there are sequences of transformations under the combinator rules that cause the expressions to wind up syntactically the same. We can think of these sequences of transformations as being like possible paths of evolution. So then in effect what we’re asking is whether there are paths of evolution for different combinators that intersect.

But how can we characterize what possible paths of evolution might exist for all possible evaluation schemes? Well, that’s what the multiway graph does. And in terms of multiway graphs there’s then a concrete way to ask about equality (or, really, equivalence) between combinator expressions. We basically just need to ask whether there is some appropriate path between the expressions in the multiway graph.

There are lots of details, some of which we’ll discuss later. But what we’re basically dealing with is a quintessential example of the problem of theorem proving in a formal system. There are different ways to set things up. But as one example, we could take our system to define certain axioms that transform expressions. Applying these axioms in all possible ways generates a multiway graph with expressions as nodes. But then the statement that there’s a theorem that expression A is equal to expression B (in the sense that it can be transformed to it) becomes the statement that there’s a way to get from A to B in the graph—and giving a path can then be thought of as giving a proof of the theorem.

As an example, consider the combinator expressions:

s
&#10005
s[s][s[s][s[s[s]]][k]][k[s[s][s[s[s]]][k]]]
s
&#10005
s[k[s[k][s[s[s]][k]]]][k[s[s][s[s[s]]][k]]]

Constructing a multiway graph one can then find a path

MultiwayCombinator
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; With[{g = 
   ResourceFunction[
     "MultiwayCombinator"][{s[x_][y_][z_] -> x[z][y[z]], 
     k[x_][y_] -> x}, s[s][s[s][s[s[s]]][k]][k[s[s][s[s[s]]][k]]], 6, 
    "StatesGraphStructure", GraphLayout -> "LayeredDigraphEmbedding", 
    AspectRatio -> 1/2]}, 
 HighlightGraph[g, 
  Style[Subgraph[
    g, {"s[s][s[s][s[s[s]]][k]][k[s[s][s[s[s]]][k]]]", 
     "s[k[s[s][s[s[s]]][k]]][s[s][s[s[s]]][k][k[s[s][s[s[s]]][k]]]]", 
     "s[k[s[k][s[s[s]][k]]]][s[s][s[s[s]]][k][k[s[s][s[s[s]]][k]]]]", 
     "s[k[s[k][s[s[s]][k]]]][s[k][s[s[s]][k]][k[s[s][s[s[s]]][k]]]]", 
     "s[k[s[k][s[s[s]][k]]]][k[k[s[s][s[s[s]]][k]]][s[s[s]][k][k[s[s][\
s[s[s]]][k]]]]]", "s[k[s[k][s[s[s]][k]]]][k[s[s][s[s[s]]][k]]]"}], 
   Thick, RGBColor[0.984, 0.43, 0.208]]]]

which corresponds to the proof that one can get from one of these expressions to the other:

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorEvolutionPlot[{s[s][s[s][s[s[s]]][k]][
   k[s[s][s[s[s]]][k]]], 
  s[k[s[s][s[s[s]]][k]]][s[s][s[s[s]]][k][k[s[s][s[s[s]]][k]]]], 
  s[k[s[k][s[s[s]][k]]]][s[s][s[s[s]]][k][k[s[s][s[s[s]]][k]]]], 
  s[k[s[k][s[s[s]][k]]]][s[k][s[s[s]][k]][k[s[s][s[s[s]]][k]]]], 
  s[k[s[k][s[s[s]][k]]]][
   k[k[s[s][s[s[s]]][k]]][s[s[s]][k][k[s[s][s[s[s]]][k]]]]], 
  s[k[s[k][s[s[s]][k]]]][k[s[s][s[s[s]]][k]]]}, "StatesDisplay"]

In this particular case, both expressions eventually reach a fixed point. But consider the expressions:

s
&#10005
s[s[s[s[s][s]]][k]][s[s[s[s[s][s]]][k]]][k[s[s[s[s][s]]][k]]]
s
&#10005
s[s[s[s][s]]][k][s[s[s[s[s][s]]][k]][k[s[s[s[s][s]]][k]]]]

Neither of these expressions evolves to a fixed point. But there’s still a path in the (ultimately infinite) multiway graph between them

MultiwayCombinator
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; With[{g = 
   ResourceFunction[
     "MultiwayCombinator"][{s[x_][y_][z_] -> x[z][y[z]], 
     k[x_][y_] -> x}, 
    s[s[s[s[s][s]]][k]][s[s[s[s[s][s]]][k]]][k[s[s[s[s][s]]][k]]], 7, 
    "StatesGraphStructure"]}, 
 HighlightGraph[g, 
  Style[Subgraph[
    g, {"s[s[s[s[s][s]]][k]][s[s[s[s[s][s]]][k]]][k[s[s[s[s][s]]][k]]]\
", "s[s[s[s][s]]][k][k[s[s[s[s][s]]][k]]][s[s[s[s[s][s]]][k]][k[s[s[s[\
s][s]]][k]]]]", 
     "s[s[s][s]][k[s[s[s[s][s]]][k]]][k[k[s[s[s[s][s]]][k]]]][s[s[s[s[\
s][s]]][k]][k[s[s[s[s][s]]][k]]]]", 
     "s[s][s][k[k[s[s[s[s][s]]][k]]]][k[s[s[s[s][s]]][k]][k[k[s[s[s[s]\
[s]]][k]]]]][s[s[s[s[s][s]]][k]][k[s[s[s[s][s]]][k]]]]", 
     "s[k[k[s[s[s[s][s]]][k]]]][s[k[k[s[s[s[s][s]]][k]]]]][k[s[s[s[s][\
s]]][k]][k[k[s[s[s[s][s]]][k]]]]][s[s[s[s[s][s]]][k]][k[s[s[s[s][s]]][\
k]]]]", "k[k[s[s[s[s][s]]][k]]][k[s[s[s[s][s]]][k]][k[k[s[s[s[s][s]]][\
k]]]]][s[k[k[s[s[s[s][s]]][k]]]][k[s[s[s[s][s]]][k]][k[k[s[s[s[s][s]]]\
[k]]]]]][s[s[s[s[s][s]]][k]][k[s[s[s[s][s]]][k]]]]", 
     "k[s[s[s[s][s]]][k]][s[k[k[s[s[s[s][s]]][k]]]][k[s[s[s[s][s]]][k]\
][k[k[s[s[s[s][s]]][k]]]]]][s[s[s[s[s][s]]][k]][k[s[s[s[s][s]]][k]]]]\
", "s[s[s[s][s]]][k][s[s[s[s[s][s]]][k]][k[s[s[s[s][s]]][k]]]]"}], 
   Thick, RGBColor[0.984, 0.43, 0.208]]]]

corresponding to the equivalence proof:

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Magnify[
 CombinatorEvolutionPlot[{s[s[s[s[s][s]]][k]][s[s[s[s[s][s]]][k]]][
    k[s[s[s[s][s]]][k]]], 
   s[s[s[s][s]]][k][k[s[s[s[s][s]]][k]]][
    s[s[s[s[s][s]]][k]][k[s[s[s[s][s]]][k]]]], 
   s[s[s][s]][k[s[s[s[s][s]]][k]]][k[k[s[s[s[s][s]]][k]]]][
    s[s[s[s[s][s]]][k]][k[s[s[s[s][s]]][k]]]], 
   s[s][s][k[k[s[s[s[s][s]]][k]]]][
     k[s[s[s[s][s]]][k]][k[k[s[s[s[s][s]]][k]]]]][
    s[s[s[s[s][s]]][k]][k[s[s[s[s][s]]][k]]]], 
   s[k[k[s[s[s[s][s]]][k]]]][s[k[k[s[s[s[s][s]]][k]]]]][
     k[s[s[s[s][s]]][k]][k[k[s[s[s[s][s]]][k]]]]][
    s[s[s[s[s][s]]][k]][k[s[s[s[s][s]]][k]]]], 
   k[k[s[s[s[s][s]]][k]]][
      k[s[s[s[s][s]]][k]][k[k[s[s[s[s][s]]][k]]]]][
     s[k[k[s[s[s[s][s]]][k]]]][
      k[s[s[s[s][s]]][k]][k[k[s[s[s[s][s]]][k]]]]]][
    s[s[s[s[s][s]]][k]][k[s[s[s[s][s]]][k]]]], 
   k[s[s[s[s][s]]][k]][
     s[k[k[s[s[s[s][s]]][k]]]][
      k[s[s[s[s][s]]][k]][k[k[s[s[s[s][s]]][k]]]]]][
    s[s[s[s[s][s]]][k]][k[s[s[s[s][s]]][k]]]], 
   s[s[s[s][s]]][k][s[s[s[s[s][s]]][k]][k[s[s[s[s][s]]][k]]]]}, 
  "StatesDisplay"], .8]

But with our definition, two combinator expressions can still be considered equal even if one of them can’t evolve into the other: it can just be that among the possible ancestors (or, equivalently for combinators, successors) of the expressions there’s somewhere an expression in common. (In physics terms, that their light cones somewhere overlap.)

Consider the expressions:

{s[s[s][s]][s][s[s][k]], s[s][k][s[s[s][k]]][k]}
&#10005
{s[s[s][s]][s][s[s][k]], s[s][k][s[s[s][k]]][k]}

Neither terminates, but it still turns out that there are paths of evolution for each of them that lead to the same expression:

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Magnify[
 CombinatorEvolutionPlot[
  PadRight[FindShortestPath[
      ResourceFunction[
        "MultiwayCombinator"][{s[x_][y_][z_] -> x[z][y[z]], 
        k[x_][y_] -> x}, #, 12, "StatesGraphStructure"], ToString[#], 
      "s[s[s][k]][s[s[s][k]]][s[s[s][k]]]"] & /@ {s[s[s][s]][s][
      s[s][k]], s[s][k][s[s[s][k]]][k]}, Automatic, ""], 
  "StatesDisplay",  Spacings -> 2], .9]

If we draw a combined multiway graph starting from the two initial expressions, we can see the converging paths:

grx = Graph
&#10005

But is there a more systematic way to think about relations between combinator expressions? Combinators are in a sense fundamentally computational constructs. But one can still try to connect them with traditional mathematics, and in particular with abstract algebra.

And so, for example, it’s common in the literature of combinators to talk about “combinatory algebra”, and to write an expression like

s
&#10005
s[k][s[s[k[s[s]][s]][s]][k]][k[s[k]][s[s[k[s[s]][s]][s]][k]]]

as

CombinatorPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorPlot[
 s[k][s[s[k[s[s]][s]][s]][k]][
  k[s[k]][s[s[k[s[s]][s]][s]][k]]], "CharactersLeftAssociative"]

where now one imagines that ∙ (“application”) is like an algebraic operator that “satisfies the relations”

CombinatorPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
Map[Row[{CombinatorPlot[#[[1]], "CharactersLeftAssociative"], 
     Spacer[1], "\[LongEqual]", Spacer[1], 
     CombinatorPlot[#[[2]], "CharactersLeftAssociative"]}] &, {s[x][
      y][z] == x[z][y[z]], k[x][y] == x}] /. {x -> Style[x, Italic], 
  y -> Style[y, Italic], z -> Style[z, Italic]} 

with “constants” S and K. To determine whether two combinator expressions are equal one then has to see if there’s a sequence of “algebraic” transformations that can go from one to the other. The setup is very similar to what we’ve discussed above, but the “two-way” character of the rules allows one to directly use standard equational logic theorem-proving methods (although because combinator evolution is confluent one never strictly has to use reversed rules).

So, for example, to prove s[k[s]][k[k]][k] = s[s][s][k][s][k], or

CombinatorTraditionalForm
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
Row[{CombinatorTraditionalForm[#[[1]]], Spacer[1], "\[LongEqual]", 
    Spacer[1], CombinatorTraditionalForm[#[[2]]]}] &[
 s[k[s]][k[k]][k] == s[s][s][k][s][k]]

one applies a series of transformations based on the S and K “axioms” to parts of the left- and right-hand sides to eventually reduce the original equation to a tautology:

FindCombinatorProof
&#10005

One can give the outline of this proof as a standard FindEquationalProof proof graph:

FindCombinatorProof
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Graph[
 FindCombinatorProof[s[k[s]][k[k]][k] == s[s][s][k][s][k], "SK"][
  "ProofGraph"], 
 VertexLabels -> {"Axiom 1" -> 
    "\!\(\*TemplateBox[{},\n\"CombinatorK\"]\) axiom", 
   "Axiom 2" -> "S axiom", "Hypothesis 1" -> "hypothesis", 
   "Conclusion 1" -> "tautology", 
   x_ /; (StringTake[x, 1] === "S") -> None}]

The yellowish dots correspond to the “intermediate lemmas” listed above, and the dotted lines indicate which lemmas use which axioms.
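
Although the notebook’s FindCombinatorProof packages all of this up, essentially the same statement can be posed directly to the built-in FindEquationalProof, treating application as an explicit binary operator (here called cd, just a stand-in name). This is a sketch of the setup, not necessarily what FindCombinatorProof does internally:

(* S and K axioms, with application written as the binary operator cd *)
axioms = {ForAll[{x, y, z}, cd[cd[cd[s, x], y], z] == cd[cd[x, z], cd[y, z]]], 
   ForAll[{x, y}, cd[cd[k, x], y] == x]};

(* s[k[s]][k[k]][k] == s[s][s][k][s][k], written out left-associatively *)
proof = FindEquationalProof[
   cd[cd[cd[s, cd[k, s]], cd[k, k]], k] == cd[cd[cd[cd[cd[s, s], s], k], s], k], axioms];
proof["ProofGraph"]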

One can establish a theorem like

FindCombinatorProof
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
Row[{CombinatorTraditionalForm[#[[1]]], Spacer[1], "\[LongEqual]", 
    Spacer[1], CombinatorTraditionalForm[#[[2]]]}] &[
 s[k[s]][k[k]][k] == s[s[s]][s][s][s][k]]

with a slightly more complex proof:

FindCombinatorProof
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Graph[
 FindCombinatorProof[s[k[s]][k[k]][k] == s[s[s]][s][s][s][k], "SK"][
  "ProofGraph"], 
 VertexLabels -> {"Axiom 1" -> 
    "\!\(\*TemplateBox[{},\n\"CombinatorK\"]\) axiom", 
   "Axiom 2" -> "S axiom", "Hypothesis 1" -> "hypothesis", 
   "Conclusion 1" -> "tautology", 
   x_ /; (StringTake[x, 1] === "S") -> None}]

One feature of this proof is that because the combinator rules are confluent—so that different branches in the multiway system always merge—the proof never has to involve critical pair lemmas representing equivalences between branches in the multiway system, and so can consist purely of a sequence of “substitution lemmas”.

There’s another tricky issue, though. And it has to do with taking “everyday” mathematical notions and connecting them with the precise symbolic structure that defines combinators and their evolution. As an example, let’s say you have combinators a and b. It might seem obvious that if a is to be considered equal to b, then it must follow that a[x] = b[x] for all x.

But actually saying this is true is telling us something about what we mean by “equal”, and to specify this precisely we have to add the statement as a new axiom.

In our basic setup for proving anything to do with equality (or, for that matter, any equivalence relation), we’re already assuming the basic features of equivalence relations (reflexivity, symmetry, transitivity):

Column
&#10005
Column[{Infix[f[x, x], "\[LongEqual]"], 
  Implies[Infix[f[x, y], "\[LongEqual]"], 
   Infix[f[y, x], "\[LongEqual]"]], 
  Implies[Wedge[Infix[f[x, y], "\[LongEqual]"], 
    Infix[f[y, z],  "\[LongEqual]"]], 
   Infix[f[x, z],  "\[LongEqual]"]]} ]

In order to allow us to maintain equality while doing substitutions we also need the axiom:

Implies
&#10005
Implies[Wedge[Infix[f[x, y],  "\[LongEqual]"], 
  Infix[f[z, u],  "\[LongEqual]"]], 
 Infix[f[Application[x, z], Application[y, u]],  "\[LongEqual]"]]

And now to specify that combinator expressions that are considered equal also “do the same thing” when applied to equal expressions, we need the “extensionality” axiom:

Implies
&#10005
Implies[Infix[f[x, y],  "\[LongEqual]"], 
 Infix[f[Application[x, z], Application[y, z]],  "\[LongEqual]"]]

The previous axioms all work in pure “equational logic”. But when we add the extensionality axiom we have to explicitly use full first-order logic—with the result that we get more complicated proofs, though the same basic methods apply.

Lemmas and the Structure of Combinator Space

One feature of the proofs we’ve seen above is that each intermediate lemma just involves direct use of one or other of the axioms. But in general, lemmas can use lemmas, and one can “recursively” build up a proof much more efficiently than just by always directly using the axioms.

But which lemmas are best to use? If one’s doing ordinary human mathematics—and trying to make proofs intended for human consumption—one typically wants to use “famous lemmas” that help create a human-relatable narrative. But realistically there isn’t likely to be a “human-relatable narrative” for most combinator equivalence theorems (or, at least there won’t be until or unless thinking in terms of combinators somehow becomes commonplace).

So then there’s a more “mechanical” criterion: what lemmas do best at reducing the lengths of as many proofs as much as possible? There’s some trickiness associated with translations between proofs of equalities and proofs that one expression can evolve into another. But roughly the question boils down to this. When we construct a multiway graph of combinator evolution, each event—and thus each edge—is just the application of a single combinator “axiom”.

But if instead we do transformations based on more sophisticated lemmas we can potentially get from one expression to another in fewer steps. In other words, if we “cache” certain combinator transformations, can we make finding paths in combinator multiway graphs systematically more efficient?

To find all possible “combinator theorems” from a multiway system, we should start from all possible combinator expressions, then trace all possible paths to other expressions. It’s a little like what we did in the previous section—except now we want to consider multiway evolution with all possible evaluation orders.

Here’s the complete multiway graph starting from all size-4 combinator expressions:

MultiwayCombinator
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
ResourceFunction["MultiwayCombinator"][{s[x_][y_][z_] -> x[z][y[z]], 
  k[x_][y_] -> x}, EnumerateCombinators[4], 4, "StatesGraph"]

Up to size 6, the graph is still finite (with each disconnected component in effect corresponding to a separate “fixed-point attractor”):

MultiwayCombinator
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
ResourceFunction["MultiwayCombinator"][{s[x_][y_][z_] -> x[z][y[z]], 
  k[x_][y_] -> x}, 
 EnumerateCombinators[6], 12, "StatesGraphStructure"]

For size 7 and above, it becomes infinite. Here’s the beginning of the graph for size-8 expressions involving only s:

MultiwayCombinator
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
ResourceFunction["MultiwayCombinator"][{s[x_][y_][z_] -> x[z][y[z]], 
  k[x_][y_] -> x}, 
 Groupings[Table[s, 8], Construct -> 2], 10, "StatesGraphStructure"]

If one keeps only terminating cases, one gets for size 8:

MultiwayCombinator
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; With[{n = 8}, 
 ResourceFunction["MultiwayCombinator"][{s[x_][y_][z_] -> x[z][y[z]], 
   k[x_][y_] -> x}, 
  Complement[Groupings[Table[s, n], Construct -> 2], 
   Import[CloudObject[
     StringTemplate[
       "https://www.wolframcloud.com/obj/sw-blog/Combinators/Data/S-\
NT1e4-``.wxf"][n]]]], 50, "StatesGraphStructure", ImageSize -> 260]]

And for size 9:

MultiwayCombinator
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; With[{n = 10}, 
 ResourceFunction["MultiwayCombinator"][{s[x_][y_][z_] -> x[z][y[z]], 
   k[x_][y_] -> x}, 
  Complement[Groupings[Table[s, n], Construct -> 2], 
   Import[CloudObject[
     StringTemplate[
       "https://www.wolframcloud.com/obj/sw-blog/Combinators/Data/S-\
NT1e4-``.wxf"][n]]]], 10, "StatesGraphStructure", ImageSize -> 270]]

To assess the “most useful” transformations for “finding equations” there’s more to do: not only do we need to track what leads to what, but we also need to track causal relationships. And this leads to ideas like using lemmas that have the largest number of causal edges associated with them.

But are there perhaps other ways to find relations between combinator expressions, and combinator theorems? Can we for example figure out what combinator expressions are “close to” what others? In a sense what we need is to define a “space of combinator expressions” with some appropriate notion of nearness.

One approach would just be to look at “raw distances” between trees—say based on asking how many edits have to be made to one tree to get to another. But an approach that more closely reflects actual features of combinators is to think about the concept of branchial graphs and branchial space that comes from our Physics Project.
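
Here is one crude way such a “raw distance” might be sketched (just for orientation; it is not a proper tree edit distance):

(* sketch: identical trees are at distance 0; applications are compared
   head-to-head and argument-to-argument; any other mismatch is charged the
   full size of both subtrees *)
treeDistance[a_, b_] := Which[
  a === b, 0,
  ! AtomQ[a] && ! AtomQ[b], 
   treeDistance[Head[a], Head[b]] + treeDistance[First[a], First[b]],
  True, LeafCount[a] + LeafCount[b]]

treeDistance[s[s][s], s[s[s]]]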

Consider for example the multiway graph generated from s[s[s]][s][s[s]][s] (S(SS)S(SS)S):

MultiwayCombinator
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; LayeredGraphPlot[
 ResourceFunction["MultiwayCombinator"][{s[x_][y_][z_] :> x[z][y[z]]},
   s[s[s]][s][s[s]][s], 13, "StatesGraphStructure"], AspectRatio -> 2]

Now consider a foliation of this graph (and in general there will be many possible foliations that respect the partial order defined by the multiway graph):

MultiwayCombinator
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; LayeredGraphPlot[
 ResourceFunction["MultiwayCombinator"][{s[x_][y_][z_] :> x[z][y[z]]},
   s[s[s]][s][s[s]][s], 13, "StatesGraphStructure"], 
 AspectRatio -> 2, 
 Epilog -> {ResourceFunction["WolframPhysicsProjectStyleData"][
    "BranchialGraph", "EdgeStyle"], AbsoluteThickness[1], 
   Table[Line[{{-20, i}, {5, i}}], {i, 1.5, 36, 2.6}]}]

In each slice, we can then define—as in our Physics Project—a branchial graph in which nodes are joined when they have an immediate common ancestor in the multiway graph. In the case shown here, the branchial graphs in successive slices are:

MultiwayCombinator
&#10005
Table[Framed[
  ResourceFunction[
    "MultiwayCombinator"][{s[x_][y_][z_] :> x[z][y[z]]}, 
   s[s[s]][s][s[s]][s], t, "BranchialGraphStructure", 
   ImageSize -> Tiny], FrameStyle -> LightGray], {t, 4, 13}]
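
As a sketch of the construction itself (an assumption about the bookkeeping, independent of how the MultiwayCombinator resource function does it): given the directed states graph g and a list of states forming one slice of the foliation, join two states whenever they share an in-neighbor, i.e. an immediate common ancestor:

(* sketch: branchial graph for one slice of a multiway (states) graph g *)
commonAncestorQ[g_Graph, a_, b_] := IntersectingQ[
  Cases[EdgeList[g], DirectedEdge[p_, a] :> p],
  Cases[EdgeList[g], DirectedEdge[p_, b] :> p]]

branchialFromSlice[g_Graph, slice_List] := Graph[slice, 
  UndirectedEdge @@@ Select[Subsets[slice, {2}], commonAncestorQ[g, #[[1]], #[[2]]] &]]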

If we consider a combinator expression like s[s][s][s[s]][s][s] (SSS(SS)SS) that leads to infinite growth, we can ask what the “long-term” structure of the branchial graph will be. Here are the results after 18 and 19 steps:

MultiwayCombinator
&#10005
Table[Framed[
  ResourceFunction[
    "MultiwayCombinator"][{s[x_][y_][z_] :> x[z][y[z]]}, 
   s[s][s][s[s]][s][s], t, "BranchialGraphStructure", 
   ImageSize -> 300], FrameStyle -> LightGray], {t, 18, 19}]

The largest connected components here contain respectively 1879 and 10,693 combinator expressions. But what can we say about their structure? One thing suggested by our Physics Project is to try to “fit them to continuous spaces”. And a first step in doing that is to estimate their effective dimension—which one can do by looking at the growth in the volume of a “geodesic ball” in the graph as a function of its radius:

MultiwayCombinator
&#10005
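
The measurement itself is easy to sketch for a generic graph (this is just the standard construction, not the notebook’s own code, and in practice one would presumably also average over starting vertices):

(* sketch: number of vertices within graph distance r of vertex v; if this grows
   roughly like r^d for r small compared to the diameter, d estimates the
   effective dimension *)
ballVolumes[g_Graph, v_, rmax_Integer] := 
 Table[VertexCount[NeighborhoodGraph[g, v, r]], {r, 1, rmax}]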

The result for distances small compared to the diameter of the graph is close to quadratic growth—suggesting that there is some sense in which the space of combinator expressions generated in this way may have a limiting 2D manifold structure.

It’s worth pointing out that different foliations of the multiway graph (i.e. using different “reference frames”) will lead to different branchial graphs—but presumably the (suitably defined) causal invariance of combinator evolution will lead to relativistic-like invariance properties of the branchial graphs.

Somewhat complementary to looking at foliations of the multiway graph is the idea of trying to find quantities that can be computed for combinator expressions to determine whether the combinator expressions can be equal. Can we in essence find hash codes for combinator expressions that are equal whenever the combinator expressions are equal?

In general we’ve been looking at “purely symbolic” combinator expressions—like:

CombinatorPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorPlot[
 k[k[s[k]][s[k]]][s[k][k[s[k]][s[k]]]], "CharactersLeftAssociative"]

But what if we consider S, K to have definite, say numerical, values, and ∙ to be some kind of generalized multiplication operator that combines these values? We used this kind of approach above in finding a procedure for determining whether S combinator expressions will evolve to fixed points. And in general each possible choice of “multiplication functions” (and S, K “constant values”) can be viewed in mathematical terms as setting up a “model” (in the model-theoretic sense) for the “combinatory algebra”.

As a simple example, let’s consider a finite model in which there are just 2 possible values, and the “multiplication table” for the ∙ operator is:

Grid
&#10005
Grid[MapIndexed[
  If[#2[[1]] === 1 || #2[[2]] === 1,  
    Item[Style[#1, 12, Bold, GrayLevel[.35]], 
     Background -> GrayLevel[.9]], 
    Item[Style[#1, 12], 
     Background -> 
      Blend[{Hue[0.1, 0.89, 0.984], Hue[0.16, 0.51, 0.984], Hue[
        0.04768041237113402, 0, 0.984]}, (2 - #1)/2], 
     FrameStyle -> Darker[RGBColor[0.984, 0.43, 0.208], .2]]] &, 
  Prepend[MapIndexed[Prepend[#, First[#2]] &, {{2, 1}, {2, 2}}], 
   Prepend[Range[2], "\[Application]"]], {2}], Spacings -> {.25, 0}, 
 ItemSize -> {2, 2}, Frame -> All, FrameStyle -> GrayLevel[.6], 
 BaseStyle -> "Text"]
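
The SCombinatorAutomatonTreeGeneral function used below presumably comes from Programs.wl; as a guess at the underlying idea, here is a minimal evaluator that just propagates values up the expression tree, taking s to have the value 1 and reading the effect of the application operator off the table:

(* sketch: the "value" of a combinator expression under a finite model given by a
   multiplication table m, with s assigned the value 1 *)
modelValue[m_, s] = 1;
modelValue[m_, a_[b_]] := m[[modelValue[m, a], modelValue[m, b]]]

modelValue[{{2, 1}, {2, 2}}, s[s[s]][s]]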

If we consider S combinator expressions of size 5, there are a total of 14 such expressions, in 10 equivalence classes, that evolve to different fixed points. If we now “evaluate the trees” according to our “model for ∙” we can see that within each equivalence class the value accumulated at the root of the tree is always the same, but differs between at least some of the equivalence classes:

SCombinatorAutomatonTreeGeneral
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
Framed[Row[#, Spacer[5]]] & /@ 
 Map[SCombinatorAutomatonTreeGeneral[#, 
    Application[x_, y_] :> ({{2, 1}, {2, 2}}[[x, y]]), 1, 
    VertexSize -> .6, ImageSize -> {UpTo[120], UpTo[120]}] &, 
  EquivalenceGroups[5], {2}]

If we look at larger combinator expressions this all keeps working—until we get to two particular size-10 expressions, which have the same fixed point, but different “values”:

SCombinatorAutomatonTreeGeneral
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
SCombinatorAutomatonTreeGeneral[#, 
   Application[x_, y_] :> ({{2, 1}, {2, 2}}[[x, y]]), 1, 
   VertexSize -> .6, 
   ImageSize -> {UpTo[200], UpTo[200]}] & /@ {s[s[s]][
   s[s[s]][s][s][s[s]]], s[s][s[s[s]][s[s[s]]]][s[s]]}

Allowing 3 possible values, the longest-surviving models are

Grid
&#10005
Grid[MapIndexed[
    If[#2[[1]] === 1 || #2[[2]] === 1,  
      Item[Style[#1, 12, Bold, GrayLevel[.35]], 
       Background -> GrayLevel[.9]], 
      Item[Style[#1, 12], 
       Background -> 
        Blend[{Hue[0.1, 0.89, 0.984], Hue[0.16, 0.51, 0.984], Hue[
          0.04768041237113402, 0, 0.984]}, (3 - #1)/3], 
       FrameStyle -> Darker[RGBColor[0.984, 0.43, 0.208], .2]]] &, 
    Prepend[MapIndexed[Prepend[#, First[#2]] &, #], 
     Prepend[Range[3], "\[Application]"]], {2}], Spacings -> {.25, 0},
    ItemSize -> {2, 2}, Frame -> All, FrameStyle -> GrayLevel[.6], 
   BaseStyle -> "Text"] & /@ {{{2, 3, 2}, {2, 2, 2}, {2, 2, 1}}, {{3, 
    3, 2}, {3, 1, 3}, {3, 3, 3}}}

but these both fail at size 13 (e.g. for s[s][s[s]][s[s[s[s[s]][s][s]]][s[s]]], s[s[s]][s[s[s[s[s]][s[s[s]]]]]][s[s]]).

The fact that combinator equivalence is in general undecidable means we can’t expect to find a computationally finite “valuation procedure” that will distinguish all inequivalent combinator expressions. But it’s still conceivable that we could have a scheme to distinguish some classes of combinator expressions from others—in essence through the values of a kind of “conserved quantity for combinators”.

Another approach is to consider directly “combinator axioms” like

CombinatorTraditionalForm
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
Row[{CombinatorTraditionalForm[#[[1]]], Spacer[1], "\[LongEqual]", 
       Spacer[1], 
       CombinatorTraditionalForm[#[[
         2]]]}] &[#] & /@ {CombinatorS\[Application]x\[Application]y\
\[Application]z == x\[Application]z\[Application](y\[Application]z), 
   CombinatorK\[Application]x\[Application]y == x} /. {x -> 
   Style[x, Italic], y -> Style[y, Italic], z -> Style[z, Italic]} 

and simply ask if there are models of ∙, S and K that satisfy them. Assuming a finite “multiplication table”, there’s no way to do this for K, and thus for S and K together. For S alone, however, there are already 8 2-valued models, and 285 3-valued ones.

The full story is more complicated, and has been the subject of a fair amount of academic work on combinators over the past half century. The main result is that there are models that are in principle known to exist, though they’re infinite and probably can’t be explicitly constructed.

In the case of something like arithmetic, there are formal axioms (the Peano axioms). But we know that (even though Gödel’s theorem shows that there are inevitably also other, exotic, non-standard models) there’s a model of these axioms that is the ordinary integers. And our familiarity with these and their properties makes us feel that the Peano axioms aren’t just formal axioms; they’re axioms “about” something, namely integers.

What are the combinator axioms “about”? There’s a perfectly good interpretation of them in terms of computational processes. But there doesn’t seem to be some “static” set of constructs—like the integers—that give one more insight about what combinators “really are”. Instead, it seems, combinators are in the end just through and through computational.

Empirical Computation Theory with Combinators

We’ve talked a lot here about what combinators “naturally do”. But what about getting combinators to do something specific—for example to perform a particular computation we want?

As we saw by example at the beginning of this piece, it’s not difficult to take any symbolic structure and “compile it” to combinators. Let’s say we’re given:

f
&#10005
f[y[x]][y][x]

There’s then a recursive procedure that effectively builds “function invocations” out of s’s and “stops computations” with k’s. And using this we can “compile” our symbolic expression to the (slightly complicated) combinator expression:

SKCombinatorCompile
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; SKCombinatorCompile[f[y[x]][y][x], {f, x, y}]

To “compute our original expression” we just have to take the combinator expression produced above by SKCombinatorCompile (call it c), form c[f][x][y], then apply the combinator rules and find the fixed point:

CloudGet
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Magnify[
 CombinatorEvolutionPlot[
  CombinatorFixedPointList[
   SKCombinatorCompile[f[y[x]][y][x], {f, x, y}][f][x][y]], 
  "StatesDisplay"], .5]

But is this the “best combinator way” to compute this result?

There are various different things we could mean by “best”. Smallest program? Fastest program? Most memory-efficient program? Or said in terms of combinators: Smallest combinator expression? Smallest number of rule applications? Smallest intermediate expression growth?

In computation theory one often talks theoretically about optimal programs and their characteristics. But when one’s used to studying programs “in the wild” one can start to do empirical studies of computation-theoretic questions—as I did, for example, with simple Turing machines in A New Kind of Science.

Traditional computation theory tends to focus on asymptotic results about “all possible programs”. But in empirical computation theory one’s dealing with specific programs—and in practice there’s a limit to how many one can look at. But the crucial and surprising fact that comes from studying the computational universe of “programs in the wild” is that actually even very small programs can show highly complex behavior that’s in some sense typical of all possible programs. And that means that it’s realistic to get intuition—and results—about computation-theoretic questions just by doing empirical investigations of actual, small programs.

So how does this work with combinators? An immediate question to ask is: if one wants a particular expression, what are all the possible combinator expressions that will generate it?

Let’s start with a seemingly trivial case: x[x]. With the compilation procedure we used above we get the size-7 combinator expression

SKCombinatorCompile
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; SKCombinatorCompile[x[x], {x}]

which (with leftmost-outermost evaluation) generates x[x] in 6 steps:

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorEvolutionPlot[
 CombinatorFixedPointList[
  SKCombinatorCompile[x[x], {x}][x]], "StatesDisplay"]

But what happens if we just start enumerating possible combinator expressions? Up to size 5, none compute x[x]. But at size 6, we have:

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorEvolutionPlot[
 CombinatorFixedPointList[s[s[s]][s][s[k]][x]], "StatesDisplay"]

So we can “save” one unit of program size, but at the “cost” of taking 9 steps, and having an intermediate expression of size 21.
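
Here is a self-contained sketch of the kind of enumerate-and-test search involved, using the EnumerateCombinators helper from Programs.wl together with a crude bounded evaluator based on /. (the step cap and size cutoff are arbitrary choices):

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"];
(* one round of /.-style updating; iterate it with caps on steps and expression size *)
skStep[e_] := e /. {s[a_][b_][c_] :> a[c][b[c]], k[a_][b_] :> a}
boundedEvolve[e_, tmax_, smax_] := 
 FixedPoint[If[LeafCount[#] > smax, #, skStep[#]] &, e, tmax]

(* keep the expressions whose application to x reaches the fixed point x[x];
   per the text, only s[s[s]][s][s[k]] survives at size 6 *)
Select[EnumerateCombinators[6], boundedEvolve[#[x], 50, 1000] === x[x] &]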

What if we look at size-7 programs? There are a total of 11 that work (including the one from our “compiler”):

{s
&#10005
{s[s[s[s]]][s][s[k]], s[s[s]][s[k]][s[k]], s[s][s[k]][s[s[k]]], 
 s[s[s[k]]][s[k][s]], s[s][s[k]][s[k][s]], s[s[s[k]]][s[k][k]], 
 s[s][s[k]][s[k][k]], s[s[k][s]][s[k][s]], s[s[k][s]][s[k][k]], 
 s[s[k][k]][s[k][s]], s[s[k][k]][s[k][k]]}

How do these compare in terms of “time” (i.e. number of steps) and “memory” (i.e. maximum intermediate expression size)? There are 4 distinct programs that all take the same time and memory, there are none that are faster, but there are others that are slower (the slowest taking 12 steps):

TimeMemoryList
&#10005

What happens with larger programs? Here’s a summary:

TimeMemoryList
&#10005

Here are the distributions of times (dropping outliers)—implying (as the medians above suggest) that even a randomly picked program is likely to be fairly fast:

alllambdas
&#10005

And here’s the distribution of time vs. memory on a log-log scale:

TimeMemoryList
&#10005

At size 10, the slowest and most memory-intensive program is s[s[s][k][s[s[s[s]]]]][s][k] (S(SSK(S(S(SS))))SK):

CombinatorFixedPointList
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; ListStepPlot[
 LeafCount /@ 
  CombinatorFixedPointList[s[s[s][k][s[s[s[s]]]]][s][k][1]], 
 AspectRatio -> 1/2, Frame -> True, Joined -> True, Filling -> Axis, 
 FillingStyle -> $PlotStyles["ListPlot", "FillingStyleDark"], 
 PlotStyle -> $PlotStyles["ListPlot", "PlotStyle"], ImageSize -> 295]

There are so many other questions one can ask. For example: how similar are the various fastest programs? Do they all “work the same way”? At size 7 they pretty much seem to:

alllambdas
&#10005

At size 8 there are a few “different schemes” that start to appear:

alllambdas
&#10005

Then one can start to ask questions about how these fastest programs are laid out in the kind of “combinator space” we discussed in the last section—and whether there are good incremental (“evolutionary”) ways to find these fastest programs.

Another type of question has to do with the running of our programs. In everything we’ve done so far in this section, we’ve used a definite evaluation scheme: leftmost outermost. And in using this definite scheme, we can think of ourselves as doing “deterministic combinator computation”. But we can also consider the complete multiway system of all possible updating sequences—which amounts to doing non-deterministic computation.

Here’s the multiway graph for the size-6 case we considered above, highlighting the leftmost-outermost evaluation path:

Module
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"];
Module[{g = ResourceFunction[
     "MultiwayCombinator"][{s[x_][y_][z_] -> x[z][y[z]], 
     k[x_][y_] -> x}, s[s[s]][s][s[k]][x], 12, "StatesGraphStructure",
     GraphLayout -> "LayeredDigraphEmbedding", AspectRatio -> 1], max},
 max = Max[LeafCount[ToExpression[#]] & /@ VertexList[g]];
 g = Graph[g, 
   VertexSize -> ((# -> 0.75*Sqrt[LeafCount[ToExpression[#]]/max]) & /@ 
      VertexList[g])];
 HighlightGraph[g, 
  Style[Subgraph[g, 
    ToString /@ CombinatorFixedPointList[s[s[s]][s][s[k]][x]]], Thick,
    RGBColor[0.984, 0.43, 0.208]]]]

And, yes, in this case leftmost outermost happens to follow a fastest path here. Some other possible schemes are very slow in comparison—with the maximum time being 13 and the maximum intermediate expression size being 21.

At size 7 the multiway graphs for all the leftmost-outermost-fastest programs are the same—and are very simple—among other things making it seem that in retrospect the size-6 case “only just makes it”:

alllambdas
&#10005

At size 8 there are “two ideas” among the 16 cases:

alllambdas
&#10005

At size 9 there are “5 ideas” among 80 cases:

alllambdas
&#10005

And at size 10 things are starting to get more complicated:

alllambdas
&#10005

But what if we don’t look only at leftmost-outermost-fastest programs? At size 7 here are the multiway graphs for all combinator expressions that compute x[x]:

alllambdas
&#10005

So if one operates “non-deterministically”—i.e. one can follow any path in the multiway graph, not just the one given by the leftmost-outermost evaluation scheme—can one compute the answer faster? The answer in this particular case is no.

But what about at size 8? Of the 95 programs that compute x[x], in most cases the situation is like that for size 7, and leftmost outermost gives the fastest result. But there are some wilder things that can happen.

Consider for example

s
&#10005
s[s[s[s]]][k[s[k]]][s]

Here’s the complete multiway graph in this case (with 477 nodes altogether):

MultiwayCombinator
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; 
With[{g = 
     ResourceFunction[
       "MultiwayCombinator"][{s[x_][y_][z_] -> x[z][y[z]], 
       k[x_][y_] -> x}, #[x], 16, "StatesGraphStructure", 
      GraphLayout -> "LayeredDigraphEmbedding", AspectRatio -> 1]}, 
   HighlightGraph[
    g, {Style[Subgraph[g, ToString /@ CombinatorFixedPointList[#[x]]],
       Thick, RGBColor[0.984, 0.43, 0.208]], 
     Style[Subgraph[g, FindShortestPath[g, ToString[#[x]], "x[x]"]], 
      Thick, Red]}]] &[s[s[s[s]]][k[s[k]]][s]]

Two paths are indicated: the one in orange is the leftmost-outermost evaluation—which takes 12 steps in this case. But there’s also another path, shown in red—which has length 11. Here’s a comparison:

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; Magnify[
 CombinatorEvolutionPlot[{MapIndexed[
    Row[{Text[Style[First[#2], Gray]], Spacer[6], #1}] &, 
    CombinatorFixedPointList[s[s[s[s]]][k[s[k]]][s][x]]], 
   MapIndexed[Row[{Text[Style[First[#2], Gray]], Spacer[6], #1}] &, 
    ToExpression /@ 
       With[{g = 
          ResourceFunction[
            "MultiwayCombinator"][{s[x_][y_][z_] -> x[z][y[z]], 
            k[x_][y_] -> x}, #[x], 16, "StatesGraphStructure"]}, 
        FindShortestPath[g, ToString[#[x]], "x[x]"]] &[
     s[s[s[s]]][k[s[k]]][s]]]}, "StatesDisplay"], 0.8]

To get a sense of the “amount of non-determinism” that can occur, we can look at the number of nodes in successive layers of the multiway graph—essentially the number of “parallel threads” present at each “non-deterministic step”:

MultiwayCombinator
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; ListStepPlot[
 ResourceFunction["MultiwayCombinator"][{s[x_][y_][z_] -> x[z][y[z]], 
   k[x_][y_] -> x}, s[s[s[s]]][k[s[k]]][s][x], 16, 
  "StatesCountsList"], Center, Frame -> True, Filling -> Axis, 
 FillingStyle -> $PlotStyles["ListPlot", "FillingStyleLight"], 
 PlotStyle -> $PlotStyles["ListPlot", "PlotStyle"], ImageSize -> 190]

What about size-8 programs for x[x]? There are 9 more—similar to this one—where the non-deterministic computation is one step shorter. (Sometimes—as for s[s[s]][s][s[k[k][s]]]—the multiway graph is more complicated, in this case having 1661 nodes.)

But there are some other things that happen. And a dramatic one is that there can be paths that just don’t terminate at all. s[s[s[s][s]]][s][s[k]] gives an example. Leftmost-outermost evaluation reaches a fixed point after 14 steps. But overall the multiway graph grows exponentially (already having size 24,705 after 14 steps)—yielding eventually an infinite number of infinite paths: non-deterministic threads that in a sense get “lost forever”.

So far all we’ve talked about here is the computation of the one—seemingly trivial—object x[x]. But what about computing other things? Imagine we have a combinator expression c that we apply to x to form c[x]. If, when we “evaluate” this with the combinator rules, it reaches a fixed point, we can say this is the result of the computation. But a key point is that most of the time this “result” won’t just contain x; it’ll still have “innards of the computation”—in the form of S’s and K’s—in it.

Out of all 2688 combinator expressions of size 6, 224 compute x. Only one (that we saw above) computes something more complicated: x[x]. At size 7, there are 11 programs that compute x[x], and 4 that compute x[x][x]. At size 8 the things that can be computed are:

alllambdas
&#10005

At size 9 the result is:

alllambdas
&#10005

In a sense what we’re seeing here are the expressions (or objects) of “low algorithmic information content” with respect to combinator computation: those for which the shortest combinator program that generates them is just of length 9. In addition to shortest program length, we can also ask about expressions generated within certain time or intermediate-expression-size constraints.

What about the other way around? How large a program does one need to generate a certain object? We know that x[x] can be generated with a program of size 6. It turns out x[x[x]] needs a program of size 8:

alllambdas
&#10005

Here are the shortest programs for objects of size 4:

alllambdas
&#10005

Our original “straightforward compiler” generates considerably longer programs: to get an object involving only x’s of size n it produces a program of length 4n – 1 (i.e. 15 in this case).

It’s interesting to compare the different situations here. x[x[x]][x[x]][x[x[x[x]][x[x]]]][x[x[x[x]][x[x]]]] (of size 17) can be generated by the program s[s[s]][s][s[s][s[k]]] (of size 8). But the shortest program that can generate x[x[x[x]]] (size 4) is of length 10. And what we’re seeing is that different objects can have very different levels of “algorithmic redundancy” under combinator computation.

Clearly we could go on to investigate objects that involve not just x, but also y, etc. And in general there’s lots of empirical computation theory that one can expect to do with combinators.

As one last example, one can ask how large a combinator expression is needed to “build to a certain size”, in the sense that the combinator expression evolves to a fixed point with that size. Here is the result for all sizes up to 100, both for S,K expressions, and for expressions with S alone (the dotted line is ):

skresults
&#10005

By the way, we can also ask about programs that involve only S, without K. If one wants the result of applying a combinator expression to x to involve only x, this isn’t possible if one only uses S. But as we discussed above, it’s still perfectly possible to imagine “doing a computation” only using S: one just can’t expect to have the result delivered directly on its own. Instead, one must run some kind of procedure to extract the result from a “wrapper” that contains S’s.

What about practical computations? The most obvious implementation of combinators on standard modern computer systems isn’t very efficient because it tends to involve extensive copying of expressions. But by using things like the DAG approach discussed above it’s perfectly possible to make it efficient.

What about physical systems? Is there a way to do “intrinsically combinator” computation? As I discussed above, our model of fundamental physics doesn’t quite align with combinators. But closer would be computations that can be done with molecules. Imagine a molecule with a certain structure. Now imagine that another molecule reacts with it to produce a molecule with a new structure. If the molecules were tree-like dendrimers, it’s at least conceivable that one can get something like a combinator transformation process.

I’ve been interested for decades in using ideas gleaned from exploring the computational universe to do molecular-scale computation. Combinators as such probably aren’t the best “raw material”, but understanding how computation works with combinators is likely to be helpful.

And just for fun we can imagine taking actual expressions—say from the evolution of s[s][s][s[s]][s][s]—and converting them to “molecules” just using standard chemical SMILES strings (with C in place of S):

CombinatorTraditionalForm
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"]; CombinatorTraditionalForm /@ 
 CombinatorEvolveList[s[s][s][s[s]][s][s], 10]
ToCombinatorMolecule
&#10005
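
The ToCombinatorMolecule code isn’t shown here, but as a guess at the kind of conversion involved, one could simply write each expression tree as a SMILES-style branching string with a C for every s (such strings could then be fed to Molecule):

(* sketch (an assumption, not the notebook's ToCombinatorMolecule): a combinator
   expression as a SMILES-style branching string, with C for each s *)
toSmilesString[s] = "C";
toSmilesString[a_[b_]] := toSmilesString[a] <> "(" <> toSmilesString[b] <> ")"

toSmilesString[s[s][s[s]]]  (* "C(C)(C(C))" *)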

The Future of Combinators

S and K at first seem so simple, so basic. But as we’ve seen here, there’s an immense richness to what they can do. It’s a story I’ve seen played out many times across the computational universe. But in a sense it’s particularly remarkable for combinators because they were invented so early, and they seem so very simple.

There’s little question that even a century after they were invented, combinators are still hard to get one’s head around. Perhaps if computation and computer technology had developed differently, we’d now find combinators easier to understand. Or perhaps the way our brains are made, they’re just intrinsically difficult.

In a sense what makes combinators particularly difficult is the extent to which they’re both featureless and fundamentally dynamic in their structure. When we apply the ideas of combinators in practical “human-oriented” computing—for example in the Wolfram Language—we annotate what’s going on in a variety of ways. But with the Wolfram Physics Project we now have the idea that what happens at the lowest level in the physics of our universe is something much more like “raw combinators”.

The details are different—we’re dealing with hypergraphs, not trees—but many of the concepts are remarkably similar. Yes, a universe made with combinators probably won’t have anything like space in the way we experience it. But a lot of ideas about updating processes and multiway systems are all there in combinators.

For most of their history, combinators have been treated mainly as a kind of backstop for proofs. Yes, it is possible to avoid variables, construct everything symbolically, etc. But a century after they were invented, we can now see that combinators in their own right have much to contribute.

What happens if we don’t just think about combinators in general, but actually look at what specific combinators do? What happens if we do experiments on combinators? In the past some elaborate behavior of a particular combinator expression might have just seemed like a curiosity. But now that we have the whole paradigm that I’ve developed from studying the computational universe we can see how such things fit in, and help build up a coherent story about the ways of computation.

In A New Kind of Science I looked a bit at the behavior of combinators; here I’ve done more. But there’s still vastly more to explore in the combinator universe—and many surprises yet to uncover. Doing it will both advance the general science of the computational universe, and will give us a new palette of phenomena and intuition with which to think about other computational systems.

There are things to learn for physics. There are things to learn for language design. There are things to learn about the theoretical foundations of computer science. There may also be things to learn for models of concrete systems in the natural and artificial world—and for the construction of useful technology.

As we look at different kinds of computational systems, several stand out for their minimalism. Particularly notable in the past have been cellular automata, Turing machines and string substitution systems. And now there are also the systems from our Wolfram Physics Project—that seem destined to have all sorts of implications even far beyond physics. And there are also combinators.

One can think of cellular automata, for example, as minimal systems that are intrinsically organized in space and time. The systems from our Wolfram Physics Project are minimal systems that purely capture relations between things. And combinators are in a sense minimal systems that are intrinsically about programs—and whose fundamental structure and operation revolve around the symbolic representation of programs.

What can be done with such things? How should we think about them?

Despite the passage of a century—and a substantial body of academic work—we’re still just at the beginning of understanding what can be done with combinators. There’s a rich and fertile future ahead, as we begin the second combinator century, now equipped with the ideas of symbolic computational language, the phenomena of the computational universe, and the computational character of fundamental physics.

Historical & Other Notes

I’m writing elsewhere about the origin of combinators, and about their interaction with the history of computation. But here let me make some remarks more specific to this piece.

Combinators were invented in 1920 by Moses Schönfinkel (hence the centenary), and since the late 1920s there’s been continuous academic work on them—notably over more than half a century by Haskell Curry.

A classic summary of combinators from a mathematical point of view is the book: Haskell B. Curry and Robert Feys, Combinatory Logic (1958). More recent treatments (also of lambda calculus) include: H. P. Barendregt, The Lambda Calculus (1981) and J. Roger Hindley and Jonathan P. Seldin, Lambda-Calculus and Combinators (1986).

In the combinator literature, what I call “combinator expressions” are often called “terms” (as in “term rewriting systems”). The part of the expression that gets rewritten is often called the “redex”; the parts that get left over are sometimes called the “residuals”. The fixed point to which a combinator expression evolves is often called its “normal form”, and expressions that reach fixed points are called “normalizing”.

Forms like a[b[a][c]] that I “immediately apply to arguments” are basically lambda expressions, written in Wolfram Language using Function. The procedure of “compiling” from lambda expressions to combinators is sometimes called bracket abstraction. As indicated by examples at the end of this piece, there are many possible methods for doing this.

The scheme for doing arithmetic with combinators at the beginning of this piece is based on work by Alonzo Church in the 1930s, and uses so-called “Church numerals”. The idea of encoding logic by combinators was discussed by Schönfinkel in his original paper, though the specific minimal encoding I give was something I found by explicit computational search in just the past few weeks. Note that if one uses s[k] for True and k for False (as in the rule 110 cellular automaton encoding) the minimal forms for the Boolean operators are:

ttresults
&#10005
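
One can at least sketch the verification step for such encodings: given a candidate combinator expression and a Boolean function, check all four input combinations under the encoding True = s[k], False = k, using a crude bounded /.-style evaluator with an arbitrary step cap:

(* sketch: does combinator expression c behave as the 2-input Boolean function f? *)
encode[True] = s[k]; encode[False] = k;
skStep[e_] := e /. {s[a_][b_][c_] :> a[c][b[c]], k[a_][b_] :> a}
booleanImplementsQ[c_, f_] := AllTrue[Tuples[{True, False}, 2], 
  FixedPoint[skStep, c[encode[#[[1]]]][encode[#[[2]]]], 50] === encode[f @@ #] &]

booleanImplementsQ[s[s][k], And]  (* an arbitrary candidate, just to show the call *)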

The uniqueness of the fixed point for combinators is a consequence of the Church–Rosser property for combinators from 1941. It is closely related to the causal invariance property that appears in our model of physics.

There’s been a steady stream of specific combinators defined for particular mathematical purposes. An example is the Y combinator s[s][k][s[k[s[s][s[s[s][k]]]]][k]], which has the property that for any x, Y[x] can be proved to be equivalent to x[Y[x]], and “recurses forever”. Here’s how Y[x] grows if one just runs it with leftmost-outermost evaluation (and it produces expressions of the form Nest[x, _, n] at step n² + 7n):

CombinatorEvolutionPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"];
						CombinatorEvolutionPlot[
 CombinatorEvolveList[s[s][k][s[k[s[s][s[s[s][k]]]]][k]][x], 
  100], "SizeAndMatches"]

The Y combinator was notably used by Paul Graham in 2005 to name his Y Combinator startup accelerator. And perhaps channeling the aspirations of startups the “actual” Y combinator goes through many ups and downs but (with leftmost-outermost evaluation) reaches size 1 billion (“unicorn”) after 494 steps—and after 1284 steps reaches more-dollars-than-in-the-world size: 508,107,499,710,983.

Empirical studies of the actual behavior of combinators “in the wild” have been pretty sparse. The vast majority of academic work on combinators has been done by hand, and without the overall framework of A New Kind of Science the detailed behavior of actual combinators mostly just seemed like a curiosity.

I did fairly extensive computational exploration of combinators (and in general what I called “symbolic systems”) in the 1990s for A New Kind of Science. Page 712 summarized some combinator behavior I found (with /. evaluation):

Combinator behavior found with /. evaluation

I don’t know to what extent the combinator results in A New Kind of Science were anticipated elsewhere. Longtime combinator enthusiast Henk Barendregt for example recently pointed me to a paper of his from 1976 mentioning non-termination in S combinator expressions:

Non-termination in S combinator expressions

The procedure I describe for determining the termination of S combinator expressions was invented by Johannes Waldmann at the end of the 1990s. (The detailed version that I used here came from Jörg Endrullis.)

What we call multiway systems have been studied in different ways in different fields, under different names. In the case of combinators, they are basically Böhm trees (named after Corrado Böhm).

I’ve concentrated here on the original S, K combinators; in recent livestreams, as in A New Kind of Science, I’ve also been exploring other combinator rules.

Thanks, etc.

Matthew Szudzik has helped me with combinator matters since 1998 (and has given a lecture on combinators almost every year for the past 18 years at our Wolfram Summer School). Roman Maeder did a demo implementation of combinators in Mathematica in 1988, and has now added CombinatorS etc. to Version 12.2 of Wolfram Language.

I’ve had specific help on this piece from Jonathan Gorard, Jose Martin-Garcia, Eric Paul, Ed Pegg, Max Piskunov, and particularly Mano Namuduri, as well as Jeremy Davis, Sushma Kini, Amy Simpson and Jessica Wong. We’ve had recent interactions about combinators with a four-academic-generation sequence of combinator researchers: Henk Barendregt, Jan Willem Klop, Jörg Endrullis and Roy Overbeek.

Combinators and the Story of Computation


The Abstract Representation of Things

“In principle you could use combinators,” some footnote might say. But the implication tends to be “But you probably don’t want to.” And, yes, combinators are deeply abstract—and in many ways hard to understand. But tracing their history over the hundred years since they were invented, I’ve come to realize just how critical they’ve actually been to the development of our modern conception of computation—and indeed my own contributions to it.

The idea of representing things in a formal, symbolic way has a long history. In antiquity there was Aristotle’s logic and Euclid’s geometry. By the 1400s there was algebra, and in the 1840s Boolean algebra. Each of these was a formal system that allowed one to make deductions purely within the system. But each, in a sense, ultimately viewed itself as being set up to model something specific. Logic was for modeling the structure of arguments, Euclid’s geometry the properties of space, algebra the properties of numbers; Boolean algebra aspired to model the “laws of thought”.

But was there perhaps some more general and fundamental infrastructure: some kind of abstract system that could ultimately model or represent anything? Today we understand that’s what computation is. And it’s becoming clear that the modern conception of computation is one of the single most powerful ideas in all of intellectual history—whose implications are only just beginning to unfold.

But how did we finally get to it? Combinators had an important role to play, woven into a complex tapestry of ideas stretching across more than a century.

The main part of the story begins in the 1800s. Through the course of the 1700s and 1800s mathematics had developed a more and more elaborate formal structure that seemed to be reaching ever further. But what really was mathematics? Was it a formal way of describing the world, or was it something else—perhaps something that could exist without any reference to the world?

Developments like non-Euclidean geometry, group theory and transfinite numbers made it seem as if meaningful mathematics could indeed be done just by positing abstract axioms from scratch and then following a process of deduction. But could all of mathematics actually just be a story of deduction, perhaps even ultimately derivable from something seemingly lower level—like logic?

But if so, what would things like numbers and arithmetic be? Somehow they would have to be “constructed out of pure logic”. Today we would recognize these efforts as “writing programs” for numbers and arithmetic in a “machine code” based on certain “instructions of logic”. But back then, everything about this and the ideas around it had to be invented.

What Is Mathematics—and Logic—Made Of?

Before one could really dig into the idea of “building mathematics from logic” one had to have ways to “write mathematics” and “write logic”. At first, everything was just words and ordinary language. But by the end of the 1600s mathematical notation like +, =, > had been established. For a while new concepts—like Boolean algebra—tended to just piggyback on existing notation. By the end of the 1800s, however, there was a clear need to extend and generalize how one wrote mathematics.

In addition to algebraic variables like x, there was the notion of symbolic functions f, as in f(x). In logic, there had long been the idea of letters (p, q, …) standing for propositions (“it is raining now”). But now there needed to be notation for quantifiers (“for all x such-and-such”, or “there exists x such that…”). In addition, in analogy to symbolic functions in mathematics, there were symbolic logical predicates: not just explicit statements like x > y but also ones like p(x, y) for symbolic p.

The first full effort to set up the necessary notation and come up with an actual scheme for constructing arithmetic from logic was Gottlob Frege’s 1879 Begriffsschrift (“concept script”):

Frege’s Begriffsschrift—click to enlarge Frege’s Begriffsschrift—click to enlarge

And, yes, it was not so easy to read, or to typeset—and at first it didn’t make much of an impression. But the notation got more streamlined with Giuseppe Peano’s Formulario project in the 1890s—which wasn’t so concerned with starting from logic as starting from some specified set of axioms (the “Peano axioms”):

Peano's Formulario project—click to enlarge Peano's Formulario project—click to enlarge

And then in 1910 Alfred Whitehead and Bertrand Russell began publishing their 2000-page Principia Mathematica—which pretty much by its sheer weight and ambition (and notwithstanding what I would today consider grotesque errors of language design)—popularized the possibility of building up “the complexity of mathematics” from “the simplicity of logic”:

Whitehead and Russell’s Principia Mathematica—click to enlarge

It was one thing to try to represent the content of mathematics, but there was also the question of representing the infrastructure and processes of mathematics. Let’s say one picks some axioms. How can one know if they’re consistent? What’s involved in proving everything one can prove from them?

In the 1890s David Hilbert began to develop ideas about this, particularly in the context of tightening up the formalism of Euclid’s geometry and its axioms. And after Principia Mathematica, Hilbert turned more seriously to the use of logic-based ideas to develop “metamathematics”—notably leading to the formulation of things like the “decision problem” (Entscheidungsproblem) of asking whether, given an axiom system, there’s a definite procedure to prove or disprove any statement with respect to it.

But while connections between logic and mathematics were of great interest to people concerned with the philosophy of mathematics, a more obviously mathematical development was universal algebra—in which axioms for different areas of mathematics were specified just by giving appropriate algebraic-like relations. (As it happens, universal algebra was launched under that name by the 1898 book A Treatise on Universal Algebra by Alfred Whitehead, later of Principia Mathematica fame.)

But there was one area where ideas about algebra and logic intersected: the tightening up of Boolean algebra, and in particular the finding of simpler foundations for it. Logic had pretty much always been formulated in terms of And, Or and Not. But in 1912 Henry Sheffer—attempting to simplify Principia Mathematica—showed that just Nand (or Nor) were sufficient. (It turned out that Charles Peirce had already noted the same thing in the 1880s.)

So that established that the notation of logic could be made basically as simple as one could imagine. But what about its actual structure, and axioms? Sheffer talked about needing five "algebra-style" axioms. But by going to axioms based on logical inferences Jean Nicod managed in 1917 to get it down to just one axiom. (And, as it happens, I finally finished the job in 2000 by finding the very simplest "algebra-style" axioms for logic—the single axiom: ((p·q)·r)·(p·((p·r)·p)) = r.)

The big question had in a sense been “What is mathematics ultimately made of?”. Well, now it was known that ordinary propositional logic could be built up from very simple elements. So what about the other things used in mathematics—like functions and predicates? Was there a simple way of building these up too?

People like Frege, Whitehead and Russell had all been concerned with constructing specific things—like sets or numbers—that would have immediate mathematical meaning. But Hilbert’s work in the late 1910s began to highlight the idea of looking instead at metamathematics and the “mechanism of mathematics”—and in effect at how the pure symbolic infrastructure of mathematics fits together (through proofs, etc.), independent of any immediate “external” mathematical meaning.

Much as Aristotle and subsequent logicians had used (propositional) logic to define a “symbolic structure” for arguments, independent of their subject matter, so too did Hilbert’s program imagine a general “symbolic structure” for mathematics, independent of particular mathematical subject matter.

And this is what finally set the stage for the invention of combinators.

Combinators Arrive

We don’t know how long it took Moses Schönfinkel to come up with combinators. From what we know of his personal history, it could have been as long as a decade. But it could also have been as short as a few weeks.

There’s no advanced math or advanced logic involved in defining combinators. But to drill through the layers of technical detail of mathematical logic to realize that it’s even conceivable that everything can be defined in terms of them is a supreme achievement of a kind of abstract reductionism.

There is much we don’t know about Schönfinkel as a person. But the 11-page paper he wrote on the basis of his December 7, 1920, talk in which he introduced combinators is extremely clear.

The paper is entitled “On the Building Blocks of Mathematical Logic” (in the original German, “Über die Bausteine der mathematischen Logik”.) In other words, its goal is to talk about “atoms” from which mathematical logic can be built. Schönfinkel explains that it’s “in the spirit of” Hilbert’s axiomatic method to build everything from as few notions as possible; then he says that what he wants to do is to “seek out those notions from which we shall best be able to construct all other notions of the branch of science in question”.

His first step is to explain that Hilbert, Whitehead, Russell and Frege all set up mathematical logic in terms of standard And, Or, Not, etc. connectives—but that Sheffer had recently been able to show that just a single connective (indicated by a stroke “|”—and what we would now call Nand) was sufficient:

Schönfinkel’s “Über die Bausteine der mathematischen Logik”—click to enlarge

But in addition to the “content” of these relations, I think Schönfinkel was trying to communicate by example something else: that all these logical connectives can ultimately be thought of just as examples of “abstract symbolic structures” with a certain “function of arguments” (i.e. f[x,y]) form.

The next couple of paragraphs talk about how the quantifiers “for all” (∀) and “there exists” (∃) can also be simplified in terms of the Sheffer stroke (i.e. Nand). But then comes the rallying cry: “The successes that we have encountered thus far… encourage us to attempt further progress.” And then he’s ready for the big idea—which he explains “at first glance certainly appears extremely bold”. He proposes to “eliminate by suitable reduction the remaining fundamental concepts of proposition, function and variable”.

He explains that this only makes sense for “arbitrary, logically general propositions”, or, as we’d say now, for purely symbolic constructs without specific meanings yet assigned. In other words, his goal is to create a general framework for operating on arbitrary symbolic expressions independent of their interpretation.

He explains that this is valuable both from a “methodological point of view” in achieving “the greatest possible conceptual uniformity”, but also from a certain philosophical or perhaps aesthetic point of view.

And in a sense what he was explaining—back in 1920—was something that’s been a core part of the computational language design that I’ve done for the past 40 years: that everything can be represented as a symbolic expression, and that there’s tremendous value to this kind of uniformity.

But as a “language designer” Schönfinkel was an ultimate minimalist. He wanted to get rid of as many notions as possible—and in particular he didn’t want variables, which he explained were “nothing but tokens that characterize certain argument places and operators as belonging together”; “mere auxiliary notions”.

Today we have all sorts of mathematical notation that’s at least somewhat “variable free” (think coordinate-free notation, category theory, etc.) But in 1920 mathematics as it was written was full of variables. And it needed a serious idea to see how to get rid of them. And that’s where Schönfinkel starts to go “even more symbolic”.

He explains that he’s going to make a kind of “functional calculus” (Funktionalkalkül). He says that normally functions just define a certain correspondence between the domain of their arguments, and the domain of their values. But he says he’s going to generalize that—and allow (“disembodied”) functions to appear as arguments and values of functions. In other words, he’s inventing what we’d now call higher-order functions, where functions can operate “symbolically” on other functions.

In the context of traditional calculus-and-algebra-style mathematics it’s a bizarre idea. But really it’s an idea about computation and computational structures—that’s more abstract and ultimately much more general than the mathematical objectives that inspired it.

But back to Schönfinkel’s paper. His next step is to explain that once functions can have other functions as arguments, functions only ever need to take a single argument. In modern (Wolfram Language) notation he says that you never need f[x,y]; you can always do everything with f[x][y].
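The same idea is easy to see in today's Wolfram Language (a small illustrative example, not anything of Schönfinkel's): a "curried" function consumes its arguments one at a time, and a partial application is itself just another function:

plus = Function[x, Function[y, x + y]];   (* a curried two-argument "plus" *)

plus[3][4]   (* 7 *)
plus[3]      (* Function[y, 3 + y] : a function still waiting for its second argument *)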

In something of a sleight of hand, he sets up his notation so that fxyz (which might look like a function of three arguments f[x,y,z]) actually means (((fx)y)z) (i.e. f[x][y][z]). (In other words—somewhat confusingly with respect to modern standard functional notation—he takes function application to be left associative.)

Again, it’s a bizarre idea—though actually Frege had had a similar idea many years earlier (and now the idea is usually called currying, after Haskell Curry, who we’ll be talking about later). But with his “functional calculus” set up, and all functions needing to take only one argument, Schönfinkel is ready for his big result.

He’s effectively going to argue that by combining a small set of particular functions he can construct any possible symbolic function—or at least anything needed for predicate logic. He calls them a “sequence of particular functions of a very general nature”. Initially there are five of them: the identity function (Identitätsfunktion) I, the constancy function (Konstanzfunktion) C (which we now call K), the interchange function (Vertauschungsfunktion) T, the composition function (Zusammensetzungsfunktion) Z, and the fusion function (Verschmelzungsfunktion) S.

Schönfinkel’s “Über die Bausteine der mathematischen Logik”—click to enlarge

And then he’s off and running defining what we now call combinators. The definitions look simple and direct. But to get to them Schönfinkel effectively had to cut away all sorts of conceptual baggage that had come with the historical development of logic and mathematics.

Even talking about the identity combinator isn't completely straightforward. Schönfinkel carefully explains that in I x = x, equality is direct symbolic or structural equality, or as he puts it "the equal sign is not to be taken to represent logical equivalence as it is ordinarily defined in the propositional calculus of logic but signifies that the expressions on the left and on the right mean the same thing, that is, that the function value Ix is always the same as the argument value x, whatever we may substitute for x." He then adds parenthetically, "Thus, for instance, I I would be equal to I". And, yes, to someone used to the mathematical idea that a function takes values like numbers, and gives back numbers, this is a bit mind-blowing.

Next he explains the constancy combinator, that he called C (even though the German word for it starts with K), and that we now call K. He says "let us assume that the argument value is again arbitrary without restriction, while, regardless of what this value is, the function value will always be the fixed value a". And when he says "arbitrary" he really means it: it's not just a number or something; it's what we would now think of as any symbolic expression.

First he writes (C a)y = a, i.e. the value of the “constancy function C a operating on any y is a”, then he says to “let a be variable too”, and defines (C x)y = x or Cxy = x. Helpfully, almost as if he were writing computer documentation, he adds: “In practical applications C serves to permit the introduction of a quantity x as a ‘blind’ variable.”

Then he’s on to T. In modern notation the definition is T[f][x][y] = f[y][x] (i.e. T is essentially ReverseApplied). (He wrote the definition as (Tϕ)xy = ϕyx, explaining that the parentheses can be omitted.) He justifies the idea of T by saying that “The function T makes it possible to alter the order of the terms of an expression, and in this way it compensates to a certain extent for the lack of a commutative law.”

Next comes the composition combinator Z. He explains that "In [mathematical] analysis, as is well known, we speak loosely of a 'function of a function'...", by which he meant that it was pretty common then (and now) to write something like f(g(x)). But then he "went symbolic"—and defined a composition function that could symbolically act on any two functions f and g: Z[f][g][x] = f[g[x]]. He explains that Z allows one to "shift parentheses" in an expression: i.e. whatever the objects in an expression might be, Z allows one to transform something of the form x[y][z] into the form x[y[z]], etc. But in case this might have seemed too abstract and symbolic, he then attempted to explain in a more "algebraic" way that the effect of Z is "somewhat like that of the associative law" (though, he added, the actual associative law is not satisfied).

Finally comes the pièce de résistance: the S combinator (that Schönfinkel calls the “fusion function”):

Schönfinkel’s “Über die Bausteine der mathematischen Logik”—click to enlarge

He doesn’t take too long to define it. He basically says: consider (fx)(gx) (i.e. f[x][g[x]]). This is really just “a function of x”. But what function? It’s not a composition of f and g; he calls it a “fusion”, and he defines the S combinator to create it: S[f][g][x] = f[x][g[x]].

It's pretty clear Schönfinkel knew this kind of "symbolic gymnastics" would be hard for people to understand. He continues: "It will be advisable to make this function more intelligible by means of a practical example." He says to take fxy (i.e. f[x][y]) to be logₓy (i.e. Log[x,y]), and gz (i.e. g[z]) to be 1 + z. Then Sfgx = (fx)(gx) = logₓ(1 + x) (i.e. S[f][g][x]=f[x][g[x]]=Log[x,1+x]). And, OK, it's not obvious why one would want to do that, and I'm not rushing to make S a built-in function in the Wolfram Language.
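Putting Schönfinkel's five functions into explicit Wolfram Language rules (the lowercase names and this particular rule formulation are just one natural rendering) makes it easy to check his worked example for S:

combinatorRules = {
   i[x_] :> x,                    (* identity function I *)
   k[x_][y_] :> x,                (* constancy function, Schönfinkel's C *)
   t[fn_][x_][y_] :> fn[y][x],    (* interchange function T *)
   z[fn_][g_][x_] :> fn[g[x]],    (* composition function Z *)
   s[fn_][g_][x_] :> fn[x][g[x]]  (* fusion function S *)
   };

logf = Function[u, Function[v, Log[u, v]]];   (* fxy = log_x y *)
plus1 = Function[w, 1 + w];                   (* gz = 1 + z *)

s[logf][plus1][a] //. combinatorRules   (* -> Log[a, 1 + a] *)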

But Schönfinkel explains that for him “the practical use of the function S will be to enable us to reduce the number of occurrences of a variable—and to some extent also of a particular function—from several to a single one”.

Setting up everything in terms of five basic objects I, C (now K), T, Z and S might already seem impressive and minimalist enough. But Schönfinkel realized that he could go even further:

Schönfinkel’s “Über die Bausteine der mathematischen Logik”—click to enlarge
Schönfinkel’s “Über die Bausteine der mathematischen Logik”—click to enlarge

First, he says that actually I = SCC (or, in modern notation, s[k][k]). In other words, s[k][k][x] for symbolic x is just equal to x (since s[k][k][x] becomes k[x][k[x]] by using the definition of S, and this becomes x by using the definition of C). He notes that this particular reduction was communicated to him by a certain Alfred Boskowitz (who we know to have been a student at the time); he says that Paul Bernays (who was more of a colleague) had “some time before” noted that I = (SC)(CC) (i.e. s[k][k[k]]). Today, of course, we can use a computer to just enumerate all possible combinator expressions of a particular size, and find what the smallest reduction is. But in Schönfinkel’s day, it would have been more like solving a puzzle by hand.

Schönfinkel goes on, and proves that Z can also be reduced: Z = S(CS)C (i.e. s[k[s]][k]). And, yes, a very simple Wolfram Language program can verify in a few milliseconds that that is the simplest form.

OK, what about T? Schönfinkel gives 8 steps of reduction to prove that T = S(ZZS)(CC) (i.e. s[s[k[s]][k][s[k[s]][k]][s]][k[k]]). But is this the simplest possible form for T? Well, no. But (with the very straightforward 2-line Wolfram Language program I wrote) it did take my modern computer a number of minutes to determine what the simplest form is.
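The kind of search involved is easy to sketch (this is an illustrative reconstruction with names and cutoffs of my own, not the actual 2-line program referred to above): enumerate every S, K expression with a given number of leaves, apply each one to three symbolic arguments, and keep those that reduce to what T would give:

skRules = {s[x_][y_][z_] :> x[z][y[z]], k[x_][y_] :> x};

exprs[1] = {s, k};
exprs[n_] := exprs[n] = Flatten@Table[a[b], {m, 1, n - 1}, {a, exprs[m]}, {b, exprs[n - m]}];

(* reduce with a crude step and size cutoff, standing in for a real termination test *)
reduce[e_] := FixedPoint[If[LeafCount[#] > 5000, #, # /. skRules] &, e, 30];

tLike[e_] := reduce[e[p][q][r]] === p[r][q];

Select[exprs[9], tLike]   (* the six size-9 forms that behave like T; brute force, so it takes a while *)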

The answer is that it doesn't have size 12, like Schönfinkel's, but rather size 9. Actually, there are 6 cases of size 9 that all work: s[s[k[s]][s[k[k]][s]]][k[k]] (S(S(KS)(S(KK)S))(KK)) and five others. And, yes, it takes a few steps of reduction to prove that they work (the other size-9 cases S(SSK(K(SS(KK))))S, S(S(K(S(KS)K))S)(KK), S(K(S(S(KS)K)(KK)))S, S(K(SS(KK)))(S(KK)S), S(K(S(K(SS(KK)))K))S all have more complicated reductions):

CloudGet["https://www.wolframcloud.com/obj/sw-blog/Combinators/Programs.wl"];
CombinatorEvolutionPlot[
 CombinatorFixedPointList[s[s[k[s]][s[k[k]][s]]][k[k]][f][g][x]],
 "StatesDisplay"]

But, OK, what did Schönfinkel want to do with these objects he’d constructed? As the title of his paper suggests, he wanted to use them as building blocks for mathematical logic. He begins: “Let us now apply our results to a special case, that of the calculus of logic in which the basic elements are individuals and the functions are propositional functions.” I consider this sentence significant. Schönfinkel didn’t have a way to express it (the concept of universal computation hadn’t been invented yet), but he seems to have realized that what he’d done was quite general, and went even beyond being able to represent a particular kind of logic.

Still, he went on to give his example. He'd explained at the beginning of the paper that the quantifiers we now call ∀ and ∃ could both be represented in terms of a kind of "quantified Nand" that he wrote as a Sheffer stroke carrying the quantified variable:

Schönfinkel’s “Über die Bausteine der mathematischen Logik”—click to enlarge

But now he wanted to "combinator-ify" everything. So he introduced a new combinator U, and defined it to represent his "quantified Nand": Ufg = fx |ˣ gx (he called U the "incompatibility function"—an interesting linguistic description of Nand):

Schönfinkel’s “Über die Bausteine der mathematischen Logik”—click to enlarge Schönfinkel’s “Über die Bausteine der mathematischen Logik”—click to enlarge

“It is a remarkable fact”, he says, “that every formula of logic can now be expressed by means... solely of C, S and U.” So he’s saying that any expression from mathematical logic can be written out as some combinator expression in terms of S, C (now K) and U. He says that when there are quantifiers like “for all x...” it’s always possible to use combinators to get rid of the “bound variables” x, etc. He says that he “will not give the complete demonstration here”, but rather content himself with an example. (Unfortunately—for reasons of the trajectory of his life that are still quite unclear—he never published his “complete demonstration”.)

But, OK, so what had he achieved? He’d basically shown that any expression that might appear in predicate logic (with logical connectives, quantifiers, variables, etc.) could be reduced to an expression purely in terms of the combinators S, C (now K) and U.

Did he need the U? Not really. But he had to have some way to represent the thing with mathematical or logical "meaning" on which his combinators would be acting. Today the obvious thing to do would be to have a representation for true and false. And what's more, to represent these purely in terms of combinators. For example, if we took K to represent true, and SK (s[k]) to represent false, then Or can be represented as SSK (s[s][k]), And as S(SS)S(SK) (s[s[s]][s][s[k]]) and Nand as S(S(S(SS(K(K(KK)))))(KS)) (s[s[s[s[s][k[k[k[k]]]]]][k[s]]]). Schönfinkel got amazingly far in reducing everything to his "building blocks". But, yes, he missed this final step.
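One can at least verify directly that this choice of truth values does what is needed, with K selecting the first of two further arguments and SK the second (the rules below are just the usual S, K rules in Wolfram Language form; the specific operator forms quoted above are not rechecked here):

skRules = {s[x_][y_][z_] :> x[z][y[z]], k[x_][y_] :> x};

k[p][q] //. skRules      (* "true"  selects the first argument:  p *)
s[k][p][q] //. skRules   (* "false" selects the second argument: q *)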

But given that he’d managed to reduce everything to S, C and U he figured he should try to go further. So he considered an object J that would be a single building block of S and C: JJ = S and J(JJ) = C.

Schönfinkel’s “Über die Bausteine der mathematischen Logik”—click to enlarge

With S and K one can just point to any piece of an expression and see if it reduces. With J it's a bit more complicated. In modern Wolfram Language terms one can state the rules as {j[j][x_][y_][z_] → x[z][y[z]], j[j[j]][x_][y_] → x} (where order matters) but to apply these requires pattern matching "clusters of J's" rather than just looking at single S's and K's at a time.
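A quick check of those rules (stated exactly as above, with ReplaceRepeated doing the rewriting) confirms that clusters of J's reproduce S and C:

jRules = {j[j][x_][y_][z_] :> x[z][y[z]], j[j[j]][x_][y_] :> x};

j[j][a][b][c] //. jRules   (* behaves like S : a[c][b[c]] *)
j[j[j]][a][b] //. jRules   (* behaves like C (i.e. K) : a *)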

But even though—as Schönfinkel observed—this “final reduction” to J didn’t work out, getting everything down to S and K was already amazing. At the beginning of the paper, Schönfinkel had described his objectives. And then he says “It seems to me remarkable in the extreme that the goal we have just set can be realized also; as it happens, it can be done by a reduction to three fundamental signs.” (The paper does say three fundamental signs, presumably counting U as well as S and K.)

I’m sure Schönfinkel expected that to reproduce all the richness of mathematical logic he’d need quite an elaborate set of building blocks. And certainly people like Frege, Whitehead and Russell had used what were eventually very complicated setups. Schönfinkel managed to cut through all the complexity to show that simple building blocks were all that were needed. But then he found something else: that actually just two building blocks (S and K) were enough.

In modern terms, we’d say that Schönfinkel managed to construct a system capable of universal computation. And that’s amazing in itself. But even more amazing is that he found he could do it with such a simple setup.

I’m sure Schönfinkel was extremely surprised. And here I personally feel a certain commonality with him. Because in my own explorations of the computational universe, what I’ve found over and over again is that it takes only remarkably simple systems to be capable of highly complex behavior—and of universal computation. And even after exploring the computational universe for four decades, I’m still continually surprised at just how simple the systems can be.

For me, this has turned into a general principle—the Principle of Computational Equivalence—and a whole conceptual framework around it. Schönfinkel didn't have anything like that to think in terms of. But he was in a sense a good enough scientist that he still managed to discover what he discovered—which, many decades later, we can see fits in as another piece of evidence for the Principle of Computational Equivalence.

Looking at Schönfinkel’s paper a century later, it’s remarkable not only for what it discovers, but also for the clarity and simplicity with which it is presented. A little of the notation is now dated (and of course the original paper is written in German, which is no longer the kind of leading language of scholarship it once was). But for the most part, the paper still seems perfectly modern. Except, of course, that now it could be couched in terms of symbolic expressions and computation, rather than mathematical logic.

What Is Their Mathematics?

Combinators are hard to understand, and it's not clear how many people understood them when they were first introduced—let alone understood their implications. It's not a good sign that when Schönfinkel's paper appeared in 1924 the person who helped prepare it for final publication (Heinrich Behmann) added his own three paragraphs at the end that were quite confused. And Schönfinkel's sole other published paper—coauthored with Paul Bernays in 1927—didn't even mention combinators, even though they could have very profitably been used to discuss the subject at hand (decision problems in mathematical logic).

But in 1927 combinators (if not perhaps Schönfinkel’s recognition for them) had a remarkable piece of good fortune. Schönfinkel’s paper was discovered by a certain Haskell Curry—who would then devote more than 50 years to studying what he named “combinators”, and to spreading the word about them.

At some level I think one can view the main thrust of what Curry and his disciples did with combinators as an effort to “mathematicize” them. Schönfinkel had presented combinators in a rather straightforward “structural” way. But what was the mathematical interpretation of what he did, and of how combinators work in general? What mathematical formalism could capture Schönfinkel’s structural idea of substitution? Just what, for example, was the true notion of equality for combinators?

In the end, combinators are fundamentally computational constructs, full of all the phenomena of “unbridled computation”—like undecidability and computational irreducibility. And it’s inevitable that mathematics as normally conceived can only go so far in “cracking” them.

But back in the 1920s and 1930s the concept and power of computation was not yet understood, and it was assumed that the ideas and tools of mathematics would be the ones to use in analyzing a formal system like combinators. And it wasn’t that mathematical methods got absolutely nowhere with combinators.

Unlike cellular automata, or even Turing machines, there’s a certain immediate structural complexity to combinators, with their elaborate tree structures, equivalences and so on. And so there was progress to be made—and years of work to be done—in untangling this, without having to face the raw features of full-scale computation, like computational irreducibility.

In the end, combinators are full of computational irreducibility. But they also have layers of computational reducibility, some of which are aligned with the kinds of things mathematics and mathematical logic have been set up to handle. And in this there’s a curious resonance with our recent Physics Project.

In our models based on hypergraph rewriting there’s also a kind of bedrock of computational irreducibility. But as with combinators, there’s a certain immediate structural complexity to what our models do. And there are layers of computational reducibility associated with this. But the remarkable thing with our models is that some of those layers—and the formalisms one can build to understand them—have an immediate interpretation: they are basically the core theories of twentieth-century physics, namely general relativity and quantum mechanics.

Combinators work sufficiently differently that they don’t immediately align with that kind of interpretation. But it’s still true that one of the important properties discovered in combinators (namely confluence, related to our idea of causal invariance) turns out to be crucial to our models, their correspondence with physics, and in the end our whole ability to perceive regularity in the universe, even in the face of computational irreducibility.

But let’s get back to the story of combinators as it played out after Schönfinkel’s paper. Schönfinkel had basically set things up in a novel, very direct, structural way. But Curry wanted to connect with more traditional ideas in mathematical logic, and mathematics in general. And after a first paper (published in 1929) which pretty much just recorded his first thoughts, and his efforts to understand what Schönfinkel had done, Curry was by 1930 starting to do things like formulate axioms for combinators, and hoping to prove general theorems about mathematical properties like equality.

Without the understanding of universal computation and their relationship to it, it wasn’t clear yet how complicated it might ultimately be to deal with combinators. And Curry pushed forward, publishing more papers and trying to do things like define set theory using his axioms for combinators. But in 1934 disaster struck. It wasn’t something about computation or undecidability; instead it was that Stephen Kleene and J. Barkley Rosser showed the axioms Curry had come up with to try and “tighten up Schönfinkel” were just plain inconsistent.

To Kleene and Rosser it provided more evidence of the need for Russell’s (originally quite hacky) idea of types—and led them to more complicated axiom systems, and away from combinators. But Curry was undeterred. He revised his axiom system and continued—ultimately for many decades—to see what could be proved about combinators and things like them using mathematical methods.

But already at the beginning of the 1930s there were bigger things afoot around mathematical logic—which would soon intersect with combinators.

Gödel’s Theorem and Computability

How should one represent the fundamental constructs of mathematics? Back in the 1920s nobody thought seriously about using combinators. And instead there were basically three “big brands”: Principia Mathematica, set theory and Hilbert’s program. Relations were being found, details were being filled in, and issues were being found. But there was a general sense that progress was being made.

Quite where the boundaries might lie wasn’t clear. For example, could one specify a way to “construct any function” from lower-level primitives? The basic idea of recursion was very old (think: Fibonacci). But by the early 1920s there was a fairly well-formalized notion of “primitive recursion” in which functions always found their values from earlier values. But could all “mathematical” functions be constructed this way?

By 1926 it was known that this wouldn’t work: the Ackermann function was a reasonable “mathematical” function, but it wasn’t primitive recursive. It meant that definitions had to be generalized (e.g. to “general recursive functions” that didn’t just look back at earlier values, but could “look forward until...” as well). But there didn’t seem to be any fundamental problem with the idea that mathematics could just “mechanistically” be built out forever from appropriate primitives.
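For reference, here is the Ackermann function in its now-standard (Ackermann–Péter) two-argument form, written with general recursion in Wolfram Language; every value is perfectly computable, but the function grows too fast to be primitive recursive:

ack[0, n_] := n + 1;
ack[m_, 0] := ack[m - 1, 1];
ack[m_, n_] := ack[m - 1, ack[m, n - 1]];

ack[2, 3]   (* 9 *)
ack[3, 3]   (* 61 *)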

But in 1931 came Gödel’s theorem. There’d been a long tradition of identifying paradoxes and inconsistencies, and finding ways to patch them by changing axioms. But Gödel’s theorem was based on Peano’s by-then-standard axioms for arithmetic (branded by Gödel as a fragment of Principia Mathematica). And it showed there was a fundamental problem.

In essence, Gödel took the paradoxical statement “this statement is unprovable” and showed that it could be expressed purely as a statement of arithmetic—roughly a statement about the existence of solutions to appropriate integer equations. And basically what Gödel had to do to achieve this was to create a “compiler” capable of compiling things like “this statement is unprovable” into arithmetic.
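As a toy illustration of this kind of "arithmetization" (a simple prime-power packing chosen for this example, far cruder than Gödel's actual numbering), one can fold a sequence of symbol codes into a single integer and then recover it:

encode[codes_List] := Times @@ (Prime[Range[Length[codes]]]^codes);
decode[n_Integer, len_Integer] := Table[IntegerExponent[n, Prime[i]], {i, len}];

encode[{3, 1, 4, 1}]   (* 2^3 3^1 5^4 7^1 = 105000 *)
decode[105000, 4]      (* {3, 1, 4, 1} *)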

In his paper one can basically see him building up different capabilities (e.g. representing arbitrary expressions as numbers through Gödel numbering, checking conditions using general recursion, etc.)—eventually getting to a “high enough level” to represent the statement he wanted:

Gödel’s “On Undecidable Propositions of Principia Mathematica and Related Systems”—click to enlarge Gödel’s “On Undecidable Propositions of Principia Mathematica and Related Systems”—click to enlarge

What did Gödel’s theorem mean? For the foundations of mathematics it meant that the idea of mechanically proving “all true theorems of mathematics” wasn’t going to work. Because it showed that there was at least one statement that by its own admission couldn’t be proved, but was still a “statement about arithmetic”, in the sense that it could be “compiled into arithmetic”.

That was a big deal for the foundations of mathematics. But actually there was something much more significant about Gödel’s theorem, even though it wasn’t recognized at the time. Gödel had used the primitives of number theory and logic to build what amounted to a computational system—in which one could take things like “this statement is unprovable”, and “run them in arithmetic”.

What Gödel had, though, wasn’t exactly a streamlined general system (after all, it only really needed to handle one statement). But the immediate question then was: if there’s a problem with this statement in arithmetic, what about Hilbert’s general “decision problem” (Entscheidungsproblem) for any axiom system?

To discuss the “general decision problem”, though, one needed some kind of general notion of how one could decide things. What ultimate primitives should one use? Schönfinkel (with Paul Bernays)—in his sole other published paper—wrote about a restricted case of the decision problem in 1927, but doesn’t seem to have had the idea of using combinators to study it.

By 1934 Gödel was talking about general recursiveness (i.e. definability through general recursion). And Alonzo Church and Stephen Kleene were introducing λ definability. Then in 1936 Alan Turing introduced Turing machines. All these approaches involved setting up certain primitives, then showing that a large class of things could be “compiled” to those primitives. And that—in effect by thinking about having it compile itself—Hilbert’s Entscheidungsproblem couldn’t be solved.

Perhaps no single result along these lines would have been so significant. But it was soon established that all three kinds of systems were exactly equivalent: the set of computations they could represent were the same, as established by showing that one system could emulate another. And from that discovery eventually emerged the modern notion of universal computation—and all its implications for technology and science.

In the early days, though, there was actually a fourth equivalent kind of system—based on string rewriting—that had been invented by Emil Post in 1920–1. Oh, and then there were combinators.

Lambda Calculus

What was the right “language” to use for setting up mathematical logic? There’d been gradual improvement since the complexities of Principia Mathematica. But around 1930 Alonzo Church wanted a new and cleaner setup. And he needed to have a way (as Frege and Principia Mathematica had done before him) to represent “pure functions”. And that’s how he came to invent λ.

Today in the Wolfram Language we have Function[x,f[x]] or x↦f[x] (or various shorthands). Church originally had λx[M]:

Church’s “A Set of Postulates for the Foundation of Logic”—click to enlarge

But what’s perhaps most notable is that on the very first page he defines λ, he’s referencing Schönfinkel’s combinator paper. (Well, specifically, he’s referencing it because he wants to use the device Schönfinkel invented that we now call currying—f[x][y] in place of f[x,y]—though ironically he doesn’t mention Curry.) In his 1932 paper (apparently based on work in 1928–9) λ is almost a sideshow—the main event being the introduction of 37 formal postulates for mathematical logic:

Introduction of 37 formal postulates—click to enlarge

By the next year J. Barkley Rosser is trying to retool Curry’s “combinatory logic” with combinators of his own—and showing how they correspond to lambda expressions:

J. Barkley Rosser’s combinators—click to enlarge

Then in 1935 lambda calculus has its big “coming out” in Church’s “An Unsolvable Problem of Elementary Number Theory”, in which he introduces the idea that any “effectively calculable” function should be “λ definable”, then defines integers in terms of λ’s (“Church numerals”)

Church’s “An Unsolvable Problem of Elementary Number Theory”—click to enlarge

and then shows that the problem of determining equivalence for λ expressions is undecidable.

Very soon thereafter Turing publishes his “On Computable Numbers, with an Application to the Entscheidungsproblem” in which he introduces his much more manifestly mechanistic Turing machine model of computation. In the main part of the paper there are no lambdas—or combinators—to be seen. But by late 1936 Turing had gone to Princeton to be a student with Church—and added a note showing the correspondence between his Turing machines and Church’s lambda calculus.

By the next year, when Turing is writing his rather abstruse "Systems of Logic Based on Ordinals" he's using lambda calculus all over the place. Early in the document he writes I → λx[x], and soon he's mixing lambdas and combinators with wild abandon—and in fact he'd already published a one-page paper which introduced the fixed-point combinator Θ (and, yes, the K in the title refers to Schönfinkel's K combinator):

Turing’s “The p-function in lambda-K-conversion”—click to enlarge

When Church summarized the state of lambda calculus in 1941 in his "The Calculi of Lambda-Conversion" he again made extensive use of combinators. Schönfinkel's K is prominent. But Schönfinkel's S is nowhere to be seen—and in fact Church has his own S combinator S[n][f][x] → f[n[f][x]] which implements successors in Church's numeral system. And he also has a few other "basic combinators" that he routinely uses.
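Church's successor is easy to render in modern notation (a minimal sketch; the names zero, succ and toInteger are mine), with a Church numeral read back as an ordinary integer by applying it to "+1" and 0:

zero = Function[fn, Function[x, x]];                           (* λf. λx. x *)
succ = Function[n, Function[fn, Function[x, fn[n[fn][x]]]]];   (* S[n][f][x] = f[n[f][x]] *)

toInteger[n_] := n[# + 1 &][0];

toInteger[succ[succ[succ[zero]]]]   (* 3 *)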

In the end, combinators and lambda calculus are completely equivalent, and it’s quite easy to convert between them—but there’s a curious tradeoff. In lambda calculus one names variables, which is good for human readability, but can lead to problems at a formal level. In combinators, things are formally much cleaner, but the expressions one gets can be completely incomprehensible to humans.

The point is that in a lambda expression like λx λy x[y] one’s naming the variables (here x and y), but really these names are just placeholders: what they are doesn’t matter; they’re just showing where different arguments go. And in a simple case like this, everything is fine. But what happens if one substitutes for y another lambda expression, say λx f[x]? What is that x? Is it the same x as the one outside, or something different? In practice, there are all sorts of renaming schemes that can be used, but they tend to be quite hacky, and things can quickly get tangled up. And if one wants to make formal proofs about lambda calculus, this can potentially be a big problem, and indeed at the beginning it wasn’t clear it wouldn’t derail the whole idea of lambda calculus.

And that’s part of why the correspondence between lambda calculus and combinators was important. With combinators there are no variables, and so no variable names to get tangled up. So if one can show that something can be converted to combinators—even if one never looks at the potentially very long and ugly combinator expression that’s generated—one knows one’s safe from issues about variable names.

There are still plenty of other complicated issues, though. Prominent among them are questions about when combinator expressions can be considered equal. Let’s say you have a combinator expression, like s[s[s[s][k]]][k]. Well, you can repeatedly apply the rules for combinators to transform and reduce it. And it’ll often end up at a fixed point, where no rules apply anymore. But a basic question is whether it matters in which order the rules are applied. And in 1936 Church and Rosser proved it doesn’t.
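Here is a tiny illustration of that property (with an expression chosen for the purpose, and the usual S, K rules written in Wolfram Language form): the expression below has both an outer S redex and an inner K redex, and the same normal form is reached whichever one is reduced first:

skRules = {s[x_][y_][z_] :> x[z][y[z]], k[x_][y_] :> x};

e = s[k][k][k[a][b]];

FixedPoint[# /. skRules &, e]                   (* outer redex first : -> a *)
FixedPoint[# /. skRules &, e /. k[a][b] :> a]   (* inner redex first : -> a *)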

Actually, what they specifically proved was the analogous result for lambda calculus. They drew a picture to indicate different possible orders in which lambdas could be reduced out, and showed it didn’t matter which path one takes:

The analogous result for lambda calculus

This all might seem like a detail. But it turns out that generalizations of their result apply to all sorts of systems. In doing computations (or automatically proving theorems) it’s all about “it doesn’t matter what path you take; you’ll always get the same result”. And that’s important. But recently there’s been another important application that’s shown up. It turns out that a generalization of the “Church–Rosser property” is what we call causal invariance in our Physics Project.

And it’s causal invariance that leads in our models to relativistic invariance, general covariance, objective reality in quantum mechanics, and other central features of physics.

Practical Computation

In retrospect, one of the great achievements of the 1930s was the inception of what ended up being the idea of universal computation. But at the time what was done was couched in terms of mathematical logic and it was far from obvious that any of the theoretical structures being built would have any real application beyond thinking about the foundations of mathematics. But even as people like Hilbert were talking in theoretical terms about the mechanization of mathematics, more and more there were actual machines being built for doing mathematical calculations.

We know that even in antiquity (at least one) simple gear-based mechanical calculational devices existed. In the mid-1600s arithmetic calculators started being constructed, and by the late 1800s they were in widespread use. At first they were mechanical, but by the 1930s most were electromechanical, and there started to be systems where units for carrying out different arithmetic operations could be chained together. And by the end of the 1940s fairly elaborate such systems based on electronics were being built.

Already in the 1830s Charles Babbage had imagined an “analytical engine” which could do different operations depending on a “program” specified by punch cards—and Ada Lovelace had realized that such a machine had broad “computational” potential. But by the 1930s a century had passed and nothing like this was connected to the theoretical developments that were going on—and the actual engineering of computational systems was done without any particular overarching theoretical framework.

Still, as electronic devices got more complicated and scientific interest in psychology intensified, something else happened: there started to be the idea (sometimes associated with the name cybernetics) that somehow electronics might reproduce how things like brains work. In the mid-1930s Claude Shannon had shown that Boolean algebra could represent how switching circuits work, and in 1943 Warren McCulloch and Walter Pitts proposed a model of idealized neural networks formulated in something close to mathematical logic terms.

Meanwhile by the mid-1940s John von Neumann—who had worked extensively on mathematical logic—had started suggesting math-like specifications for practical electronic computers, including the way their programs might be stored electronically. At first he made lots of brain-like references to “organs” and “inhibitory connections”, and essentially no mention of ideas from mathematical logic. But by the end of the 1940s von Neumann was talking at least conceptually about connections to Gödel’s theorem and Turing machines, Alan Turing had become involved with actual electronic computers, and there was the beginning of widespread understanding of the notion of general-purpose computers and universal computation.

In the 1950s there was an explosion of interest in what would now be called the theory of computation—and great optimism about its relevance to artificial intelligence. There was all sorts of “interdisciplinary work” on fairly “concrete” models of computation, like finite automata, Turing machines, cellular automata and idealized neural networks. More “abstract” approaches, like recursive functions, lambda calculus—and combinators—remained, however, pretty much restricted to researchers in mathematical logic.

When early programming languages started to appear in the latter part of the 1950s, thinking about practical computers began to become a bit more abstract. It was understood that the grammars of languages could be specified recursively—and actual recursion (of functions being able to call themselves) just snuck into the specification of ALGOL 60. But what about the structures on which programs operated? Most of the concentration was on arrays (sometimes rather elegantly, as in APL) and, occasionally, character strings.

But a notable exception was LISP, described in John McCarthy's 1960 paper "Recursive Functions of Symbolic Expressions and Their Computation by Machine, Part I" (part 2 was not written). There was lots of optimism about AI at the time, and the idea was to create a language to "implement AI"—and do things like "mechanical theorem proving". A key idea—that McCarthy described as being based on "recursive function formalism"—was to have tree-structured symbolic expressions ("S expressions"). (In the original paper, what's now Wolfram Language–style f[g[x]] "M expression" notation, complete with square brackets, was used as part of the specification, but the quintessential-LISP-like (f (g x)) notation won out when LISP was actually implemented.)

McCarthy’s “Recursive Functions of Symbolic Expressions and Their Computation by Machine, Part I”—click to enlarge

An issue in LISP was how to take “expressions” (which were viewed as representing things) and turn them into functions (which do things). And the basic plan was to use Church’s idea of λ notation. But when it came time to implement this, there was, of course, trouble with name collisions, which ended up getting handled in quite hacky ways. So did McCarthy know about combinators? The answer is yes, as his 1960 paper shows:

McCarthy’s “Recursive Functions of Symbolic Expressions and Their Computation by Machine, Part I”—click to enlarge

I actually didn’t know until just now that McCarthy had ever even considered combinators, and in the years I knew him I don’t think I ever personally talked to him about them. But it seems that for McCarthy—as for Church—combinators were a kind of “comforting backstop” that ensured that it was OK to use lambdas, and that if things went too badly wrong with variable naming, there was at least in principle always a way to untangle everything.

In the practical development of computers and computer languages, even lambdas—let alone combinators—weren’t really much heard from again (except in a small AI circle) until the 1980s. And even then it didn’t help that in an effort variously to stay close to hardware and to structure programs there tended to be a desire to give everything a “data type”—which was at odds with the “consume any expression” approach of standard combinators and lambdas. But beginning in the 1980s—particularly with the progressive rise of functional programming—lambdas, at least, have steadily gained in visibility and practical application.

What of combinators? Occasionally as a proof of principle there’ll be a hardware system developed that natively implements Schönfinkel’s combinators. Or—particularly in modern times—there’ll be an esoteric language that uses combinators in some kind of purposeful effort at obfuscation. Still, a remarkable cross-section of notable people concerned with the foundations of computing have—at one time or another—taught about combinators or written a paper about them. And in recent years the term “combinator” has become more popular as a way to describe a “purely applicative” function.

But by and large the important ideas that first arose with combinators ended up being absorbed into practical computing by quite circuitous routes, without direct reference to their origins, or to the specific structure of combinators.

Combinators in Culture

For 100 years combinators have mostly been an obscure academic topic, studied particularly in connection with lambda calculus, at the borders between theoretical computer science, mathematical logic and to some extent mathematical formalisms like category theory. Much of the work that's been done can be traced in one way or another to the influence of Haskell Curry or Alonzo Church—particularly through their students, grandstudents, great-grandstudents, etc. Particularly in the early years, most of the work was centered in the US, but by the 1960s there was a strong migration to Europe and especially the Netherlands.

But even with all their abstractness and obscurity, on a few rare occasions combinators have broken into something closer to the mainstream. One such time was with the popular logic-puzzle book To Mock a Mockingbird, published in 1985 by Raymond Smullyan—a former student of Alonzo Church’s. It begins: “A certain enchanted forest is inhabited by talking birds” and goes on to tell a story that's basically about combinators “dressed up” as birds calling each other (S is the “starling”, K the “kestrel”)—with a convenient “bird who’s who” at the end. The book is dedicated “To the memory of Haskell Curry—an early pioneer in combinatory logic and an avid bird-watcher”.

To Mock a Mockingbird by Raymond Smullyan—click to enlarge To Mock a Mockingbird by Raymond Smullyan—click to enlarge

And then there’s Y Combinator. The original Y combinator arose out of work that Curry did in the 1930s on the consistency of axiom systems for combinators, and it appeared explicitly in his 1958 classic book:

Combinatory Logic by Haskell B. Curry and Robert Feys—click to enlarge Combinatory Logic by Haskell B. Curry and Robert Feys—click to enlarge

He called it the “paradoxical combinator” because it was recursively defined in a kind of self-referential way analogous to various paradoxes. Its explicit form is SSK(S(K(SS(S(SSK))))K) and its most immediately notable feature is that under Schönfinkel’s combinator transformation rules it never settles down to a particular “value” but just keeps growing forever.

Well, in 2005 Paul Graham—who had long been an enthusiast of functional programming and LISP—decided to name his new (and now very famous) startup accelerator “Y Combinator”. I remember asking him why he’d called it that. “Because,” he said, “nobody understands the Y combinator”.

Looking in my own archives from that time I find an email I sent a combinator enthusiast who was working with me:

Email to Matthew Szudzik

Followed by, basically, “Yes our theorem prover can prove the basic property of the Y combinator” (V6 sounds so ancient; we’re now just about to release V12.2):

Proving the basic property of the Y combinator

I had another unexpected encounter with combinators last year. I had been given a book that was once owned by Alan Turing, and in it I found a piece of paper—that I recognized as being covered with none other than lambdas and combinators (but that’s not the Y combinator):

Note in Alan Turing’s book—click to enlarge

It took quite a bit of sleuthing (that I wrote extensively about)—but I eventually discovered that the piece of paper was written by Turing’s student Robin Gandy. But I never figured out why he was doing combinators....

Designing Symbolic Language

I think I first found out about combinators around 1979 by seeing Schönfinkel’s original paper in a book called From Frege to Gödel: A Source Book in Mathematical Logic (by a certain Jean van Heijenoort). How Schönfinkel’s paper ended up being in that book is an interesting question, which I’ll write about elsewhere. The spine of my copy of the book has long been broken at the location of Schönfinkel’s paper, and at different times I’ve come back to the paper, always thinking there was more to understand about it.

But why was I even studying things like this back in 1979? I guess in retrospect I can say I was engaged in an activity that goes back to Frege or even Leibniz: I was trying to find a fundamental framework for representing mathematics and beyond. But my goal wasn’t a philosophical one; it was a very practical one: I was trying to build a computer language that could do general computations in mathematics and beyond.

My immediate applications were in physics, and it was from physics that my main methodological experience came. And the result was that—like trying to understand the world in terms of elementary particles—I wanted to understand computation in terms of its most fundamental elements. But I also had lots of practical experience in using computers to do mathematical computation. And I soon developed a theory about how I thought computation could fundamentally be done.

It started from the practical issue of transformations on algebraic expressions (turn sin(2x) into 2 sin(x) cos(x), etc.). But it soon became a general idea: compute by doing transformations on symbolic expressions. Was this going to work? I wanted to understand as fundamentally as possible what computation really was—and from that I was led to its history in mathematical logic. Much of what I saw in books and papers about mathematical logic I found abstruse and steeped in sometimes horrendous notational complexity. But what were these people really doing? It made it much easier that I had a definite theory, against which I could essentially do reductionist science. That stuff in Principia Mathematica? Those ideas about rewriting systems? Yup, I could see how to represent them as rules for transformations on symbolic expressions.
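
To give a sense of what that means in practice, here is a hedged sketch in today's Wolfram Language (not SMP syntax) of the kind of transformation rule involved:

doubleAngle = Sin[2 x_] :> 2 Sin[x] Cos[x]; (* a rule that transforms symbolic expressions *)
Sin[2 a] /. doubleAngle (* gives 2 Cos[a] Sin[a] *)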

And so it was that I came to design SMP: “A Symbolic Manipulation Program”—all based on transformation rules for symbolic expressions. It was easy to represent mathematical relations ($x is a pattern variable, which in today’s Wolfram Language would be written x_, though only on the left-hand side):

A Symbolic Manipulation Program

Or basic logic:

A Symbolic Manipulation Program

Or, for that matter, predicate logic of the kind Schönfinkel wanted to capture:

A Symbolic Manipulation Program

And, yes, it could emulate a Turing machine (note the tape-as-transformation-rules representation that appears at the end):

A Symbolic Manipulation Program

But the most important thing I realized is that it really worked to represent basically anything in terms of symbolic expressions, and transformation rules on them. Yes, it was quite often useful to think of “applying functions to things” (and SMP had its version of lambda, for example), but it was much more powerful to think about symbolic expressions as just “being there” (“x doesn’t have to have a value”)—like things in the world—with the language being able to define how things should transform.

In retrospect this all seems awfully like the core idea of combinators, but with one important exception: that instead of everything being built from “purely structural elements” with names like S and K, there was a whole collection of “primitive objects” that were intended to have direct understandable meanings (like Plus, Times, etc.). And indeed I saw a large part of my task in language design as being to think about computations one might want to do, and then try to “drill down” to find the “elementary particles”—or primitive objects—from which these computations might be built up.

Over time I’ve come to realize that doing this is less about what one can in principle use to construct computations, and more about making a bridge to the way humans think about things. It’s crucial that there’s an underlying structure—symbolic expressions—that can represent anything. But increasingly I’ve come to realize that what we need from a computational language is to have a way to encapsulate in precise computational form the kinds of things we humans think about—in a way that we humans can understand. And a crucial part of being able to do that is to leverage what has ultimately been at the core of making our whole intellectual development as a species possible: the idea of human language.

Human language has given us a way to talk symbolically about the world: to give symbolic names to things, and then to build things up using these. In designing a computational language the goal is to leverage this: to use what humans already know and understand, but be able to represent it in a precise computational way that is amenable to actual computation that can be done automatically by computer.

It’s probably no coincidence that the tree structure of symbolic expressions that I have found to be such a successful foundation for computational language is a bit like an idealized version of the kind of tree structure (think parse trees or sentence diagramming) that one can view human language as following. There are other ways to set up universal computation, but this is the one that seems to fit most directly with our way of thinking about things.

And, yes, in the end all those symbolic expressions could be constructed like combinators from objects—like S and K—with no direct human meaning. But that would be like having a world without nouns—a world where there’s no name for anything—and the representation of everything has to be built from scratch. But the crucial idea that’s central to human language—and now to computational language—is to be able to have layers of abstraction, where one can name things and then refer to them just by name without having to think about how they’re built up “inside”.

In some sense one can see the goal of people like Frege—and Schönfinkel—as being to “reduce out” what exists in mathematics (or the world) and turn it into something like “pure logic”. And the structural part of that is exactly what makes computational language possible. But in my conception of computational language the whole idea is to have content that relates to the world and the way we humans think about it.

And over the decades I’ve continually been amazed at just how strong and successful the idea of representing things in terms of symbolic expressions and transformations on them is. Underneath everything that’s going on in the Wolfram Language—and in all the many systems that now use it—it’s all ultimately just symbolic expressions being transformed according to particular rules, and reaching fixed points that represent results of computations, just like in those examples in Schönfinkel’s original paper.

One important feature of Schönfinkel’s setup is the idea that one doesn’t just have “functions” like f[x], or even just nested functions, like f[g[x]]. Instead one can have constructs where instead of the “name of a function” (like f) one can have a whole complex symbolic structure. And while this was certainly possible in SMP, not too much was built around it. But when I came to start designing what’s now the Wolfram Language in 1986, I made sure that the “head” (as I called it) of an expression could itself be an arbitrary expression.

And when Mathematica was first launched in 1988 I was charmed to see more than one person from mathematical logic immediately think of implementing combinators. Make the definitions:

s[x_][y_][z_] := x[z][y[z]]
k[x_][y_] := x

Then combinators “just work” (at least if they reach a fixed point):

s[s[k[s]][s[k[k]][s[k[s]][k]]]][s[k[s[s[k][k]]]][k]][a][b][c]

But what about the idea of “composite symbolic heads”? Already in SMP I’d used them to do simple things like represent derivatives (and in Wolfram Language f'[x] is Derivative[1][f][x]). But something that’s been interesting to me to see is that as the decades have gone by, more and more gets done with “composite heads”. Sometimes one thinks of them as some kind of nesting of operations, or nesting of modifiers to a symbolic object. But increasingly they end up being a way to represent “higher-order constructs”—in effect things that produce things that produce things etc. that eventually give a concrete object one wants.

I don’t think most of us humans are particularly good at following this kind of chain of abstraction, at least without some kind of “guide rails”. And it’s been interesting for me to see over the years how we’ve been able to progressively build up guide rails for longer and longer chains of abstraction. First there were things like Function, Apply, Map. Then Nest, Fold, FixedPoint, MapThread. But only quite recently NestGraph, FoldPair, SubsetMap, etc. Even from the beginning there were direct “head manipulation” functions like Operate and Through. But unlike more “array-like” operations for list manipulation they’ve been slow to catch on.
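
Here is a small illustrative sketch of a few of these constructs in today's Wolfram Language (f, g, p, q are just placeholder symbols):

f'[x] // FullForm (* Derivative[1][f][x]: a composite head *)
Through[{f, g}[x]] (* {f[x], g[x]} *)
Operate[p, f[x, y]] (* p[f][x, y]: apply p to the head *)
NestList[q, x, 3] (* {x, q[x], q[q[x]], q[q[q[x]]]} *)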

In a sense combinators are an ultimate story of “symbolic head manipulation”: everything can get applied to everything before it’s applied to anything. And, yes, it’s very hard to keep track of what’s going on—which is why “named guide rails” are so important, and also why they’re challenging to devise. But it seems as if, as we progressively evolve our understanding, we’re slowly able to get a little further, in effect building towards the kind of structure and power that combinators—in their very non-human-relatable way—first showed us was possible a century ago.

Combinators in the Computational Universe

Combinators were invented for a definite purpose: to provide building blocks, as Schönfinkel put it, for logic. It was the same kind of thing with other models of what we now know of as computation. All of them were “constructed for a purpose”. But in the end computation—and programs—are abstract things, that can in principle be studied without reference to any particular purpose. One might have some particular reason to be looking at how fast programs of some kind can run, or what can be proved about them. But what about the analog of pure natural science: of studying what programs just “naturally do”?

At the beginning of the 1980s I got very interested in what one can think of as the “natural science of programs”. My interest originally arose out of a question about ordinary natural science. One of the very noticeable features of the natural world is how much in it seems to us highly complex. But where does this complexity really come from? Through what kind of mechanism does nature produce it? I quickly realized that in trying to address that question, I needed as general a foundation for making models of things as possible. And for that I turned to programs, and began to study just what “programs in the wild” might do.

Ever since the time of Galileo and Newton mathematical equations had been the main way that people ultimately imagined making models of nature. And on the face of it—with their real numbers and continuous character—these seemed quite different from the usual setup for computation, with its discrete elements and discrete choices. But perhaps in part through my own experience in doing mathematics symbolically on computers, I didn’t see a real conflict, and I began to think of programs as a kind of generalization of the traditional approach to modeling in science.

But what kind of programs might nature use? I decided to just start exploring all the possibilities: the whole “computational universe” of programs—starting with the simplest. I came up with a particularly simple setup involving a row of cells with values 0 or 1 updated in parallel based on the values of their neighbors. I soon learned that systems like this had actually been studied under the name “cellular automata” in the 1950s (particularly in 2D) as potential models of computation, though they had fallen out of favor, mainly through not having seemed very “human programmable”.
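
As a minimal illustration of that setup (using the built-in CellularAutomaton function of today's Wolfram Language, not what I had available then): rule 30, started from a single 1 cell on a background of 0s and run for 5 steps, gives an array of 0s and 1s:

CellularAutomaton[30, {{1}, 0}, 5]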

My initial assumption was that with simple programs I’d only see simple behavior. But with my cellular automata it was very easy to do actual computer experiments, and to visualize the results. And though in many cases what I saw was simple behavior, I also saw something very surprising: that in some cases—even though the rules were very simple—the behavior that was generated could be immensely complex:

GraphicsRow[
 Labeled[ArrayPlot[CellularAutomaton[#, {{1}, 0}, {80, All}]], 
    RulePlot[CellularAutomaton[#]]] & /@ {150, 30, 73}, 
 ImageSize -> {Full, Automatic}, Spacings -> 0]

It took me years to come to terms with this phenomenon, and it’s gradually informed the way I think about science, computation and many other things. At first I studied it almost exclusively in cellular automata. I made connections to actual systems in nature that cellular automata could model. I tried to understand what existing mathematical and other methods could say about what I’d seen. And slowly I began to formulate general ideas to explain what was going on—like computational irreducibility and the Principle of Computational Equivalence.

But at the beginning of the 1990s—now armed with what would become the Wolfram Language—I decided I should try to see just how the phenomenon I had found in cellular automata would play out in other kinds of computational systems. And my archives record that on April 4, 1992, I started looking at combinators.

I seem to have come back to them several times, but in a notebook from July 10, 1994 (which, yes, still runs just fine), there it is:

Mathematica notebook from July 10, 1994

A randomly chosen combinator made of Schönfinkel’s S’s and K’s starting to show complex behavior. I seem to have a lot of notebooks that start with the simple combinator definitions—and then start exploring:

Starting with the simple combinator definitions—and exploring

There are what seem like they could be pages from a “computational naturalist’s field notebook”:

Pages from a “computational naturalist’s field notebook”

Then there are attempts to visualize combinators in the same kind of way as cellular automata:

Attempts to visualize combinators in the same kind of way as cellular automata

But the end result was that, yes, like Turing machines, string substitution systems and all the other systems I explored in the computational universe, combinators did exactly the same kinds of things I’d originally discovered in cellular automata. Combinators weren’t just systems that could be set up to do things. Even “in the wild” they could spontaneously do very interesting and complex things.

I included a few pages on what I called “symbolic systems” (essentially lambdas) at the end of my chapter on “The World of Simple Programs” in A New Kind of Science (and, yes, reading particularly the notes again now, I realize there are still many more things to explore...):

“Symbolic systems” in A New Kind of Science—click to enlarge

Later in the book I talk specifically about Schönfinkel’s combinators in connection with the threshold of computation universality. But before showing examples of what they do, I remark:

“Originally intended as an idealized way to represent structures of functions defined in logic, combinators were actually first introduced in 1920—sixteen years before Turing machines. But although they have been investigated somewhat over the past eighty years, they have for the most part been viewed as rather obscure and irrelevant constructs”

How “irrelevant” should they be seen as being? Of course it depends on what for. As things to explore in the computational universe, cellular automata have the great advantage of allowing immediate visualization. With combinators it’s a challenge to find any way to translate their behavior at all faithfully into something suitable for human perception. And since the Principle of Computational Equivalence implies that general computational features won’t depend on the particulars of different systems, there’s a tendency to feel that even in studying the computational universe, combinators “aren’t worth the trouble”.

Still, one thing that’s been prominently on display with cellular automata over the past 20 or so years is the idea that any sufficiently simple system will eventually end up being a useful model for something. Mollusc pigmentation. Catalysis processes. Road traffic flow. There are simple cellular automaton models for all of these. What about combinators? Without good visualization it’s harder to say “that looks like combinator behavior”. And even after 100 years they’re still a bit too unfamiliar. But when it comes to capturing some large-scale expression or tree behavior of some system, I won’t be surprised if combinators are a good fit.

When one looks at the computational universe, one of the important ideas is “mining” it not just for programs that can serve as models for things, but also for programs that are somehow useful for some technological purpose. Yes, one can imagine specifically “compiling” some known program to combinators. But the question is whether “naturally occurring combinators” can somehow be identified as useful for some particular purpose. Could they deliver some new kind of distributed cryptographic protocol? Could they be helpful in mapping out distributed computing systems? Could they serve as a base for setting up molecular-scale computation, say with tree-like molecules? I don’t know. But it will be interesting to find out. And as combinators enter their second century they provide a unique kind of “computational raw material” to mine from the computational universe.

Combinators All the Way Down?

What is the universe fundamentally made of? For a long time the assumption was that it must be described by something fundamentally mathematical. And indeed right around the time combinators were being invented the two great theories of general relativity and quantum mechanics were just developing. And in fact it seemed as if both physics and mathematics were going so well that people like David Hilbert imagined that perhaps both might be completely solved—and that there might be a mathematics-like axiomatic basis for physics that could be “mechanically explored” as he imagined mathematics could be.

But it didn’t work out that way. Gödel’s theorem appeared to shatter the idea of a “complete mechanical exploration” of mathematics. And while there was immense technical progress in working out the consequences of general relativity and quantum mechanics, little was discovered about what might lie underneath. Computers (including things like Mathematica) were certainly useful in exploring the existing theories of physics. But physics didn’t show any particular signs of being “fundamentally computational”, and indeed the existing theories seemed structurally not terribly compatible with computational processes.

But as I explored the computational universe and saw just what rich and complex behavior could arise even from very simple rules, I began to wonder whether maybe, far below the level of existing physics, the universe might be fundamentally computational. I began to make specific models in which space and time were formed from an evolving network of discrete points. And I realized that some of the ideas that had arisen in the study of things like combinators and lambda calculus from the 1930s and 1940s might have direct relevance.

Like combinators (or lambda calculus) my models had the feature that they allowed many possible paths of evolution. And like combinators (or lambda calculus) at least some of my models had the remarkable feature that in some sense it didn’t matter what path one took; the final result would always be the same. For combinators this “Church–Rosser” or “confluence” feature was what allowed one to have a definite fixed point that could be considered the result of a computation. In my models of the universe that doesn’t just stop—things are a bit more subtle—but the generalization to what I call causal invariance is precisely what leads to relativistic invariance and the validity of general relativity.
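
Here is a hedged sketch of what confluence looks like for a combinator expression that does terminate: whichever redex one reduces first, the final result is the same.

skRules = {s[x_][y_][z_] :> x[z][y[z]], k[x_][y_] :> x};
expr = k[a][k[b][c]];
{Replace[expr, skRules, {0}], (* reduce the outer K redex first: gives a *)
 FixedPoint[# /. skRules &, Replace[expr, skRules, {1}]]} (* reduce the inner redex first, then finish: also gives a *)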

For many years my work on fundamental physics languished—a victim of other priorities and the uphill effort of introducing new paradigms into a well-established field. But just over a year ago—with help from two very talented young physicists—I started again, with unexpectedly spectacular results.

I had never been quite satisfied with my idea of everything in the universe being represented as a particular kind of giant graph. But now I imagined that perhaps it was more like a giant symbolic expression, or, specifically, like an expression consisting of a huge collection of relations between elements—in effect, a certain kind of giant hypergraph. It was, in a way, a very combinator-like concept.

At a technical level, it’s not the same as a general combinator expression: it’s basically just a single layer, not a tree. And in fact that’s what seems to allow the physical universe to consist of something that approximates uniform (manifold-like) space, rather than showing some kind of hierarchical tree-like structure everywhere.

But when it comes to the progression of the universe through time, it’s basically just like the transformation of combinator expressions. And what’s become clear is that the existence of different paths—and their ultimate equivalences—is exactly what’s responsible not only for the phenomena of relativity, but also for quantum mechanics. And what’s remarkable is that many of the concepts that were first discovered in the context of combinators and lambda calculus now directly inform the theory of physics. Normal forms (basically fixed points) are related to black holes where “time stops”. Critical pair lemmas are related to measurement in quantum mechanics. And so on.

In practical computing, and in the creation of computational language, it was the addition of “meaningful names” to the raw structure of combinators that turned them into the powerful symbolic expressions we use. But in understanding the “data structure of the universe” we’re in a sense going back to something much more like “raw combinators”. Because now all those “atoms of space” that make up the universe don’t have meaningful names; they’re more like S’s and K’s in a giant combinator expression, distinct but yet all the same.

In the traditional, mathematical view of physics, there was always some sense that by “appropriately clever mathematics” it would be possible to “figure out what will happen” in any physical system. But once one imagines that physics is fundamentally computational, that’s not what one can expect.

And just like combinators—with their capability for universal computation—can’t in a sense be “cracked” using mathematics, so also that’ll be true of the universe. And indeed in our model that’s what the progress of time is about: it’s the inexorable, irreducible process of computation, associated with the repeated transformation of the symbolic expression that represents the universe.

When Hilbert first imagined that physics could be reduced to mathematics he probably thought that meant that physics could be “solved”. But with Gödel’s theorem—which is a reflection of universal computation—it became clear that mathematics itself couldn’t just be “solved”. But now in effect we have a theory that “reduces physics to mathematics”, and the result of the Gödel’s theorem phenomenon is something very important in our universe: it’s what leads to a meaningful notion of time.

Moses Schönfinkel imagined that with combinators he was finding “building blocks for logic”. And perhaps the very simplicity of what he came up with makes it almost inevitable that it wasn’t just about logic: it was something much more general. Something that can represent computations. Something that has the germ of how we can represent the “machine code” of the physical universe.

It took in a sense “humanizing” combinators to make them useful for things like computational language whose very purpose is to connect with humans. But there are other places where inevitably we’re dealing with something more like large-scale “combinators in the raw”. Physics is one of them. But there are others. In distributed computing. And perhaps in biology, in economics and in other places.

There are specific issues of whether one’s dealing with trees (like combinators), or hypergraphs (like our model of physics), or something else. But what’s important is that many of the ideas—particularly around what we call multiway systems—show up with combinators. And yes, combinators often aren’t the easiest places for us humans to understand the ideas in. But the remarkable fact is that they exist in combinators—and that combinators are now a century old.

I’m not sure if there’ll ever be a significant area where combinators alone will be the dominant force. But combinators have—for a century—had the essence of many important ideas. Maybe as such they are at some level destined forever to be footnotes. But in a sense they are also seeds or roots—from which remarkable things have grown. And as combinators enter their second century it seems quite certain that there is still much more that will grow from them.

Where Did Combinators Come From? Hunting the Story of Moses Schönfinkel


December 7, 1920

Where Did Combinators Come From? Hunting the Story of Moses Schönfinkel—click to enlarge

On Tuesday, December 7, 1920, the Göttingen Mathematics Society held its regular weekly meeting—at which a 32-year-old local mathematician named Moses Schönfinkel with no known previous mathematical publications gave a talk entitled “Elemente der Logik” (“Elements of Logic”).

A hundred years later what was presented in that talk still seems in many ways alien and futuristic—and for most people almost irreducibly abstract. But we now realize that that talk gave the first complete formalism for what is probably the single most important idea of this past century: the idea of universal computation.

Sixteen years later would come Turing machines (and lambda calculus). But in 1920 Moses Schönfinkel presented what he called “building blocks of logic”—or what we now call “combinators”—and then proceeded to show that by appropriately combining them one could effectively define any function, or, in modern terms, that they could be used to do universal computation.

Looking back a century it’s remarkable enough that Moses Schönfinkel conceptualized a formal system that could effectively capture the abstract notion of computation. And it’s more remarkable still that he formulated what amounts to the idea of universal computation, and showed that his system achieved it.

But for me the most amazing thing is that not only did he invent the first complete formalism for universal computation, but his formalism is probably in some sense minimal. I’ve personally spent years trying to work out just how simple the structure of systems that support universal computation can be—and for example with Turing machines it took from 1936 until 2007 for us to find the minimal case.

But back in his 1920 talk Moses Schönfinkel—presenting a formalism for universal computation for the very first time—gave something that is probably already in his context minimal.

Moses Schönfinkel described the result of his 1920 talk in an 11-page paper published in 1924 entitled “Über die Bausteine der mathematischen Logik” (“On the Building Blocks of Mathematical Logic”). The paper is a model of clarity. It starts by saying that in the “axiomatic method” for mathematics it makes sense to try to keep the number of “fundamental notions” as small as possible. It reports that in 1913 Henry Sheffer managed to show that basic logic requires only one connective, that we now call Nand. But then it begins to go further. And already within a couple of paragraphs it’s saying that “We are led to [an] idea, which at first glance certainly appears extremely bold”. But by the end of the introduction it’s reporting, with surprise, the big news: “It seems to me remarkable in the extreme that the goal we have just set can be realized… [and]; as it happens, it can be done by a reduction to three fundamental signs”.

Those “three fundamental signs”, of which he only really needs two, are what we now call the S and K combinators (he called them S and C). In concept they’re remarkably simple, but their actual operation is in many ways brain-twistingly complex. But there they were—already a century ago—just as they are today: minimal elements for universal computation, somehow conjured up from the mind of Moses Schönfinkel.

Who Was Moses Schönfinkel?

So who was this person, who managed so long ago to see so far?

The complete known published output of Moses Schönfinkel consists of just two papers: his 1924 “On the Building Blocks of Mathematical Logic”, and another, 31-page paper from 1927, coauthored with Paul Bernays, entitled “Zum Entscheidungsproblem der mathematischen Logik” (“On the Decision Problem of Mathematical Logic”).

“Über die Bausteine der mathematischen Logik”—click to enlarge
“Zum Entscheidungsproblem der mathematischen Logik”—click to enlarge

And somehow Schönfinkel has always been in the shadows—appearing at best only as a kind of footnote to a footnote. Turing machines have taken the limelight as models of computation—with combinators, hard to understand as they are, being mentioned at most only in obscure footnotes. And even within the study of combinators—often called “combinatory logic”—even as S and K have remained ubiquitous, Schönfinkel’s invention of them typically garners at most a footnote.

About Schönfinkel as a person, three things are commonly said. First, that he was somehow connected with the mathematician David Hilbert in Göttingen. Second, that he spent time in a psychiatric institution. And third, that he died in poverty in Moscow, probably around 1940 or 1942.

But of course there has to be more to the story. And in recognition of the centenary of Schönfinkel’s announcement of combinators, I decided to try to see what I could find out.

I don’t think I’ve got all the answers. But it’s been an interesting, if at times unsettling, trek through the Europe—and mathematics—of a century or so ago. And at the end of it I feel I’ve come to know and understand at least a little more about the triumph and tragedy of Moses Schönfinkel.

The Beginning of the Story

It’s a strange and sad resonance with Moses Schönfinkel’s life… but there’s a 1953 song by Tom Lehrer about plagiarism in mathematics—where the protagonist explains his chain of intellectual theft: “I have a friend in Minsk/Who has a friend in Pinsk/Whose friend in Omsk”… “/Whose friend somehow/Is solving now/The problem in Dnepropetrovsk”. Well, Dnepropetrovsk is where Moses Schönfinkel was born.

Except, confusingly, at the time it was called (after Catherine the Great) Ekaterinoslav (Екатеринослáв)—and it’s now called Dnipro. It’s one of the larger cities in Ukraine, roughly in the center of the country, about 250 miles down the river Dnieper from Kiev. And at the time when Schönfinkel was born, Ukraine was part of the Russian Empire.

So what traces are there of Moses Schönfinkel in Ekaterinoslav (AKA Dnipro) today? 132 years later it wasn’t so easy to find (especially during a pandemic)… but here’s a record of his birth: a certificate from the Ekaterinoslav Public Rabbi stating that entry 272 of the Birth Register for Jews from 1888 records that on September 7, 1888, a son Moses was born to the Ekaterinoslav citizen Ilya Schönfinkel and his wife Masha:

Birth register—click to enlarge

This seems straightforward enough. But immediately there’s a subtlety. When exactly was Moses Schönfinkel born? What is that date? At the time the Russian Empire—which had the Russian Orthodox Church, which eschewed Pope Gregory’s 1582 revision of the calendar—was still using the Julian calendar introduced by Julius Caesar. (The calendar was switched in 1918 after the Russian Revolution, although the Orthodox Church plans to go on celebrating Christmas on January 7 until 2100.) So to know a correct modern (i.e. Gregorian calendar) date of birth we have to do a conversion. And from this we’d conclude that Moses Schönfinkel was born on September 19, 1888.

But it turns out that’s not the end of the story. There are several other documents associated with Schönfinkel’s college years that also list his date of birth as September 7, 1888. But the state archives of the Dnepropetrovsk region contain the actual, original register from the synagogue in Ekaterinoslav. And here’s entry 272—and it records the birth of Moses Schönfinkel, but on September 17, not September 7:

Birth date of September 17—click to enlarge

So the official certificate is wrong! Someone left a digit out. And there’s a check: the Birth Register also gives the date in the Jewish calendar: 24 Tishrei, which for 1888 is the Julian date September 17. So converting to modern Gregorian form, the correct date of birth for Moses Schönfinkel is September 29, 1888.
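
As a quick check of that conversion (a sketch assuming the Wolfram Language’s calendar functions):

CalendarConvert[DateObject[{1888, 9, 17}, CalendarType -> "Julian"], "Gregorian"] (* September 29, 1888 *)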

OK, now what about his name? In Russian it’s given as Моисей Шейнфинкель (or, including the patronymic, with the most common transliteration from Hebrew, Моисей Эльевич Шейнфинкель). But how should his last name be transliterated? Well, there are several possibilities. We’re using Schönfinkel—but other possibilities are Sheinfinkel and Sheynfinkel—and these show up almost randomly in different documents.

What else can we learn from Moses Schönfinkel’s “birth certificate”? Well, it describes his father Эльева (Ilya) as an Ekaterinoslav мещанина. But what is that word? It’s often translated “bourgeoisie”, but seems to have basically meant “middle-class city dweller”. And in other documents from the time, Ilya Schönfinkel is described as a “merchant of the 2nd guild” (i.e. not the “top 5%” 1st guild, nor the lower 3rd guild).

Apparently, however, his fortunes improved. The 1905 “Index of Active Enterprises Incorporated in the [Russian] Empire” lists him as a “merchant of the 1st guild” and records that in 1894 he co-founded the company of “Lurie & Sheinfinkel” (with a paid-in capital of 10,000 rubles, or about $150k today) that was engaged in the grocery trade:

Lurie & Sheinfinkel

Lurie & Sheinfinkel seems to have had multiple wine and grocery stores. Between 1901 and 1904 its “store #2” was next to a homeopathic pharmacy in a building that probably looked at the time much like it does today:

Lurie & Sheinfinkel building

And for store #1 there are actually contemporary photographs (note the -инкель for the end of “Schönfinkel” visible on the bottom left; this particular building was destroyed in World War II):

Lurie & Scheinfinkel building

There seems to have been a close connection between the Schönfinkels and the Luries—who were a prominent Ekaterinoslav family involved in a variety of enterprises. Moses Schönfinkel’s mother Maria (Masha) was originally a Lurie (actually, she was one of the 8 siblings of Ilya Schönfinkel’s business partner Aron Lurie). Ilya Schönfinkel is listed from 1894 to 1897 as “treasurer of the Lurie Synagogue”. And in 1906 Moses Schönfinkel listed his mailing address in Ekaterinoslav as Lurie House, Ostrozhnaya Square. (By 1906 that square sported an upscale park—though a century earlier it had housed a prison that was referenced in a poem by Pushkin. Now it’s the site of an opera house.)

Accounts of Schönfinkel sometimes describe him as coming from a “village in Ukraine”. In actuality, at the turn of the twentieth century Ekaterinoslav was a bustling metropolis that, for example, had just become the third city in the whole Russian Empire to have electric trams. Schönfinkel’s family also seems to have been quite well-to-do. Some pictures of Ekaterinoslav from the time give a sense of the environment (this building was actually the site of a Lurie candy factory):

Ekaterinoslav
Ekaterinoslav

As the name “Moses” might suggest, Moses Schönfinkel was Jewish, and at the time he was born there was a large Jewish population in the southern part of Ukraine. Many Jews had come to Ekaterinoslav from Moscow, and in fact 40% of the whole population of the town was identified as Jewish.

Moses Schönfinkel went to the main high school in town (the “Ekaterinoslav classical gymnasium”)—and graduated in 1906, shortly before turning 18. Here’s his diploma:

Diploma—click to enlarge

The diploma shows that he got 5/5 in all subjects—the subjects being theology, Russian, logic, Latin, Greek, mathematics, geodesy (“mathematical geography”), physics, history, geography, French, German and drawing. So, yes, he did well in high school. And in fact the diploma goes on to say: “In view of his excellent behavior and diligence and excellent success in the sciences, especially in mathematics, the Pedagogical Council decided to award him the Gold Medal…”

Going to College in Odessa

Having graduated from high school, Moses Schönfinkel wanted to go (“for purely family reasons”, he said) to the University of Kiev. But being told that Ekaterinoslav was in the wrong district for that, he instead asked to enroll at Novorossiysk University in Odessa. He wrote a letter—in rather neat handwriting—to unscramble a bureaucratic issue, giving various excuses along the way:

Enrollment letter—click to enlarge

But in the fall of 1906, there he was: a student in the Faculty of Physics and Mathematics of Novorossiysk University, in the rather upscale and cosmopolitan town of Odessa, on the Black Sea.

The Imperial Novorossiya University, as it was then officially called, had been created out of an earlier institution by Tsar Alexander II in 1865. It was a distinguished university, with for example Dmitri Mendeleev (of periodic table fame) having taught there. In Soviet times it would be renamed after the discoverer of macrophages, Élie Metchnikoff (who worked there). Nowadays it is usually known as Odessa University. And conveniently, it has maintained its archives well—so that, still there, 114 years later, is Moses Schönfinkel’s student file:

Student file—click to enlarge

It’s amazing how “modern” a lot of what’s in it seems. First, there are documents Moses Schönfinkel sent so he could register (confirming them by telegram on September 1, 1906). There’s his high-school diploma and birth certificate—and there’s a document from the Ekaterinoslav City Council certifying his “citizen rank” (see above). The cover sheet also records a couple of other documents, one of which is presumably some kind of deferment of military service.

And then in the file there are two “photo cards” giving us pictures of the young Moses Schönfinkel, wearing the uniform of the Imperial Russian Army:

Schönfinkel’s military photo cards—click to enlarge

(These pictures actually seem to come from 1908; the style of uniform was a standard one issued after 1907; the [presumably] white collar tabs indicate the 3rd regiment of whatever division he was assigned to.)

Nowadays it would all be online, but in his physical file there is a “lecture book” listing courses (yes, every document is numbered, to correspond to a line in a central ledger):

Lecture book—click to enlarge

Here are the courses Moses Schönfinkel took in his first semester in college (fall 1906):

Courses Moses Schönfinkel took in his first semester in college—click to enlarge

Introduction to Analysis (6 hrs), Introduction to Determinant Theory (2 hrs), Analytical Geometry 1 (2 hrs), Chemistry (5 hrs), Physics 1 (3 hrs), Elementary Number Theory (2 hrs): a total of 20 hours. Here’s the bill for these courses: pretty good value at 1 ruble per course-hour, or a total of 20 rubles, which is about $300 today:

Course bill—click to enlarge

Subsequent semesters list many very familiar courses: Differential Calculus, Integrals (parts 1 and 2), and Higher Algebra, as well as “Calculus of Probabilities” (presumably probability theory) and “Determinant Theory” (essentially differently branded “linear algebra”). There are some “distribution” courses, like Astronomy (and Spherical Astronomy) and Physical Geography (or is that Geodesy?). And by 1908, there are also courses like Functions of a Complex Variable, Integro-Differential Equations (yeah, differential equations definitely pulled ahead of integral equations over the past century), Calculus of Variations and Infinite Series. And—perhaps presaging Schönfinkel’s next life move—another course that makes an appearance in 1908 is German (and it’s Schönfinkel’s only non-science course during his whole university career).

In Schönfinkel’s “lecture book” many of the courses also have names of professors listed. For example, there’s “Kagan”, who’s listed as teaching Foundations of Geometry (as well as Higher Algebra, Determinant Theory and Integro-Differential Equations). That’s Benjamin Kagan, who was then a young lecturer, but would later become a leader in differential geometry in Moscow—and also someone who studied the axiomatic foundations of geometry (as well as writing about the somewhat tragic life of Lobachevsky).

Another professor—listed as teaching Schönfinkel Introduction to Analysis and Theory of Algebraic Equation Solving—is “Shatunovsky”. And (at least according to Shatunovsky’s later student Sofya Yanovskaya, of whom we’ll hear more later), Samuil Shatunovsky was basically Schönfinkel’s undergraduate advisor.

Shatunovsky had been the 9th child of a poor Jewish family that actually was from a village in Ukraine. He was never able to enroll at a university, but for some years did manage to go to lectures by people around Pafnuty Chebyshev in Saint Petersburg. For quite a few years he then made a living as an itinerant math tutor (notably in Ekaterinoslav), but papers he wrote were eventually noticed by people at the university in Odessa, and, finally, in 1905, at the age of 46, he ended up as a lecturer at the university—where the following year he taught Schönfinkel.

Shatunovsky (who stayed in Odessa until his death in 1929) was apparently an energetic but precise lecturer. He seems to have been quite axiomatically oriented, creating axiomatic systems for geometry, algebraic fields, and notably, for order relations. (He was also quite a constructivist, opposed to the indiscriminate use of the Law of Excluded Middle.) The lectures from his Introduction to Analysis course (which Schönfinkel took in 1906) were published in 1923 (by the local publishing company Mathesis in which he and Kagan were involved).

Another of Schönfinkel’s professors (from whom he took Differential Calculus and “Calculus of Probabilities”) was a certain Ivan (or Jan) Śleszyński, who had worked with Karl Weierstrass on things like continued fractions, but by 1906 was in his early 50s and increasingly transitioning to working on logic. In 1911 he moved to Poland, where he sowed some of the seeds for the Polish school of mathematical logic, in 1923 writing a book called On the Significance of Logic for Mathematics (notably with no mention of Schönfinkel), and in 1925 one on proof theory.

It’s not clear how much mathematical logic Moses Schönfinkel picked up in college, but in any case, in 1910, he was ready to graduate. Here’s his final student ID (what are those pieces of string for?):

Moses Schönfinkel’s student ID—click to enlarge

There’s a certificate confirming that on April 6, 1910, Moses Schönfinkel had no books that needed returning to the library. And he sent a letter asking to graduate (with slightly-less-neat handwriting than in 1906):

Letter asking to graduate—click to enlarge

The letter closes with his signature (Моисей Шейнфинкель):

Moses Schönfinkel’s signature—click to enlarge

Göttingen, Center of the Mathematical Universe

After Moses Schönfinkel graduated college in 1910 he probably went into four years of military service (perhaps as an engineer) in the Russian Imperial Army. World War I began on July 28, 1914—and Russia mobilized on July 30. But in one of his few pieces of good luck Moses Schönfinkel was not called up, having arrived in Göttingen, Germany on June 1, 1914 (just four weeks before the event that would trigger World War I), to study mathematics.

Göttingen was at the time a top place for mathematics. In fact, it was sufficiently much of a “math town” that around that time postcards of local mathematicians were for sale there. And the biggest star was David Hilbert—which is who Schönfinkel went to Göttingen hoping to work with.

David Hilbert

Hilbert had grown up in Prussia and started his career in Königsberg. His big break came in 1888 at age 26 when he got a major result in invariant theory (a forerunner of what we would now call representation theory)—using then-shocking non-constructive techniques. And it was soon after this that Felix Klein recruited Hilbert to Göttingen—where he remained for the rest of his life.

In 1900 Hilbert gave his famous address to the International Congress of Mathematicians where he first listed his (ultimately 23) problems that he thought should be important in the future of mathematics. Almost all the problems are what anyone would call “mathematical”. But problem 6 has always stuck out for me: “Mathematical Treatment of the Axioms of Physics”: Hilbert somehow wanted to axiomatize physics as Euclid had axiomatized geometry. And he didn’t just talk about this; he spent nearly 20 years working on it. He brought in physicists to teach him, and he worked on things like gravitation theory (“Einstein–Hilbert action”) and kinetic theory—and wanted for example to derive the existence of the electron from something like Maxwell’s equations. (He was particularly interested in the way atomistic processes limit to continua—a problem that I now believe is deeply connected to computational irreducibility, in effect implying another appearance of undecidability, like in Hilbert’s 1st, 2nd and 10th problems.)

Hilbert seemed to feel that physics was a crucial source of raw material for mathematics. But yet he developed a whole program of research based on doing mathematics in a completely formalistic way—where one just writes down axioms and somehow “mechanically” generates all true theorems from them. (He seems to have drawn some distinction between “merely mathematical” questions, and questions about physics, apparently noting—in a certain resonance with my life’s work—that in the latter case “the physicist has the great calculating machine, Nature”.)

In 1899 Hilbert had written down more precise and formal axioms for Euclid’s geometry, and he wanted to go on and figure out how to formulate other areas of mathematics in this kind of axiomatic way. But for more than a decade he seems to have spent most of his time on physics—finally returning to questions about the foundations of mathematics around 1917, giving lectures about “logical calculus” in the winter session of 1920.

By 1920, World War I had come and gone, with comparatively little effect on mathematical life in Göttingen (the nearest battle was in Belgium 200 miles to the west). Hilbert was 58 years old, and had apparently lost quite a bit of his earlier energy (not least as a result of having contracted pernicious anemia [autoimmune vitamin B12 deficiency], whose cure was found only a few years later). But Hilbert was still a celebrity around Göttingen, and generating mathematical excitement. (Among “celebrity gossip” mentioned in a letter home by young Russian topologist Pavel Urysohn is that Hilbert was a huge fan of the gramophone, and that even at his advanced age, in the summer, he would sit in a tree to study.)

I have been able to find out almost nothing about Schönfinkel’s interaction with Hilbert. However, from April to August 1920 Hilbert gave weekly lectures entitled “Problems of Mathematical Logic” which summarized the standard formalism of the field—and the official notes for those lectures were put together by Moses Schönfinkel and Paul Bernays (the “N” initial for Schönfinkel is a typo):

Lecture notes for Hilbert—click to enlarge

Photograph by Cem Bozsahin

A few months after these lectures came, at least from our perspective today, the highlight of Schönfinkel’s time in Göttingen: the talk he gave on December 7, 1920. The venue was the weekly meeting of the Göttingen Mathematics Society, held at 6pm on Tuesdays. The society wasn’t officially part of the university, but it met in the same university “Auditorium Building” that at the time housed the math institute:

Göttingen “Auditorium Building”

The talks at the Göttingen Mathematics Society were listed in the Annual Report of the German Mathematicians Association:

Talks at the Göttingen Mathematics Society—click to enlarge

There’s quite a lineup. November 9, Ludwig Neder (student of Edmund Landau): “Trigonometric Series”. November 16, Erich Bessel-Hagen (student of Carathéodory): “Discontinuous Solutions of Variational Problems”. November 23, Carl Runge (of Runge–Kutta fame, then a Göttingen professor): “American Work on Star Clusters in the Milky Way”. November 30, Gottfried Rückle (assistant of van der Waals): “Explanations of Natural Laws Using a Statistical Mechanics Basis”. And then, December 7: Moses Schönfinkel, “Elements of Logic”.

The next week, December 14, Paul Bernays, who worked with Hilbert and interacted with Schönfinkel, spoke about “Probability, the Arrow of Time and Causality” (yes, there was still a lot of interest around Hilbert in the foundations of physics). January 10+11, Joseph Petzoldt (philosopher of science): “The Epistemological Basis of Special and General Relativity”. January 25, Emmy Noether (of Noether’s theorem fame): “Elementary Divisors and General Ideal Theory”. February 1+8, Richard Courant (of PDE etc. fame) & Paul Bernays: “About the New Arithmetic Theories of Weyl and Brouwer”. February 22, David Hilbert: “On a New Basis for the Meaning of a Number” (yes, that’s foundations of math).

What in detail happened at Schönfinkel’s talk, or as a result of it? We don’t know. But he seems to have been close enough to Hilbert that just over a year later he was in a picture taken for David Hilbert’s 60th birthday on January 23, 1922:

Hilbert’s 60th birthday

There are all sorts of well-known mathematicians in the picture (Richard Courant, Hermann Minkowski, Edmund Landau, …) as well as some physicists (Peter Debye, Theodore von Kármán, Ludwig Prandtl, …). And there near the top left is Moses Schönfinkel, sporting a somewhat surprised expression.

For his 60th birthday Hilbert was given a photo album—with 44 pages of pictures of altogether about 200 mathematicians (and physicists). And there on page 22 is Moses Schönfinkel:

Birthday photo album—click to enlarge

Göttingen University, Cod. Ms. D. Hilbert 754

Album page—click to enlarge

Göttingen University, Cod. Ms. D. Hilbert 754, Bl. 22

Who are the other people on the page with him? Adolf Kratzer (1893–1983) was a student of Arnold Sommerfeld, and a “physics assistant” to Hilbert. Hermann Vermeil (1889–1959) was an assistant to Hermann Weyl, who worked on differential geometry for general relativity. Heinrich Behmann (1891–1970) was a student of Hilbert and worked on mathematical logic, and we’ll encounter him again later. Finally, Carl Ludwig Siegel (1896–1981) had been a student of Landau and would become a well-known number theorist.

Problems Are Brewing

There’s a lot that’s still mysterious about Moses Schönfinkel’s time in Göttingen. But we have one (undated) letter written by Nathan Schönfinkel, Moses’s younger brother, presumably in 1921 or 1922 (yes, he romanizes his name “Scheinfinkel” rather than “Schönfinkel”):

Nathan Scheinfinkel’s letter to David Hilbert—click to enlarge

Göttingen University, Cod. Ms. D. Hilbert 455: 9

Dear Professor!

I received a letter from Rabbi Dr. Behrens in which he wrote that my brother was in need, that he was completely malnourished. It was very difficult for me to read these lines, even more so because I cannot help my brother. I haven’t received any messages or money myself for two years. Thanks to the good people where I live, I am protected from severe hardship. I am able to continue my studies. I hope to finish my PhD in 6 months. A few weeks ago I received a letter from my cousin stating that our parents and relatives are healthy. My cousin is in Kishinev (Bessarabia), now in Romania. He received the letter from our parents who live in Ekaterinoslav. Our parents want to help us but cannot do so because the postal connections are nonexistent. I hope these difficulties will not last long. My brother is helpless and impractical in this material world. He is a victim of his great love for science. Even as a 12 year old boy he loved mathematics, and all window frames and doors were painted with mathematical formulas by him. As a high school student, he devoted all his free time to mathematics. When he was studying at the university in Odessa, he was not satisfied with the knowledge there, and his striving and ideal was Göttingen and the king of mathematics, Prof. Hilbert. When he was accepted in Göttingen, he once wrote to me the following: “My dear brother, it seems to me as if I am dreaming but this is reality: I am in Göttingen, I saw Prof. Hilbert, I spoke to Prof. Hilbert.” The war came and with it suffering. My brother, who is helpless, has suffered more than anyone else. But he did not write to me so as not to worry me. He has a good heart. I ask you, dear Professor, for a few months until the connections with our city are established, to help him by finding a suitable (not harmful to his health) job for him. I will be very grateful to you, dear Professor, if you will answer me.

Sincerely.

N. Scheinfinkel

We’ll talk more about Nathan Schönfinkel later. But suffice it to say here that when he wrote the letter he was a physiology graduate student at the University of Bern—and he would get his PhD in 1922, and later became a professor. But the letter he wrote is probably our best single surviving source of information about the situation and personality of Moses Schönfinkel. Obviously he was a serious math enthusiast from a young age. And the letter implies that he’d wanted to work with Hilbert for some time (presumably hence the German classes in college).

It also implies that he was financially supported in Göttingen by his parents—until this was disrupted by World War I. (And we learn that his parents were OK in the Russian Revolution.) (By the way, the rabbi mentioned is probably a certain Siegfried Behrens, who left Göttingen in 1922.)

There’s no record of any reply to Nathan Schönfinkel’s letter from Hilbert. But at least by the time of Hilbert’s 60th birthday in 1922 Moses Schönfinkel was (as we saw above) enough in the inner circle to be invited to the birthday party.

What else is there in the university archives in Göttingen about Moses Schönfinkel? There’s just one document, but it’s very telling:

Reference for Schönfinkel—click to enlarge

Göttingen University, Unia GÖ, Sek. 335.55

It’s dated 18 March 1924. And it’s a carbon copy of a reference for Schönfinkel. It’s rather cold and formal, and reads:

“The Russian privatdozent [private lecturer] in mathematics, Mr. Scheinfinkel, is hereby certified to have worked in mathematics for ten years with Prof. Hilbert in Göttingen.”

It’s signed (with a stylized “S”) by the “University Secretary”, a certain Ludwig Gossmann, who we’ll be talking about later. And it’s being sent to Ms. Raissa Neuburger, at Bühlplatz 5, Bern. That address is where the Physiology Institute at the University of Bern is now, and also was in 1924. And Raissa Neuburger either was then, or soon would become, Nathan Schönfinkel’s wife.

But there’s one more thing, handwritten in black ink at the bottom of the document. Dated March 20, it’s another note from the University Secretary. It’s annotated “a.a.”, i.e. ad acta—for the records. And in German it reads:

Gott sei Dank, dass Sch weg ist

which translates in English as:

Thank goodness Sch is gone

Hmm. So for some reason at least the university secretary was happy to see Schönfinkel go. (Or perhaps it was a German 1920s version of an HR notation: “not eligible for rehire”.) But let’s analyze this document in a little more detail. It says Schönfinkel worked with Hilbert for 10 years. That agrees with him having arrived in Göttingen in 1914 (which is a date we know for other reasons, as we’ll see below).

But now there’s a mystery. The reference describes Schönfinkel as a “privatdozent”. That’s a definite position at a German university, with definite rules, that in 1924 one would expect to have been rigidly enforced. The basic career track was (and largely still is): first, spend 2–5 years getting a PhD. Then perhaps get recruited for a professorship, or if not, continue doing research, and write a habilitation, after which the university may issue what amounts to an official government “license to teach”, making someone a privatdozent, able to give lectures. Being a privatdozent wasn’t as such a paid gig. But it could be combined with a job like being an assistant to a professor—or something outside the university, like tutoring, teaching high school or working at a company.

So if Schönfinkel was a privatdozent in 1924, where is the record of his PhD, or his habilitation? To get a PhD required “formally publishing” a thesis, and printing (as in, on a printing press) at least 20 or so copies of the thesis. A habilitation was typically a substantial, published research paper. But there’s absolutely no record of any of these things for Schönfinkel. And that’s very surprising. Because there are detailed records for other people (like Paul Bernays) who were around at the time, and were indeed privatdozents.

And what’s more the Annual Report of the German Mathematicians Association—which listed Schönfinkel’s 1920 talk—seems to have listed mathematical goings-on in meticulous detail. Who gave what talk. Who wrote what paper. And most definitely who got a PhD, did a habilitation or became a privatdozent. (And becoming a privatdozent also required an action of the university senate, which was carefully recorded.) But going through all the annual reports of the German Mathematicians Association we find only four mentions of Schönfinkel. There’s his 1920 talk, and also a 1921 talk with Paul Bernays that we’ll discuss later. There’s the publication of his papers in 1924 and 1927. And there’s a single other entry, which says that on November 4, 1924, Richard Courant gave a report to the Göttingen Mathematical Society about a conference in Innsbruck, where Heinrich Behmann reported on “published work by M. Schönfinkel”. (It describes the work as follows: “It is a continuation of Sheffer’s [1913] idea of replacing the elementary operations of symbolic logic with a single one. By means of a certain function calculus, all logical statements (including the mathematical ones) are represented by three basic signs alone.”)

So, it seems, the university secretary wasn’t telling it straight. Schönfinkel might have worked with Hilbert for 10 years. But he wasn’t a privatdozent. And actually it doesn’t seem as if he had any “official status” at all.

So how do we even know that Schönfinkel was in Göttingen from 1914 to 1924? Well, he was Russian, and so in Germany he was an “alien”, and as such he was required to register his address with the local police (no doubt even more so from 1914 to 1918 when Germany was, after all, at war with Russia). And the remarkable thing is that even after all these years, Schönfinkel’s registration card is still right there in the municipal archives of the city of Göttingen:

Address registration card—click to enlarge

Stadtarchiv Göttingen, Meldekartei

So that means we have all Schönfinkel’s addresses during his time in Göttingen. Of course, there are confusions. There’s yet another birthdate for Schönfinkel: September 4, 1889. Wrong year. Perhaps a wrongly done correction from the Julian calendar. Perhaps “adjusted” for some reason of military service obligations. But, in any case, the document says that Moses Schönfinkel from Ekaterinoslav arrived in Göttingen on June 1, 1914, and started living at 6 Lindenstraße (now Felix-Klein-Strasse).

He moved pretty often (11 times in 10 years), and not at particularly systematic times of year. It’s not clear exactly what the setup was in all these places, but at least at the end the registration card (and another document) lists addresses together with “with Frau….”, presumably indicating that he was renting a room in someone’s house.

Where were all those addresses? Well, here’s a map of Göttingen circa 1920, with all of them plotted (along with a red “M” for the location of the math institute):

Map of Göttingen circa 1920—click to enlarge

Stadtarchiv Göttingen, D 2, V a 62

The last item on the registration card says that on March 18, 1924 he departed Göttingen, and went to Moscow. And the note on the copy of the reference saying “thank goodness [he’s] gone” is dated March 20, so that all ties together.

But let’s come back to the reference. Who was this “University Secretary” who seems to have made up the claim that Schönfinkel was a privatdozent? It was fairly easy to find out that his name was Ludwig Gossmann. But the big surprise was to find out that the university archives in Göttingen have nearly 500 pages about him—primarily in connection with a “criminal investigation”.

Here’s the story. Ludwig Gossmann was born in 1878 (so he was 10 years older than Schönfinkel). He grew up in Göttingen, where his father was a janitor at the university. He finished high school but didn’t go to college and started working for the local government. Then in 1906 (at age 28) he was hired by the university as its “secretary”.

The position of “university secretary” was a high-level one. It reported directly to the vice-rector of the university, and was responsible for “general administrative matters” for the university, including, notably, the supervision of international students (of whom there were many, Schönfinkel being one). Ludwig Gossmann held the position of university secretary for 27 years—even while the university had a different rector (normally a distinguished academic) every year.

But Mr. Gossmann also had a sideline: he was involved in real estate. In the 1910s he started building houses (borrowing money from, among others, various university professors). And by the 1920s he had significant real estate holdings—and a business where he rented to international visitors and students at the university.

Years went by. But then, on January 24, 1933, the newspaper headline announced: “Sensational arrest: senior university official Gossmann arrested on suspicion of treason—communist revolution material [Zersetzungsschrift] confiscated from his apartment”. It was said that perhaps it was a setup, and that he’d been targeted because he was gay (though, a year earlier, at age 54, he did marry a woman named Elfriede).

Gossmann in the newspaper headline—click to enlarge

Göttingen University, Kur 3730, Sek 356 2

This was a bad time to be accused of being a communist (Hitler would become chancellor less than a week later, on January 30, 1933, in part propelled by fears of communism). Gossmann was taken to Hanover “for questioning”, but was then allowed back to Göttingen “under house arrest”. He’d had health problems for several years, and died of a heart attack on February 24, 1933.

But none of this really helps us understand why Gossmann would go out on a limb to falsify the reference for Schönfinkel. We can’t specifically find an address match, but perhaps Schönfinkel had at least at some point been a tenant of Gossmann’s. Perhaps he still owed rent. Perhaps he was just difficult in dealing with the university administration. It’s not clear. It’s also not clear why the reference Gossmann wrote was sent to Schönfinkel’s brother in Bern, even though Schönfinkel himself was going to Moscow. Or why it wasn’t just handed to Schönfinkel before he left Göttingen.

The 1924 Paper

Whatever was going on with Schönfinkel in Göttingen in 1924, we know one thing for sure: it was then that he published his remarkable paper about what are now called combinators. Let’s talk in a bit more detail about the paper itself, though I’m discussing the technicalities elsewhere.

First, there’s some timing. At the end of the paper, it says it was received by the journal on March 15, 1924, i.e. just three days before the date of Ludwig Gossmann’s reference for Schönfinkel. And then at the top of the paper, there’s something else: under Schönfinkel’s name it says “in Moskau”, i.e. at least as far as the journal was concerned, Schönfinkel was in Moscow, Russia, at the time the article was published:

“M. Schönfinkel in Moskau”—click to enlarge

There’s also a footnote on the first page of the paper:

Footnote—click to enlarge

“The following thoughts were presented by the author to the Mathematical Society in Göttingen on December 7, 1920. Their formal and stylistic processing for this publication was done by H. Behmann in Göttingen.”

The paper itself is written in a nice, clear and mathematically mature way. Its big result (as I’ve discussed elsewhere) is the introduction of what would later be called combinators: two abstract constructs from which arbitrary functions and computations can be built up. Schönfinkel names one of them S, after the German word “Schmelzen” for “fusion”. The other has become known as K, although Schönfinkel calls it C, even though the German word for “constancy” (which is what would naturally describe it) is “Konstantheit”, which starts with a K.
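For readers who want to see how these act, here’s a minimal sketch in the Wolfram Language (my own notation, not Schönfinkel’s): the two combinators written as rewrite rules, with s and k as symbolic heads, k playing the role of Schönfinkel’s C:

rules = {s[x_][y_][z_] :> x[z][y[z]],  (* S "fuses" its first two arguments onto a shared third *)
         k[x_][y_] :> x};              (* K (Schönfinkel's C) keeps its first argument and discards the second *)

s[f][g][a] //. rules
(* f[a][g[a]] *)

k[a][b] //. rules
(* a *)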

The paper ends with three paragraphs, footnoted with “The considerations that follow are the editor’s” (i.e. Behmann’s). They’re not as clear as the rest of the paper, and contain a confused mistake.

The main part of the paper is “just math” (or computation, or whatever). But here’s the page where S and K (called C here) are first used:

Schönfinkel’s “Über die Bausteine der mathematischen Logik”—click to enlarge

And now there’s something more people-oriented: a footnote to the combinator equation I = SCC saying “This reduction was communicated to me by Mr. Boskowitz; some time before that, Mr. Bernays had called the somewhat less simple one (SC)(CC) to my attention.” In other words, even if nothing else, Schönfinkel had talked to Boskowitz and Bernays about what he was doing.
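As a small check of my own (not something in the paper beyond the footnote itself), one can verify with the same kind of Wolfram Language rewrite rules that both of these expressions behave as the identity on an arbitrary argument x:

rules = {s[a_][b_][c_] :> a[c][b[c]], k[a_][b_] :> a};

s[k][k][x] //. rules       (* Schönfinkel's I = SKK, i.e. I = SCC in his notation *)
(* x *)

s[k][k[k]][x] //. rules    (* Bernays's (SK)(KK) *)
(* x *)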

OK, so we’ve got three people—in addition to David Hilbert—somehow connected to Moses Schönfinkel.

Let’s start with Heinrich Behmann—the person footnoted as “processing” Schönfinkel’s paper for publication:

Heinrich Behmann

He was born in Bremen, Germany in 1891, making him a couple of years younger than Schönfinkel. He arrived in Göttingen as a student in 1911, and by 1914 was giving a talk about Whitehead and Russell’s Principia Mathematica (which had been published in 1910). When World War I started he volunteered for military service, and in 1915 he was wounded in action in Poland (receiving an Iron Cross)—but in 1916 he was back in Göttingen studying under Hilbert, and in 1918 he wrote his PhD thesis on “The Antinomy of the Transfinite Number and Its Resolution by the Theory of Russell and Whitehead” (i.e. using the idea of types to deal with paradoxes associated with infinity).

Behmann continued in the standard academic track (i.e. what Schönfinkel apparently didn’t do)—and in 1921 he got his habilitation with the thesis “Contributions to the Algebra of Logic, in Particular to the Entscheidungsproblem [Decision Problem]”. There’d been other decision problems discussed before, but Behmann said what he meant was a “procedure [giving] complete instructions for determining whether a [logical or mathematical] assertion is true or false by a deterministic calculation after finitely many steps”. And, yes, Alan Turing’s 1936 paper “On Computable Numbers, with an Application to the Entscheidungsproblem” was what finally established that the halting problem, and therefore the Entscheidungsproblem, was undecidable. Curiously, in principle, there should have been enough in Schönfinkel’s paper that this could have been figured out back in 1921 if Behmann or others had been thinking about it in the right way (which might have been difficult before Gödel’s work).

So what happened to Behmann? He continued to work on mathematical logic and the philosophy of mathematics. After his habilitation in 1921 he became a privatdozent at Göttingen (with a job as an assistant in the applied math institute), and then in 1925 got a professorship in Halle in applied math—though, having been an active member of the Nazi Party since 1937, he lost this professorship in 1945 and became a librarian. He died in 1970.

(By the way, even though in 1920 “PM” [Principia Mathematica] was hot—and Behmann was promoting it—Schönfinkel had what in my opinion was the good taste to not explicitly mention it in his paper, referring only to Hilbert’s much-less-muddy ideas about the formalization of mathematics.)

OK, so what about Boskovitz, credited in the footnote with having discovered the classic combinator result I = SKK? That was Alfred Boskovitz, in 1920 a 23-year-old Jewish student at Göttingen, who came from Budapest, Hungary, and worked with Paul Bernays on set theory. Boskovitz is notable for having contributed far more corrections (nearly 200) to Principia Mathematica than anyone else, and being acknowledged (along with Behmann) in a footnote in the (1925–27) second edition. (This edition also gives a reference to Schönwinkel’s [sic] paper at the end of a list of 14 “other contributions to mathematical logic” since the first edition.) In the mid-1920s Boskovitz returned to Budapest. In 1936 he wrote to Behmann that anti-Jewish sentiment there made him concerned for his safety. There’s one more known communication from him in 1942, then no further trace.

The third person mentioned in Schönfinkel’s paper is Paul Bernays, who ended up living a long and productive life, mostly in Switzerland. But we’ll come to him later.

So where was Schönfinkel’s paper published? It was in a journal called Mathematische Annalen (Annals of Mathematics)—probably the top math journal of the time. Here’s its rather swank masthead, with quite a collection of famous names (including physicists like Einstein, Born and Sommerfeld):

Mathematische Annalen

The “instructions to contributors” on the inside cover of each issue had a statement from the “Editorial Office” about not changing things at the proof stage because “according to a calculation they [cost] 6% of the price of a volume”. The instructions then go on to tell people to submit papers to the editors—at their various home addresses (it seems David Hilbert lived just down the street from Felix Klein…):

Home addresses—click to enlarge

Here’s the complete table of contents for the volume in which Schönfinkel’s paper appears:

Table of contents—click to enlarge

There are a variety of famous names here. But particularly notable for our purposes are Aleksandr Khintchine (of Khinchin constant fame) and the topologists Pavel Alexandroff and Pavel Urysohn, who were all from Moscow State University, and who are all indicated, like Schönfinkel, as being “in Moscow”.

There’s a little bit of timing information here. Schönfinkel’s paper was indicated as having been received by the journal on March 15, 1924. The “thank goodness [he’s] gone [from Göttingen]” comment is dated March 20. Meanwhile, the actual issue of the journal with Schönfinkel’s article (number 3 of 4) was published September 15, with table of contents:

Issue of the journal with Schönfinkel’s article—click to enlarge

But note the ominous † next to Urysohn’s name. Turns out his fatal swimming accident was August 17, so—notwithstanding their admonitions—the journal must have added the † quite quickly at the proof stage.

The “1927” Paper

Beyond his 1924 paper on combinators, there’s only one other known piece of published output from Moses Schönfinkel: a paper coauthored with Paul Bernays “On the Decision Problem of Mathematical Logic”:

“On the Decision Problem of Mathematical Logic”—click to enlarge

It’s actually much more widely cited than Schönfinkel’s 1924 combinator paper, but it’s vastly less visionary and ultimately much less significant; it’s really about a technical point in mathematical logic.

About halfway through the paper it has a note:

A note—click to enlarge

“The following thoughts were inspired by Hilbert’s lectures on mathematical logic and date back several years. The decision procedure for a single function F(x, y) was derived by M. Schönfinkel, who first tackled the problem; P. Bernays extended the method to several logical functions, and also wrote the current paper.”

The paper was submitted on March 24, 1927. But in the records of the German Mathematicians Association we find a listing of another talk at the Göttingen Mathematical Society: December 6, 1921, P. Bernays and M. Schönfinkel, “Das Entscheidungsproblem im Logikkalkul”. So the paper had a long gestation period, and (as the note in the paper suggests) it basically seems to have fallen to Bernays to get it written, quite likely with little or no communication with Schönfinkel.

So what else do we know about it? Well, remarkably enough, the Bernays archive contains two notebooks (the paper kind!) by Moses Schönfinkel that are basically an early draft of the paper (with the title already being the same as it finally was, but with Schönfinkel alone listed as the author):

Schönfinkel’s notebooks—click to enlarge

ETH Zurich, Bernays Archive, Hs. 974: 282

These notebooks are basically our best window into the front lines of Moses Schönfinkel’s work. They aren’t dated as such, but at the end of the second notebook there’s a byline of sorts, that lists his street address in Göttingen—and we know he lived at that address from September 1922 until March 1924:

Signature and address—click to enlarge

OK, so what’s in the notebooks? The first page might indicate that the notebooks were originally intended for a different purpose. It’s just a timetable of lectures:

Timetable of lectures—click to enlarge

“Hilbert lectures: Monday: Mathematical foundations of quantum theory; Thursday: Hilbert–Bernays: Foundations of arithmetic; Saturday: Hilbert: Knowledge and mathematical thinking”. (There’s also a slightly unreadable note that seems to say “Hoppe. 6–8… electricity”, perhaps referring to Edmund Hoppe, who taught physics in Göttingen, and wrote a history of electricity.)

But then we’re into 15 pages (plus 6 in the other notebook) of content, written in essentially perfect German, but with lots of parentheticals of different possible word choices:

Page of Schönfinkel’s notebook—click to enlarge

The final paper as coauthored with Bernays begins:

“The central problem of mathematical logic, which is also closely connected to its axiomatic foundations, is the decision problem [Entscheidungsproblem]. And it deals with the following. We have logical formulas which contain logic functions, predicates, …”

Schönfinkel’s version begins considerably more philosophically (here with a little editing for clarity):

“Generality has always been the main goal—the ideal of the mathematician. Generality in the solution, in the method, in the concept and formulation of the theorem, in the problem and question. This tendency is even more pronounced and clearer with modern mathematicians than with earlier ones, and reaches its high point in the work of Hilbert and Ms. Noether. Such an ideal finds its most extreme expression when one faces the problem of “solving all problems”—at least all mathematical problems, because everything else after is easy, as soon as this “Gordian Knot” is cut (because the world is written in “mathematical letters” according to Hilbert).

In just the previous century mathematicians would have been extremely skeptical and even averse to such fantasies… But today’s mathematician has already been trained and tested in the formal achievements of modern mathematics and Hilbert’s axiomatics, and nowadays one has the courage and the boldness to dare to touch this question as well. We owe to mathematical logic the fact that we are able to have such a question at all.

From Leibniz’s bold conjectures, the great logician-mathematicians went step by step in pursuit of this goal, in the systematic structure of mathematical logic: Boole (discoverer of the logical calculus), (Bolzano?), Ernst Schröder, Frege, Peano, Ms. Ladd-Franklin, the two Peirces, Sheffer, Whitehead, Couturat, Huntington, Padoa, Shatunovsky, Sleshinsky, Kagan, Poretsky, Löwenheim, Skolem, … and their numerous students, collaborators and contemporaries … until in 1910–1914 “the system” by Bertrand Russell and Whitehead appeared—the famous “Principia Mathematica”—a mighty titanic work, a large system. Finally came our knowledge of logic from Hilbert’s lectures on (the algebra of) logic (-calculus) and, following on from this, the groundbreaking work of Hilbert’s students: Bernays and Behmann.

The investigations of all these scholars and researchers have led (in no uncertain terms) to the fact that it has become clear that actual mathematics represents a branch of logic. … This emerges most clearly from the treatment and conception of mathematical logic that Hilbert has given. And now, thanks to Hilbert’s approach, we can (satisfactorily) formulate the great decision problem of mathematical logic.”

Pages of the first notebook—click to enlarge
Pages of the second notebook—click to enlarge

We learn quite a bit about Schönfinkel from this. Perhaps the most obvious thing is that he was a serious fan of Hilbert and his approach to mathematics (with a definite shout-out to “Ms. Noether”). It’s also interesting that he refers to Bernays and Behmann as “students” of Hilbert. That’s pretty much correct for Behmann. But Bernays (as we’ll see soon) was more an assistant or colleague of Hilbert’s than a student.

It gives interesting context to see Schönfinkel rattle off a sequence of contributors to what he saw as the modern view of mathematical logic. He begins—quite rightly I think—mentioning “Leibniz’s bold conjectures”. He’s not sure whether Bernard Bolzano fits (and neither am I). Then he lists Schröder, Frege and Peano—all pretty standard choices, involved in building up the formal structure of mathematical logic.

Next he mentions Christine Ladd-Franklin. At least these days, she’s not particularly well known, but she had been a mathematical logic student of Charles Peirce, and in 1881 she’d written a paper about the “Algebra of Logic” which included a truth table, a solid 40 years before Post or Wittgenstein. (In 1891 she had also worked in Göttingen on color vision with the experimental psychologist Georg Müller—who was still there in 1921.) It’s notable that Schönfinkel mentions Ladd-Franklin ahead of the father-and-son Peirces. Next we see Sheffer, who Schönfinkel quotes in connection with Nand in his combinator paper. (No doubt unbeknownst to Schönfinkel, Henry Sheffer—who spent most of his life in the US—was also born in Ukraine, and was also Jewish, and was just 6 years older than Schönfinkel.) I’m guessing Schönfinkel mentions Whitehead next in connection with universal algebra, rather than his later collaboration with Russell.

Next comes Louis Couturat, who frankly wouldn’t have made my list for mathematical logic, but was another “algebra of logic” person, as well as a Leibniz fan, and developer of the Ido language offshoot from Esperanto. Huntington was involved in the axiomatization of Boolean algebra; Padoa was connected to Peano’s program. Shatunovsky, Sleshinsky and Kagan were all professors of Schönfinkel’s in Odessa (as mentioned above), concerned in various ways with foundations of mathematics. Platon Poretsky I must say I had never heard of before; he seems to have done fairly technical work on propositional logic. And finally Schönfinkel lists Löwenheim and Skolem, both of whom are well known in mathematical logic today.

I consider it rather wonderful that Schönfinkel refers to Whitehead and Russell’s Principia Mathematica as a “titanic work” (Titanenwerk). The showy and “overconfident” Titanic had come to grief on its iceberg in 1912, somehow reminiscent of Principia Mathematica, eventually coming to grief on Gödel’s theorem.

At first it might just seem charming—particularly in view of his brother’s comment that “[Moses] is helpless and impractical in this material world”—to see Schönfinkel talk about how after one’s solved all mathematical problems, then solving all problems will be easy, explaining that, after all, Hilbert has said that “the world is written in ‘mathematical letters’”. He says that in the previous century mathematicians wouldn’t have seriously considered “solving everything”, but now, because of progress in mathematical logic, “one has the courage and the boldness to dare to touch this question”.

It’s very easy to see this as naive and unworldly—the writing of someone who knew only about mathematics. But though he didn’t have the right way to express it, Schönfinkel was actually onto something, and something very big. He talks at the beginning of his piece about generality, and about how recent advances in mathematical logic embolden one to pursue it. And in a sense he was very right about this. Because mathematical logic—through work like his—is what led us to the modern conception of computation, which really is successful in “talking about everything”. Of course, after Schönfinkel’s time we learned about Gödel’s theorem and computational irreducibility, which tell us that even though we may be able to talk about everything, we can never expect to “solve every problem” about everything.

But back to Schönfinkel’s life and times. The remainder of Schönfinkel’s notebooks give the technical details of his solution to a particular case of the decision problem. Bernays obviously worked through these, adding more examples as well as some generalization. And Bernays cut out Schönfinkel’s philosophical introduction, no doubt on the (probably correct) assumption that it would seem too airy-fairy for the paper’s intended technical audience.

So who was Paul Bernays? Here’s a picture of him from 1928:

Paul Bernays

Bernays was almost exactly the same age as Schönfinkel (he was born on October 17, 1888—in London, where there was no calendar issue to worry about). He came from an international business family, was a Swiss citizen and grew up in Paris and Berlin. He studied math, physics and philosophy with a distinguished roster of professors in Berlin and Göttingen, getting his PhD in 1912 with a thesis on analytic number theory.

After his PhD he went to the University of Zurich, where he wrote a habilitation (on complex analysis), and became a privatdozent (yes, with the usual documentation, that can still be found), and an assistant to Ernst Zermelo (of ZFC set theory fame). But in 1917 Hilbert visited Zurich and soon recruited Bernays to return to Göttingen. In Göttingen, for apparently bureaucratic reasons, Bernays wrote a second habilitation, this time on the axiomatic structure of Principia Mathematica (again, all the documentation can still be found). Bernays was also hired to work as a “foundations of math assistant” to Hilbert. And it was presumably in that capacity that he—along with Moses Schönfinkel—wrote the notes for Hilbert’s 1920 course on mathematical logic.

Unlike Schönfinkel, Bernays followed a fairly standard—and successful—academic track. He became a professor in Göttingen in 1922, staying there until he was dismissed (because of partially Jewish ancestry) in 1933—after which he moved back to Zurich, where he stayed and worked very productively, mostly in mathematical logic (von Neumann–Bernays–Gödel set theory, etc.), until he died in 1977.

Back when he was in Göttingen one of the things Bernays did with Hilbert was to produce the two-volume classic Grundlagen der Mathematik (Foundations of Mathematics). So did the Grundlagen mention Schönfinkel? It has one mention of the Bernays–Schönfinkel paper, but no direct mention of combinators. However, there is one curious footnote:

Curious footnote-1

This starts “A system of axioms that is sufficient to derive all true implicational formulas was first set up by M. Schönfinkel…”, then goes on to discuss work by Alfred Tarski. So do we have evidence of something else Schönfinkel worked on? Probably.

In ordinary logic, one starts from an axiom system that gives relations, say about And, Or and Not. But, as Sheffer established in 1913, it’s also possible to give an axiom system purely in terms of Nand (and, yes, I’m proud to say that I found the very simplest such axiom system in 2000). Well, it’s also possible to use other bases for logic. And this footnote is about using Implies as the basis. Actually, it’s implicational calculus, which isn’t as strong as ordinary logic, in the sense that it only lets you prove some of the theorems. But there’s a question again: what are the possible axioms for implicational calculus?
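As a quick aside illustrating Sheffer’s point (a check of my own, not anything from Schönfinkel or the Grundlagen), one can verify in the Wolfram Language that And, Or and Not are all expressible in terms of Nand alone:

TautologyQ[Equivalent[!p, Nand[p, p]], {p}]
(* True *)

TautologyQ[Equivalent[p && q, Nand[Nand[p, q], Nand[p, q]]], {p, q}]
(* True *)

TautologyQ[Equivalent[p || q, Nand[Nand[p, p], Nand[q, q]]], {p, q}]
(* True *)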

Well, it seems that Schönfinkel found a possible set of such axioms, though we’re not told what they were; only that Tarski later found a simpler set. (And, yes, I looked for simpler axiom systems for implicational calculus in 2000, but didn’t find any.) So again we see Schönfinkel in effect trying to explore the lowest-level foundations of mathematical logic, though we don’t know any details.

So what other interactions did Bernays have with Schönfinkel? There seems to be no other information in Bernays’s archives. But I have been able to get a tiny bit more information. In a strange chain of connections, someone who’s worked on Mathematica and Wolfram Language since 1987 is Roman Maeder. And Roman’s thesis advisor (at ETH Zurich) was Erwin Engeler—who was a student of Paul Bernays. Engeler (who is now in his 90s) worked for many years on combinators, so of course I had to ask him what Bernays might have told him about Schönfinkel. He told me he recalled only two conversations. He told me he had the impression that Bernays found Schönfinkel a difficult person. He also said he believed that the last time Bernays saw Schönfinkel it was in Berlin, and that Schönfinkel was somehow in difficult circumstances. Any such meeting in Berlin would have had to be before 1933. But try as we might to track it down, we haven’t succeeded.

To Moscow and Beyond…

In the space of three days in March 1924 Moses Schönfinkel—by then 35 years old—got his paper on combinators submitted to Mathematische Annalen, got a reference for himself sent out, and left for Moscow. But why did he go to Moscow? We simply don’t know.

A few things are clear, though. First, it wasn’t difficult to get to Moscow from Göttingen at that time; there was pretty much a direct train there. Second, Schönfinkel presumably had a valid Russian passport (and, one assumes, didn’t have any difficulties from not having served in the Russian military during World War I).

One also knows that there was a fair amount of intellectual exchange and travel between Göttingen and Moscow. The very same volume of Mathematische Annalen in which Schönfinkel’s paper was published has three authors (out of 19) in addition to Schönfinkel listed as being in Moscow: Pavel Alexandroff, Pavel Urysohn and Aleksandr Khinchin. Interestingly, all of these people were at Moscow State University.

And we know there was more exchange with that university. Nikolai Luzin, for example, got his PhD in Göttingen in 1915, and went on to be a leader in mathematics at Moscow State University (until he was effectively dismissed by Stalin in 1936). And we know that for example in 1930, Andrei Kolmogorov, having just graduated from Moscow State University, came to visit Hilbert.

Did Schönfinkel go to Moscow State University? We don’t know (though we haven’t yet been able to access any archives that may be there).

Did Schönfinkel go to Moscow because he was interested in communism? Again, we don’t know. It’s not uncommon to find mathematicians ideologically sympathetic to at least the theory of communism. But communism doesn’t seem to have particularly been a thing in the mathematics or general university community in Göttingen. And indeed when Ludwig Gossmann was arrested in 1933, investigations of who he might have recruited into communism didn’t find anything of substance.

Still, as I’ll discuss later, there is a tenuous reason to think that Schönfinkel might have had some connection to Leon Trotsky’s circle, so perhaps that had something to do with him going to Moscow—though it would have been a bad time to be involved with Trotsky, since by 1925 he was already out of favor with Stalin.

A final theory is that Schönfinkel might have had relatives in Moscow; at least it looks as if some of his Lurie cousins ended up there.

But realistically we don’t know. And beyond the bylines on the journals, we don’t really have any documentary evidence that Schönfinkel was in Moscow. However, there is one more data point, from November 1927 (8 months after the submission of Schönfinkel’s paper with Bernays). Pavel Alexandroff was visiting Princeton University, and when Haskell Curry (who we’ll meet later) asked him about Schönfinkel he was apparently told that “Schönfinkel has… gone insane and is now in a sanatorium & will probably not be able to work any more.”

Ugh! What happened? Once again, we don’t know. Schönfinkel doesn’t seem to have ever been “in a sanatorium” while he was in Göttingen; after all, we have all his addresses, and none of them were sanatoria. Maybe there’s a hint of something in Schönfinkel’s brother’s letter to Hilbert. But are we really sure that Schönfinkel actually suffered from mental illness? There’s a bunch of hearsay that says he did. But then it’s a common claim that logicians who do highly abstract work are prone to mental illness (and, well, yes, there are a disappointingly large number of historical examples).

Mental illness wasn’t handled very well in the 1920s. Hilbert’s only child, his son Franz (who was about five years younger than Schönfinkel), suffered from mental illness, and after a delusional episode that ended up with him in a clinic, David Hilbert simply said “From now on I have to consider myself as someone who does not have a son”. In Moscow in the 1920s—despite some political rhetoric—conditions in psychiatric institutions were probably quite poor, and there was for example quite a bit of use of primitive shock therapy (though not yet electroshock). It’s notable, by the way, that Curry reports that Alexandroff described Schönfinkel as being “in a sanatorium”. But while at that time the word “sanatorium” was being used in the US as a better term for “insane asylum”, in Russia it still had more the meaning of a place for a rest cure. So this still doesn’t tell us if Schönfinkel was in fact “institutionalized”—or just “resting”. (By the way, if there was mental illness involved, another connection for Schönfinkel that doesn’t seem to have been made is that Paul Bernays’s first cousin once removed was Martha Bernays, wife of Sigmund Freud.)

Whether or not he was mentally ill, what would it have been like for Schönfinkel in what was then the Soviet Union in the 1920s? One thing is that in the Soviet system, everyone was supposed to have a job. So Schönfinkel was presumably employed doing something—though we have no idea what. Schönfinkel had presumably been at least somewhat involved with the synagogue in Göttingen (which is how the rabbi there knew to tell his brother he was in bad shape). There was a large and growing Jewish population in Moscow in the 1920s, complete with things like Yiddish newspapers. But by the mid 1930s it was no longer so comfortable to be Jewish in Moscow, and Jewish cultural organizations were being shut down.

By the way, in the unlikely event that Schönfinkel was involved with Trotsky, there could have been trouble even by 1925, and certainly by 1929. And it’s notable that it was a common tactic for Stalin (and others) to claim that their various opponents were “insane”.

So what else do we know about Schönfinkel in Moscow? It’s said that he died there in 1940 or 1942, aged 52–54. Conditions in Moscow wouldn’t have been good then; the so-called Battle of Moscow occurred in the winter of 1941. And there are various stories told about Schönfinkel’s situation at that time.

The closest to a primary source seems to be a summary of mathematical logic in the Soviet Union, written by Sofya Yanovskaya in 1948. Yanovskaya was born in 1896 (so 8 years after Schönfinkel), and grew up in Odessa. She attended the same university there as Schönfinkel, studying mathematics, though she arrived five years after Schönfinkel graduated. She had many of the same professors as Schönfinkel, and, probably like Schönfinkel, was particularly influenced by Shatunovsky. When the Russian Revolution happened, Yanovskaya went “all in”, becoming a serious party operative, but eventually began to teach, first at the Institute of Red Professors, and then from 1925 at Moscow State University—where she became a major figure in mathematical logic, and was eventually awarded the Order of Lenin.

One might perhaps have thought that mathematical logic would be pretty much immune to political issues. But the founders of communism had talked about mathematics, and there was a complex debate about the relationship between Marxist–Leninist ideology and formal ideas in mathematics, notably the Law of Excluded Middle. Sofya Yanovskaya was deeply involved, initially in trying to “bring mathematics to heel”, but later in defending it as a discipline, as well as in editing Karl Marx’s mathematical writings.

It’s not clear to what extent her historical writings were censored or influenced by party considerations, but they certainly contain lots of good information, and in 1948 she wrote a paragraph about Schönfinkel:

Yanovskaya’s paragraph about Schönfinkel

“The work of M. I. Sheinfinkel played a substantial role in the further development of mathematical logic. This brilliant student of S. O. Shatunovsky, unfortunately, left us early. (After getting mentally ill [заболев душевно], M. I. Sheinfinkel passed away in Moscow in 1942.) He did the work mentioned here in 1920, but only published it in 1924, edited by Behmann.”

Unless she was hiding things, this quote doesn’t make it sound as if Yanovskaya knew much about Schönfinkel. (By the way, her own son was apparently severely mentally ill.) A student of Jean van Heijenoort (who we’ll encounter later) named Irving Anellis did apparently in the 1990s ask a student of Yanovskaya’s whether Yanovskaya had known Schönfinkel. Apparently he responded that unfortunately nobody had thought to ask her that question before she died in 1966.

What else do we know? Nothing substantial. The most extensively embellished story I’ve seen about Schönfinkel appears in an anonymous comment on the talk page for the Wikipedia entry about Schönfinkel:

“William Hatcher, while spending time in St Petersburg during the 1990s, was told by Soviet mathematicians that Schönfinkel died in wretched poverty, having no job and but one room in a collective apartment. After his death, the rough ordinary people who shared his apartment burned his manuscripts for fuel (WWII was raging). The few Soviet mathematicians around 1940 who had any discussions with Schönfinkel later said that those mss reinvented a great deal of 20th century mathematical logic. Schönfinkel had no way of accessing the work of Turing, Church, and Tarski, but had derived their results for himself. Stalin did not order Schönfinkel shot or deported to Siberia, but blame for Schönfinkel’s death and inability to publish in his final years can be placed on Stalin’s doorstep. 202.36.179.65 06:50, 25 February 2006 (UTC)”

William Hatcher was a mathematician and philosopher who wrote extensively about the Baháʼí Faith and did indeed spend time at the Steklov Institute of Mathematics in Saint Petersburg in the 1990s—and mentioned Schönfinkel’s technical work in his writings. People I’ve asked at the Steklov Institute do remember Hatcher, but don’t know anything about what it’s claimed he was told about Schönfinkel. (Hatcher died in 2005, and I haven’t been successful at getting any material from his archives.)

So are there any other leads? I did notice that the IP address that originated the Wikipedia comment is registered to the University of Canterbury in New Zealand. So I asked people there and in the New Zealand foundations of math scene. But despite a few “maybe so-and-so wrote that” ideas, nobody shed any light.

OK, so what about at least a death certificate for Schönfinkel? Well, there’s some evidence that the registry office in Moscow has one. But they tell us that in Russia only direct relatives can access death certificates….

Other Schönfinkels…

So far as we know, Moses Schönfinkel never married, and didn’t have children. But he did have a brother, Nathan, who we encountered earlier in connection with the letter he wrote about Moses to David Hilbert. And in fact we know quite a bit about Nathan Scheinfinkel (as he normally styled himself). Here’s a biographical summary from 1932:

Biographical summary

Deutsches Biographisches Archiv, II 1137, 103

The basic story is that he was about five years younger than Moses, and went to study medicine at the University of Bern in Switzerland in April 1914 (i.e. just before World War I began). He got his MD in 1920, then got his PhD on “Gas Exchange and Metamorphosis of Amphibian Larvae after Feeding on the Thyroid Gland or Substances Containing Iodine” in 1922. He did subsequent research on the electrochemistry of the nervous system, and in 1929 became a privatdozent—with official “license to teach” documentation:

License to teach—click to enlarge

(In a piece of bizarre small-worldness, my grandfather, Max Wolfram, also got a PhD in the physiology [veterinary medicine] department at the University of Bern [studying the function of the thymus gland], though that was in 1909, and presumably he had left before Nathan Scheinfinkel arrived.)

But in any case, Nathan Scheinfinkel stayed at Bern, eventually becoming a professor, and publishing extensively, including in English. He became a Swiss citizen in 1932, with the official notice stating:

“Scheinfinkel, Nathan. Son of Ilia Gerschow and Mascha [born] Lurie, born in Yekaterinoslav, Russia, September 13, 1893 (old style). Doctor of medicine, residing in Bern, Neufeldstrasse 5a, husband of Raissa [born] Neuburger.”

In 1947, however, he moved to become a founding professor in a new medical school in Ankara, Turkey. (Note that Turkey, like Switzerland, had been neutral in World War II.) In 1958 he moved again, this time to found the Institute of Physiology at Ege University in Izmir, Turkey, and then at age 67, in 1961, he retired and returned to Switzerland.

Nathan Scheinfinkel

Did Nathan Scheinfinkel have children (whose descendents, at least, might know something about “Uncle Moses”)? It doesn’t seem so. We tracked down Nuran Harirî, now an emeritus professor, but in the 1950s a young physiology resident at Ege University responsible for translating Nathan Scheinfinkel’s lectures into Turkish. She said that Nathan Scheinfinkel was at that point living in campus housing with his wife, but she never heard mention of any children, or indeed of any other family members.

What about any other siblings? Amazingly, looking through handwritten birth records from Ekaterinoslav, we found one! Debora Schönfinkel, born December 22, 1889 (i.e. January 3, 1890, in the modern calendar):

Debora Schönfinkel’s birth record—click to enlarge

So Moses Schönfinkel had a younger sister, as well as a younger brother. And we even know that his sister graduated from 7th grade in June 1907. But we don’t know anything else about her, or about other siblings. We know that Schönfinkel’s mother died in 1936, at the age of 74.

Might there have been other Schönfinkel relatives in Ekaterinoslav? Perhaps, but it’s unlikely they survived World War II—because in one of those shocking and tragic pieces of history, over a four-day period in February 1942 almost the whole Jewish population of 30,000 was killed.

Could there be other Schönfinkels elsewhere? The name is not common, but it does show up (with various spellings and transliterations), both before and after Moses Schönfinkel. There’s a Scheinfinkel Russian revolutionary buried in the Kremlin Wall; there was a Lovers of Zion delegate Scheinfinkel from Ekaterinoslav. There was a Benjamin Scheinfinkel in New York City in the 1940s; a Shlomo Scheinfinkel in Haifa in the 1930s. There was even a certain curiously named Bas Saul Haskell Scheinfinkel born in 1875. But despite quite a bit of effort, I’ve been unable to locate any living relative of Moses Schönfinkel. At least so far.

Haskell Curry

What happened with combinators after Schönfinkel published his 1924 paper? Initially, so far as one can tell, nothing. That is, until Haskell Curry found Schönfinkel’s paper in the library at Princeton University in November 1927—and launched into a lifetime of work on combinators.

Who was Haskell Curry? And why did he know to care about Schönfinkel’s paper?

Haskell Curry

Haskell Brooks Curry was born on September 12, 1900, in a small town near Boston, MA. His parents were both elocution educators, who by the time Haskell Curry was born were running the School of Expression (which had evolved from his mother’s Boston-based School of Elocution and Expression). (Many years later, the School of Expression would evolve into Curry College in Milton, Massachusetts—which happens to be where for several years we held our Wolfram Summer School, often noting the “coincidence” of names when combinators came up.)

Haskell Curry went to college at Harvard, graduating in mathematics in 1920. After a couple of years doing electrical engineering, he went back to Harvard, initially working with Percy Bridgman, who was primarily an experimental physicist, but was writing a philosophy of science book entitled The Logic of Modern Physics. And perhaps through this Curry got introduced to Whitehead and Russell’s Principia Mathematica.

But in any case, there’s a note in his archive about Principia Mathematica dated May 20, 1922:

Note about Principia Mathematica—click to enlarge

Haskell P. Curry papers, PSUA 222, Special Collections Library, Pennsylvania State University

Curry seems—perhaps like an electrical engineer or a “pre-programmer”—to have been very interested in the actual process of mathematical logic, starting his notes with: “No logical process is possible without the phenomenon of substitution.” He continued, trying to break down the process of substitution.

But then his notes end, more philosophically, and perhaps with “expression” influence: “Phylogenetic origin of logic: 1. Sensation; 2. Association: Red hot poker–law of permanence”.

At Harvard Curry started working with George Birkhoff towards a PhD on differential equations. But by 1927–8 he had decided to switch to logic, and was spending a year as an instructor at Princeton. And it was there—in November 1927—that he found Schönfinkel’s paper. Preserved in his archives are the notes he made:

Curry’s notes—click to enlarge Curry’s notes—click to enlarge Curry’s notes—click to enlarge

Haskell P. Curry papers, PSUA 222, Special Collections Library, Pennsylvania State University

At the top there’s a date stamp of November 28, 1927. Then Curry writes: “This paper anticipates much of what I have done”—then launches into a formal summary of Schönfinkel’s paper (charmingly using f@x to indicate function application—just as we do in Wolfram Language, except his is left associative…).
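To spell out that associativity aside (a small illustration of mine): in the Wolfram Language @ groups to the right, so f@g@x means f[g[x]], whereas Curry’s left-associative reading would correspond to f[g][x]:

f @ g @ x // FullForm
(* f[g[x]] *)

f[g][x] // FullForm
(* f[g][x] *)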

He ends his “report” with “In criticism I might say that no formal development have been undertaken in the above. Equality is taken intuitively and such things as universality, and proofs of identity are shown on the principle that if for every z, x@z : y@z then x=y ….”

But then there’s another piece:

Curry’s notes—click to enlarge

“On discovery of this paper I saw Prof. Veblen. Schönfinkel’s paper said ‘in Moskau’. Accordingly we sought out Paul Alexandroff. The latter says Schönfinkel has since gone insane and is now in a sanatorium & will probably not be able to work any more. The paper was written with help of Paul Bernays and Behman [sic]; who would presumably be the only people in the world who would write on that subject.”

What was the backstory to this? Oswald Veblen was a math professor at Princeton who had worked on the axiomatization of geometry and was by then working on topology. Pavel Alexandroff (who we encountered earlier) was visiting from Moscow State University for the year, working on topology with Hopf, Lefschetz, Veblen and Alexander. I’m not quite sure why Curry thought Bernays and Behmann “would be the only people in the world who would write on that subject”; I don’t see how he could have known.

Curry’s notes—click to enlarge

Curry continues: “It was suggested I write to Bernays, who is außerord. prof. [long-term lecturer] at Göttingen.” But then he adds—in depressingly familiar academic form: “Prof. Veblen thought it unwise until I had something definite ready to publish.”

Curry’s notes—click to enlarge

“A footnote to Schönfinkel’s paper said the ideas were presented before Math Gesellschaft in Göttingen on Dec. 7, 1920 and that its formal and elegant [sic] write up was due to H. Behman”. “Elegant” is a peculiar translation of “stilistische” that probably gives Behmann too much credit; a more obvious translation might be “stylistic”.

Curry continues: “Alexandroff’s statements, as I interpret them, are to the effect that Bernays, Behman, Ackermann, von Neumann, Schönfinkel & some others form a small school of math logicians working on this & similar topics in Göttingen.”

And so it was that Curry resolved to study in Göttingen, and do his PhD in logic there. But before he left for Göttingen, Curry wrote a paper (published in 1929):

Curry’s “An Analysis of Logical Substitution”—click to enlarge

Already there’s something interesting in the table of contents: the use of the word “combinatory”, which, yes, in Curry’s care is going to turn into “combinator”.

The paper starts off reading a bit like a student essay, and one’s not encouraged by a footnote a few pages in:

“In the writing the foregoing account I have naturally made use of any ideas I may have gleaned from reading the literature. The writings of Hilbert are fundamental in this connection. I hope that I have added clearness to certain points where the existing treatments are obscure.” [“Clearness” not “clarity”?]

Then, towards the end of the “Preliminary Discussion” is this:

Curry’s “An Analysis of Logical Substitution”—click to enlarge

And the footnote says: “See the paper of Schönfinkel cited below”. It’s (so far as I know) the first-ever citation to Schönfinkel’s paper!

On the next page Curry starts to give details. Curry starts talking about substitution, then says (in an echo of modern symbolic language design) this relates to the idea of “transformation of functions”:

Curry’s “An Analysis of Logical Substitution”—click to enlarge

At first he’s off talking about all the various combinatorial arrangements of variables, etc. But then he introduces Schönfinkel—and starts trying to explain in a formal way what Schönfinkel did. And even though he says he’s talking about what one assumes is structural substitution, he seems very concerned about what equality means, and how Schönfinkel didn’t quite define that. (And, of course, in the end, with universal computation, undecidability, etc. we know that the definition of equality wasn’t really accessible in the 1920s.)

By the next page, here we are, S and K (Curry renamed Schönfinkel’s C):

Curry’s “An Analysis of Logical Substitution”—click to enlarge

At first he’s imagining that the combinators have to be applied to something (i.e. f[x] not just f). But by the next page he comes around to what Schönfinkel was doing in looking at “pure combinators”:

Curry’s “An Analysis of Logical Substitution”—click to enlarge

The rest of the paper is basically concerned with setting up combinators that can successively represent permutations—and it certainly would have been much easier if Curry had had a computer (and one could imagine minimal “combinator sorters” like minimal sorting networks):

Curry’s “An Analysis of Logical Substitution”—click to enlarge

After writing this paper, Curry went to Göttingen—where he worked with Bernays. I must say that I’m curious what Bernays said to Curry about Schönfinkel (was it more than to Erwin Engeler?), and whether other people around Göttingen even remembered Schönfinkel, who by then had been gone for more than four years. In 1928, travel in Europe was open enough that Curry should have had no trouble going, for example, to Moscow, but there’s no evidence he made any effort to reach out to Schönfinkel. But in any case, in Göttingen he worked on combinators, and over the course of a year produced his first official paper on “combinatory logic”:

“Grundlagen der kombinatorischen Logik”—click to enlarge

Strangely, the paper was published in an American journal—as the only paper not in English in that volume. The paper is more straightforward, and in many ways more “Schönfinkel like”. But it was just the first of many papers that Curry wrote about combinators over the course of nearly 50 years.

Curry was particularly concerned with the “mathematicization” of combinators, finding and fixing problems with axioms invented for them, connecting to other formalisms (notably Church’s lambda calculus), and generally trying to prove theorems about what combinators do. But more than that, Curry spread the word about combinators far and wide. And before long most people viewed him as “Mr. Combinator”, with Schönfinkel at most a footnote.

In the book on Combinatory Logic that Haskell Curry and Robert Feys published in 1958, there’s a historical footnote—one that gives the impression that Curry “almost” had Schönfinkel’s ideas before he saw Schönfinkel’s paper in 1927:

“Combinatory Logic”—click to enlarge
“Combinatory Logic”—click to enlarge

I have to say that I don’t think that’s a correct impression. What Schönfinkel did was much more singular than that. It’s plausible to think that others (and particularly Curry) could have had the idea that there could be a way to go “below the operations of mathematical logic” and find more fundamental building blocks based on understanding things like the process of substitution. But the actuality of how Schönfinkel did it is something quite different—and something quite unique.

And when one sees Schönfinkel’s S combinator: what mind could have come up with such a thing? Even Curry says he didn’t really understand the significance of the S combinator until the 1940s.

I suppose if one’s just thinking of combinatory logic as a formal system with a certain general structure then it might not seem to matter that things as simple as S and K can be the ultimate building blocks. But the whole point of what Schönfinkel was trying to do (as the title of his paper says) was to find the “building blocks of logic”. And the fact that he was able to do it—especially in terms of things as simple as S and K—was a great and unique achievement. And not something that (despite all the good he did for combinators) Curry did.

Schönfinkel Rediscovered

In the decade or so after Schönfinkel’s paper appeared, Curry occasionally referenced it, as did Church and a few other closely connected people. But soon Schönfinkel’s paper—and Schönfinkel himself—disappeared completely from view, and standard databases list no citations.

But in 1967 Schönfinkel’s paper was seen again—now even translated into English. The venue was a book called From Frege to Gödel: A Source Book in Mathematical Logic, 1879–1931. And there, sandwiched between von Neumann on transfinite numbers and Hilbert on “the infinite”, is Schönfinkel’s paper, in English, with a couple of pages of introduction by Willard Van Orman Quine. (And indeed it was from this book that I myself first became aware of Schönfinkel and his work.)

But how did Schönfinkel’s paper get into the book? And do we learn anything about Schönfinkel from its appearance there? Maybe. The person who put the book together was a certain Jean van Heijenoort, who himself had a colorful history. Born in 1912, he grew up mostly in France, and went to college to study mathematics—but soon became obsessed with communism, and in 1932 left to spend what ended up being nearly ten years working as a kind of combination PR person and bodyguard for Leon Trotsky, initially in Turkey but eventually in Mexico. Having married an American, van Heijenoort moved to New York City, eventually enrolling in a math PhD program, and becoming a professor doing mathematical logic (though with some colorful papers along the way, with titles like “The Algebra of Revolution”).

Why is this relevant? Well, the question is: how did van Heijenoort know about Schönfinkel? Perhaps it was just through careful scholarship. But just maybe it was through Trotsky. There’s no real evidence, although it is known that during his time in Mexico, Trotsky did request a copy of Principia Mathematica (or was it his “PR person”?). But at least if there was a Trotsky connection it could help explain Schönfinkel’s strange move to Moscow. But in the end we just don’t know.

What Should We Make of Schönfinkel?

When one reads about the history of science, there’s a great tendency to get the impression that big ideas come suddenly to people. But my historical research—and my personal experience—suggest that that’s essentially never what happens. Instead, there’s usually a period of many years in which some methodology or conceptual framework gradually develops, and only then can the great idea emerge.

So with Schönfinkel it’s extremely frustrating that we just can’t see that long period of development. The records we have just tell us that Schönfinkel announced combinators on December 7, 1920. But how long had he been working towards them? We just don’t know.

On the face of it, his paper seems simple—the kind of thing that could have been dashed off in a few weeks. But I think it’s much more likely that it was the result of a decade of development—of which, through foibles of history, we now have no trace.

Yes, what Schönfinkel finally came up with is simple to explain. But to get to it, he had to cut through a whole thicket of technicality—and see the essence of what lay beneath. My life as a computational language designer has often involved doing very much this same kind of thing. And at the end of it, what you come up with may seem in retrospect “obvious”. But to get there often requires a lot of hard intellectual work.

And in a sense what Schönfinkel did was the most impressive possible version of this. There were no computers. There was no ambient knowledge of computation as a concept. Yet Schönfinkel managed to come up with a system that captures the core of those ideas. And while he didn’t quite have the language to describe it, I think he did have a sense of what he was doing—and the significance it could have.

What was the personal environment in which Schönfinkel did all this? We just don’t know. We know he was in Göttingen. We don’t think he was involved in any particularly official way with the university. Most likely he was just someone who was “around”. Clearly he had some interaction with people like Hilbert and Bernays. But we don’t know how much. And we don’t really know if they ever thought they understood what Schönfinkel was doing.

Even when Curry picked up the idea of combinators—and did so much with it—I don’t think he really saw the essence of what Schönfinkel was trying to do. Combinators and Schönfinkel are a strange episode in intellectual history. A seed sown far ahead of its time by a person who left surprisingly few traces, and about whom we know personally so little.

But much as combinators represent a way of getting at the essence of computation, perhaps in combinators we have the essence of Moses Schönfinkel: years of a life compressed to two “signs” (as he would call them) S and K. And maybe if the operation we now call currying needs a symbol we should be using the “sha” character Ш from the beginning of Schönfinkel’s name to remind us of a person about whom we know so little, but who planted a seed that gave us so much.

Thanks

Many people and organizations have helped in doing research and providing material for this piece. Thanks particularly to Hatem Elshatlawy (fieldwork in Göttingen, etc.), Erwin Engeler (first-person history), Unal Goktas (Turkish material), Vitaliy Kaurov (locating Ukraine + Russia material), Anna & Oleg Marichev (interpreting old Russian handwriting), Nik Murzin (fieldwork in Moscow), Eila Stiegler (German translations), Michael Trott (interpreting German). Thanks also for input from Henk Barendregt, Semih Baskan, Metin Baştuğ, Cem Boszahin, Jason Cawley, Jack Copeland, Nuran Hariri, Ersin Koylu, Alexander Kuzichev, Yuri Matiyasevich, Roman Maeder, Volker Peckhaus, Jonathan Seldin, Vladimir Shalack, Matthew Szudzik, Christian Thiel, Richard Zach. Particular thanks to the following archives and staff: Berlin State Library [Gabriele Kaiser], Bern University Archive [Niklaus Bütikofer], ETHZ (Bernays) Archive [Flavia Lanini, Johannes Wahl], Göttingen City Archive [Lena Uffelmann], Göttingen University [Katarzyna Chmielewska, Bärbel Mund, Petra Vintrová, Dietlind Willer].

Launching Version 12.2 of Wolfram Language & Mathematica: 228 New Functions and Much More…


Yet Bigger than Ever Before

When we released Version 12.1 in March of this year, I was pleased to be able to say that with its 182 new functions it was the biggest .1 release we’d ever had. But just nine months later, we’ve got an even bigger release! Version 12.2, launching today, has 228 completely new functions!

Launching Version 12.2 of Wolfram Language & Mathematica: 228 New Functions and Much More...

We always have a portfolio of development projects going on, with any given project taking anywhere from a few months to more than a decade to complete. And of course it’s a tribute to our whole Wolfram Language technology stack that we’re able to develop so much, so quickly. But Version 12.2 is perhaps all the more impressive for the fact that we didn’t concentrate on its final development until mid-June of this year. Because between March and June we were concentrating on 12.1.1, which was a “polishing release”. No new features, but more than a thousand outstanding bugs fixed (the oldest being a documentation bug from 1993):


More than a thousand bugs fixed in 12.1.1

How did we design all those new functions and new features that are now in 12.2? It’s a lot of work! And it’s what I personally spend a lot of my time on (along with other “small items” like physics, etc.). But for the past couple of years we’ve done our language design in a very open way—livestreaming our internal design discussions, and getting all sorts of great feedback in real time. So far we’ve recorded about 550 hours—of which Version 12.2 occupied at least 150 hours.

Live CEOing

By the way, in addition to all of the fully integrated new functionality in 12.2, there’s also been significant activity in the Wolfram Function Repository—and even since 12.1 was released 534 new, curated functions for all sorts of specialized purposes have been added there.

Biomolecular Sequences: Symbolic DNA, Proteins, etc.

There are so many different things in so many areas in Version 12.2 that it’s hard to know where to start. But let’s talk about a completely new area: bio-sequence computation. Yes, we’ve had gene and protein data in the Wolfram Language for more than a decade. But what’s new in 12.2 is the beginning of the ability to do flexible, general computation with bio sequences. And to do it in a way that fits in with all the chemical computation capabilities we’ve been adding to the Wolfram Language over the past few years.

Here’s how we represent a DNA sequence (and, yes, this works with very long sequences too):

BioSequence
BioSequence["DNA", "CTTTTCGAGATCTCGGCGTCA"]

This translates the sequence to a peptide (like a “symbolic ribosome”):

BioSequenceTranslate
BioSequenceTranslate[%]

Now we can find out what the corresponding molecule is:

Molecule
Molecule[%]

And visualize it in 3D (or compute lots of properties):

MoleculePlot3D
MoleculePlot3D[%]

I have to say that I agonized a bit about the “non-universality” of putting the specifics of “our” biology into our core language… but it definitely swayed my thinking that, of course, all our users are (for now) definitively eukaryotes. Needless to say, though, we’re set up to deal with other branches of life too:

Entity
Entity["GeneticTranslationTable", 
  "AscidianMitochondrial"]["StartCodons"]

You might think that handling genome sequences is “just string manipulation”—and indeed our string functions are now set up to work with bio sequences:

StringReverse
StringReverse[BioSequence["DNA", "CTTTTCGAGATCTCGGCGTCA"]]

But there’s also a lot of biology-specific additional functionality. Like this finds a complementary base-pair sequence:

BioSequenceComplement
BioSequenceComplement[BioSequence["DNA", "CTTTTCGAGATCTCGGCGTCA"]]

Actual, experimental sequences often have base pairs that are somehow uncertain—and there are standard conventions for representing this (e.g. “S” means C or G; “N” means any base). And now our string patterns also understand things like this for bio sequences:

StringMatchQ
StringMatchQ[BioSequence["DNA", "CTTT"], "STTT"]

And there are new functions like BioSequenceInstances for resolving degenerate characters:

BioSequenceInstances
BioSequenceInstances[BioSequence["DNA", "STTT"]]

BioSequence is also completely integrated with our built-in genome and protein data. Here’s a gene that we can ask for in natural language “Wolfram|Alpha style”:

BioSequence
BioSequence[CloudGet["https://wolfr.am/ROWvGTNr"]]

Now we ask to do sequence alignment between these two genes (in this case, both human—which is, needless to say, the default):

BioSequence

What’s in 12.2 is really just the beginning of what we’re planning for bio-sequence computation. But already you can do very flexible things with large datasets. And, for example, it’s now straightforward for me to read my genome in from FASTA files and start exploring it…

BioSequence
BioSequence["DNA", 
 First[Import["Genome/Consensus/c1.fa.consensus.fa"]]]

 

Spatial Statistics & Modeling

Locations of birds’ nests, gold deposits, houses for sale, defects in a material, galaxies…. These are all examples of spatial point datasets. And in Version 12.2 we now have a broad collection of functions for handling such datasets.

Here’s the “spatial point data” for the locations of US state capitals:

SpatialPointData
SpatialPointData[
 GeoPosition[EntityClass["City", "UnitedStatesCapitals"]]]

Since it’s geo data, it’s plotted on a map:

PointValuePlot
PointValuePlot[%]

Let’s restrict our domain to the contiguous US:

capitals = SpatialPointData
capitals = 
  SpatialPointData[
   GeoPosition[EntityClass["City", "UnitedStatesCapitals"]], 
   Entity["Country", "UnitedStates"]];
PointValuePlot
PointValuePlot[capitals]

Now we can start computing spatial statistics. Like here’s the mean density of state capitals:

MeanPointDensity
MeanPointDensity[capitals]

Assume you’re in a state capital. Here’s the probability to find the nearest other state capital a certain distance away:

NearestNeighborG
NearestNeighborG[capitals]
Plot
Plot[%[Quantity[r, "Miles"]], {r, 0, 400}]

This tests whether the state capitals are randomly distributed; needless to say, they’re not:

SpatialRandomnessTest
SpatialRandomnessTest[capitals]

In addition to computing statistics from spatial data, Version 12.2 can also generate spatial data according to a wide range of models. Here’s a model that picks “center points” at random, then has other points clustered around them:

PointValuePlot
PointValuePlot[
 RandomPointConfiguration[MaternPointProcess[.0001, 1, .1, 2], 
  CloudGet["https://wolfr.am/ROWwlIqR"]]]

You can also go the other way around, and fit a spatial model to data:

EstimatedPointProcess
EstimatedPointProcess[capitals, 
 MaternPointProcess[\[Mu], \[Lambda], r, 2], {\[Mu], \[Lambda], r}]

 

Convenient Real-World PDEs

In some ways we’ve been working towards it for 30 years. We first introduced NDSolve back in Version 2.0, and we’ve been steadily enhancing it ever since. But our long-term goal has always been convenient handling of real-world PDEs of the kind that appear throughout high-end engineering. And in Version 12.2 we’ve finally got all the pieces of underlying algorithmic technology to be able to create a truly streamlined PDE-solving experience.

OK, so how do you specify a PDE? In the past, it was always done explicitly in terms of particular derivatives, boundary conditions, etc. But most PDEs used in engineering, for example, are built from higher-level components that “package together” derivatives, boundary conditions, etc. to represent features of physics, materials and so on.

The lowest level of our new PDE framework consists of symbolic “terms”, corresponding to common mathematical constructs that appear in real-world PDEs. For example, here’s a 2D “Laplacian term”:

LaplacianPDETerm
LaplacianPDETerm[{u[x, y], {x, y}}]

And now this is all it takes to find the first 5 eigenvalues of the Laplacian in a regular polygon:

NDEigenvalues
NDEigenvalues[LaplacianPDETerm[{u[x, y], {x, y}}], 
 u[x, y], {x, y} \[Element] RegularPolygon[5], 5]

And the important thing is that you can put this kind of operation into a whole pipeline. Like here we’re getting the region from an image, solving for the 10th eigenmode, and then 3D plotting the result:

NDEigensystem
NDEigensystem[{LaplacianPDETerm[{u[x, y], {x, y}}]}, u[x, y],
  {x, y} \[Element] ImageMesh[CloudGet["https://wolfr.am/ROWwBtE7"]], 
  10][[2, -1]]
Plot3D
Plot3D[%, {x, y} \[Element] 
  ImageMesh[CloudGet["https://wolfr.am/ROWwGqjg"]]]

In addition to LaplacianPDETerm, there are things like DiffusionPDETerm and ConvectionPDETerm that represent other terms that arise in real-world PDEs. Here’s a term for isotropic diffusion with unit diffusion coefficient:

DiffusionPDETerm
DiffusionPDETerm[{\[Phi][x, y, z], {x, y, z}}]

Beyond individual terms, there are also “components” that combine multiple terms, usually with various parameters. Here’s a Helmholtz PDE component:

HelmholtzPDEComponent
HelmholtzPDEComponent[{u[x, y], {x, y}}, <|"HelmholtzEigenvalue" -> k|>]

By the way, it’s worth pointing out that our “terms” and “components” are set up to represent the symbolic structure of PDEs in a form suitable for structural manipulation and for things like numerical analysis. And to ensure that they maintain their structure, they’re normally kept in an inactivated form. But you can always “activate” them if you want to do things like algebraic operations:

Activate
Activate[%]

In real-world PDEs, one’s often dealing with actual, physical processes taking place in actual physical materials. And in Version 12.2 we’ve got immediate ways to deal not only with things like diffusion, but also with acoustics, heat transfer and mass transport—and to feed in properties of actual materials. Typically the structure is that there’s a PDE “component” that represents the bulk behavior of the material, together with a variety of PDE “values” or “conditions” that represent boundary conditions.

Here’s a typical PDE component, using material properties from the Wolfram Knowledgebase:

HeatTransferPDEComponent
HeatTransferPDEComponent[{\[CapitalTheta][t, x, y], t, {x, y}}, <|
  "Material" -> CloudGet["https://wolfr.am/ROWwUQai"]|>]

There’s quite a bit of diversity and complexity to the possible boundary conditions. For example, for heat transfer, there’s HeatFluxValue, HeatInsulationValue and five other symbolic boundary condition specification constructs. In each case, the basic idea is to say where (geometrically) the condition applies, then what it applies to, and what parameters relate to it.

So, for example, here’s a condition that specifies that there’s a fixed “surface temperature” θ0 everywhere outside the (circular) region defined by x2 + y2 = 1:

HeatTemperatureCondition
HeatTemperatureCondition[
 x^2 + y^2 > 1, {\[CapitalTheta][t, x, y], t, {x, y}}, <|
  "SurfaceTemperature" -> Subscript[\[Theta], 0]|>]

What’s basically happening here is that our high-level “physics” description is being “compiled” into explicit “mathematical” PDE structures—like Dirichlet boundary conditions.

OK, so how does all this fit together in a real-life situation? Let me show an example. But first, let me tell a story. Back in 2009 I was having tea with our lead PDE developer. I picked up a teaspoon and asked “When will we be able to model the stresses in this?” Our lead developer explained that there was quite a bit to build to get to that point. Well, I’m excited to say that after 11 years of work, in Version 12.2 we’re there. And to prove it, our lead developer just gave me… a (computational) spoon!

spoon = CloudGet
spoon = CloudGet["https://wolfr.am/ROWx6wKF"];

The core of the computation is a 3D diffusion PDE term, with a “diffusion coefficient” given by a rank-4 tensor parametrized by Young’s modulus (here Y) and Poisson ratio (ν):

pdeterm = DiffusionPDETerm
pdeterm = 
  DiffusionPDETerm[{{u[x, y, z], v[x, y, z], w[x, y, z]}, {x, y, z}}, 
   Y/(1 + \[Nu]) {
     {{
       {(1 - \[Nu])/(1 - 2 \[Nu]), 0, 0},
       {0, 1/2, 0},
       {0, 0, 1/2}
      }, {
       {0, \[Nu]/(1 - 2 \[Nu]), 0},
       {1/2, 0, 0},
       {0, 0, 0}
      }, {
       {0, 0, \[Nu]/(1 - 2 \[Nu])},
       {0, 0, 0},
       {1/2, 0, 0}
      }},
     {{
       {0, 1/2, 0},
       {\[Nu]/(1 - 2 \[Nu]), 0, 0},
       {0, 0, 0}
      }, {
       {1/2, 0, 0},
       {0, (1 - \[Nu])/(1 - 2 \[Nu]), 0},
       {0, 0, 1/2}
      }, {
       {0, 0, 0},
       {0, 0, \[Nu]/(1 - 2 \[Nu])},
       {0, 1/2, 0}
      }},
     {{
       {0, 0, 1/2},
       {0, 0, 0},
       {\[Nu]/(1 - 2 \[Nu]), 0, 0}
      }, {
       {0, 0, 0},
       {0, 0, 1/2},
       {0, \[Nu]/(1 - 2 \[Nu]), 0}
      }, {
       {1/2, 0, 0},
       {0, 1/2, 0},
       {0, 0, (1 - \[Nu])/(1 - 2 \[Nu])}
      }}
    }, <|Y -> 10^9, \[Nu] -> 33/100|>];

There are boundary conditions to specify how the spoon is being held, and pushed. Then solving the PDE (which takes just a few seconds) gives the displacement field for the spoon

dfield = NDSolveValue
dfield = deformations = 
   NDSolveValue[{pdeterm == {0, NeumannValue[-1000, x <= -100], 0}, 
     DirichletCondition[{u[x, y, z] == 0., v[x, y, z] == 0., 
       w[x, y, z] == 0.}, x >= 100]}, {u, v, w}, {x, y, z} \[Element] 
     spoon];

which we can then use to find how the spoon would deform:

Show
Show[MeshRegion[
  Table[Apply[if, m], {m, MeshCoordinates[spoon]}, {if, 
     deformations}] + MeshCoordinates[spoon], 
  MeshCells[spoon, {2, All}]], 
 Graphics3D[Style[spoon, LightGray]]]

PDE modeling is a complicated area, and I consider it to be a major achievement that we’ve now managed to “package” it as cleanly as this. But in Version 12.2, in addition to the actual technology of PDE modeling, something else that’s important is a large collection of computational essays about PDE modeling—altogether about 400 pages of detailed explanation and application examples, currently in acoustics, heat transfer and mass transport, but with many other domains to come.

Just Type TEX

The Wolfram Language is all about expressing yourself in precise computational language. But in notebooks you can also express yourself with ordinary text in natural language. But what if you want to display math in there as well? For 25 years we’ve had the infrastructure to do the math display—through our box language. But the only convenient way to enter the math is through Wolfram Language math constructs—that in some sense have to have computational meaning.

But what about “math” that’s “for human eyes only”? That has a certain visual layout that you want to specify, but that doesn’t necessarily have any particular underlying computational meaning that’s been defined? Well, for many decades there’s been a good way to specify such math, thanks to my friend Don Knuth: just use TEX. And in Version 12.2 we’re now supporting direct entry of TEX math into Wolfram Notebooks, both on the desktop and in the cloud. Underneath, the TEX is being turned into our box representation, so it structurally interoperates with everything else. But you can just enter it—and edit it—as TEX.

The interface is very much like the ctrl+= interface for Wolfram|Alpha-style natural language input. But for TEX (in a nod to standard TEX delimiters), it’s ctrl+$.

Type ctrl+$ and you get a TEX input box. When you’ve finished the TEX, just hit Return and it’ll be rendered:

TeX

Like with ctrl+=, if you click the rendered form, it’ll go back to text and you can edit again, just as TEX.

Entering TEX in text cells is the most common thing to want. But Version 12.2 also supports entering TEX in input cells:

TeX typing

What happens if you then evaluate the cell? Your input will be treated as TraditionalForm, and at least an attempt will be made to interpret it. Though, of course, if you wrote “computationally meaningless math” that won’t work.
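This notebook interface is new in 12.2, but as a rough programmatic illustration of the same kind of interpretation (and not the new typing interface itself), ToExpression can already parse a TEX string:

(* a minimal sketch: interpret a TEX string as a Wolfram Language expression *)
ToExpression["\\frac{x}{1+x^2}", TeXForm]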

Just Draw Anything

Type Canvas[] and you’ll get a blank canvas to draw whatever you want:

Canvas[]
Canvas[]

We’ve worked hard to make the drawing tools as ergonomic as possible.

Canvas[]
Canvas

Applying Normal gives you graphics that you can then use or manipulate:

GraphicsGrid
GraphicsGrid[
 Partition[
  Table[Rasterize[Rotate[Normal[%], \[Theta]], 
    ImageSize -> 50], {\[Theta], 0, 2 Pi, .4}], UpTo[8]], 
 ImageSize -> 500]

When you create a canvas, it can have any graphic as initial content—and it can have any background you want:

Canvas
Canvas[Graphics[
  Style[Disk[], Opacity[.4, Red], EdgeForm[{Thick, Red}]]], 
 Background -> 
  GeoGraphics[
   Entity["MannedSpaceMission", "Apollo16"][
    EntityProperty["MannedSpaceMission", "LandingPosition"]]]]

On the subject of drawing anything, Version 12.2 has another new function: MoleculeDraw, for drawing (or editing) molecules. Start with the symbolic representation of a molecule:

Caffeine molecule
Molecule[Entity["Chemical", "Caffeine"]]

Now use MoleculeDraw to bring up the interactive molecule drawing environment, make an edit, and return the result:

MoleculeDraw

It’s another molecule now:

New molecule

The Never-Ending Math Story

Math has been a core use case for the Wolfram Language (and Mathematica) since the beginning. And it’s been very satisfying over the past third of a century to see how much math we’ve been able to make computational. But the more we do, the more we realize is possible, and the further we can go. It’s become in a sense routine for us. There’ll be some area of math that people have been doing by hand or piecemeal forever. And we’ll figure out: yes, we can make an algorithm for that! We can use the giant tower of capabilities we’ve built over all these years to systematize and automate yet more mathematics; to make yet more math computationally accessible to anyone. And so it has been with Version 12.2. A whole collection of pieces of “math progress”.

Let’s start with something rather cut and dried: special functions. In a sense, every special function is an encapsulation of a certain nugget of mathematics: a way of defining computations and properties for a particular type of mathematical problem or system. Starting from Mathematica 1.0 we’ve achieved excellent coverage of special functions, steadily expanding to more and more complicated functions. And in Version 12.2 we’ve got another class of functions: the Lamé functions.

Lamé functions are part of the complicated world of handling ellipsoidal coordinates; they appear as solutions to the Laplace equation in an ellipsoid. And now we can evaluate them, expand them, transform them, and do all the other kinds of things that are involved in integrating a function into our language:

Plot
Plot[Abs[LameS[3/2 + I, 3, z, 0.1 + 0.1 I]], {z, -8 EllipticK[1/3], 
  8 EllipticK[1/3]}]
Series
Series[LameC[\[Nu], j, z, m], {z, 0, 3}]

Also in Version 12.2 we’ve done a lot on elliptic functions—dramatically speeding up their numerical evaluation and inventing algorithms doing this efficiently at arbitrary precision. We’ve also introduced some new elliptic functions, like JacobiEpsilon—which provides a generalization of EllipticE that avoids branch cuts and maintains the analytic structure of elliptic integrals:

ComplexPlot3D
ComplexPlot3D[JacobiEpsilon[z, 1/2], {z, 6}]

We’ve been able to do many symbolic Laplace and inverse Laplace transforms for a couple of decades. But in Version 12.2 we’ve solved the subtle problem of using contour integration to do inverse Laplace transforms. It’s a story of knowing enough about the structure of functions in the complex plane to avoid branch cuts and other nasty singularities. A typical result effectively sums over an infinite number of poles:

InverseLaplaceTransform
InverseLaplaceTransform[Coth[s \[Pi] /2 ]/(1 + s^2), s, t]

And between contour integration and other methods we’ve also added numerical inverse Laplace transforms. It all looks easy in the end, but there’s a lot of complicated algorithmic work needed to achieve this:

InverseLaplaceTransform
InverseLaplaceTransform[1/(s + Sqrt[s] + 1), s, 1.5]

Another new algorithm made possible by finer “function understanding” has to do with asymptotic expansion of integrals. Here’s a complex function that becomes increasingly wiggly as λ increases:

Table
Table[ReImPlot[(t^10 + 3) Exp[I  \[Lambda] (t^5 + t + 1)], {t, -2, 
   2}], {\[Lambda], 10, 30, 10}]

And here’s the asymptotic expansion for λ:

AsymptoticIntegrate
AsymptoticIntegrate[(t^10 + 3) Exp[
   I  \[Lambda] (t^5 + t + 1)], {t, -2, 2}, {\[Lambda], Infinity, 2}]

 

Tell Me about That Function

It’s a very common calculus exercise to determine, for example, whether a particular function is injective. And it’s pretty straightforward to do this in easy cases. But a big step forward in Version 12.2 is that we can now systematically figure out these kinds of global properties of functions—not just in easy cases, but also in very hard cases. Often there are whole networks of theorems that depend on functions having such-and-such a property. Well, now we can automatically determine whether a particular function has that property, and so whether the theorems hold for it. And that means that we can create systematic algorithms that automatically use the theorems when they apply.

Here’s an example. Is Tan[x] injective? Not globally:

FunctionInjective
FunctionInjective[Tan[x], x]

But over an interval, yes:

FunctionInjective
FunctionInjective[{Tan[x], 0 < x < Pi/2}, x]

What about the singularities of Tan[x]? This gives a description of the set:

FunctionSingularities
FunctionSingularities[Tan[x], x]

You can get explicit values with Reduce:

Reduce
Reduce[%, x]

So far, fairly straightforward. But things quickly get more complicated:

FunctionSingularities
FunctionSingularities[ArcTan[x^y], {x, y}, Complexes]

And there are more sophisticated properties you can ask about as well:

FunctionMeromorphic
FunctionMeromorphic[Log[z], z]
FunctionMeromorphic
FunctionMeromorphic[{Log[z], z > 0}, z]

We’ve internally used various kinds of function-testing properties for a long time. But with Version 12.2 function properties are much more complete and fully exposed for anyone to use. Want to know if you can interchange the order of two limits? Check FunctionSingularities. Want to know if you can do a multivariate change of variables in an integral? Check FunctionInjective.

And, yes, even in Plot3D we’re routinely using FunctionSingularities to figure out what’s going on:

Plot3D
Plot3D[Re[ArcTan[x^y]], {x, -5, 5}, {y, -5, 5}]

 

Mainstreaming Video

In Version 12.1 we began the process of introducing video as a built-in feature of the Wolfram Language. Version 12.2 continues that process. In 12.1 we could only handle video in desktop notebooks; now it’s extended to cloud notebooks—so when you generate a video in Wolfram Language it’s immediately deployable to the cloud.

A major new video feature in 12.2 is VideoGenerator. Provide a function that makes images (and/or audio), and VideoGenerator will generate a video from them (here a 4-second video):

VideoGenerator
VideoGenerator
VideoGenerator[Graphics3D[AugmentedPolyhedron[Icosahedron[], # - 2],
   ImageSize -> {200, 200}] &, 4]

To add a sound track, we can just use VideoCombine:

VideoCombine
VideoCombine[{%, CloudGet["https://wolfr.am/ROWzckqS"]}]

So how would we edit this video? In Version 12.2 we have programmatic versions of standard video editing functions. VideoSplit, for example, splits the video at particular times:

VideoSplit
VideoSplit[%, {.3, .5, 2}]

But the real power of the Wolfram Language comes in systematically applying arbitrary functions to videos. VideoMap lets you apply a function to a video to get another video. For example, we could progressively blur the video we just made:

VideoMap
VideoMap
VideoMap[Blur[#Image, 20 #Time] &, %%]

There are also two new functions for analyzing videos—VideoMapList and VideoMapTimeSeries—which respectively generate a list and a time series by applying a function to the frames in a video, and to its audio track.
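Here’s a minimal sketch of what that might look like (the name video here is just a stand-in for one of the Video objects above, and I’m assuming the same #Image slot that VideoMap uses):

(* build a time series of mean pixel intensity, frame by frame *)
VideoMapTimeSeries[Mean[Flatten[ImageData[#Image]]] &, video]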

Another new function—highly relevant for video processing and video editing—is VideoIntervals, which determines the time intervals over which any given criterion applies in a video:

VideoIntervals
VideoIntervals[%, Length[DominantColors[#Image]] < 3 &]

Now, for example, we can delete those intervals in the video:

VideoDelete
VideoDelete[%, %%]

A common operation in the practical handling of videos is transcoding. And in Version 12.2 the function VideoTranscode lets you convert a video among any of the over 300 containers and codecs that we support. By the way, 12.2 also has new functions ImageWaveformPlot and ImageVectorscopePlot that are commonly used in video color correction:

ImageVectorscopePlot
ImageVectorscopePlot[CloudGet["https://wolfr.am/ROWzsGFw"]]
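Going back to transcoding for a moment, here’s a minimal sketch of VideoTranscode (again with video standing in for one of the videos above, and "MP4" as a purely illustrative choice of target container):

(* convert a video to a different container format *)
VideoTranscode[video, "MP4"]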

One of the main technical issues in handling video is dealing with the large amount of data in a typical video. In Version 12.2 there’s now finer control over where that data is stored. The option GeneratedAssetLocation (with default $GeneratedAssetLocation) lets you pick between different files, directories, local object stores, etc.

But there’s also a new function in Version 12.2 for handling “lightweight video”, in the form of AnimatedImage. AnimatedImage simply takes a list of images and produces an animation that immediately plays in your notebook—and has everything directly stored in your notebook.

AnimatedImage
AnimatedImage[
 Table[Rasterize[Rotate[Style["W", 40], \[Theta]]], {\[Theta], 0, 
   2 Pi, .1}]]

 

Big Computations? Send Them to a Cloud Provider!

It comes up quite frequently for me—especially given our Physics Project. I’ve got a big computation I’d like to do, but I don’t want to (or can’t) do it on my computer. And instead what I’d like to do is run it as a batch job in the cloud.

This has been possible in principle for as long as cloud computation providers have been around. But it’s been very involved and difficult. Well, now, in Version 12.2 it’s finally easy. Given any piece of Wolfram Language code, you can just use RemoteBatchSubmit to send it to be run as a batch job in the cloud.

There’s a little bit of setup required on the batch computation provider side. First, you have to have an account with an appropriate provider—and initially we’re supporting AWS Batch and Charity Engine. Then you have to configure things with that provider (and we’ve got workflows that describe how to do that). But as soon as that’s done, you’ll get a remote batch submission environment that’s basically all you need to start submitting batch jobs:

env = RemoteBatchSubmissionEnvironment
env = RemoteBatchSubmissionEnvironment[
  "AWSBatch", <|"JobQueue" -> 
    "arn:aws:batch:us-east-1:123456789012:job-queue/MyQueue", 
   "JobDefinition" -> 
    "arn:aws:batch:us-east-1:123456789012:job-definition/MyDefinition:\
1", "IOBucket" -> "my-job-bucket"|>]

OK, so what would be involved, say, in submitting a neural net training? Here’s how I would run it locally on my machine (and, yes, this is a very simple example):

NetTrain
NetTrain[NetModel["LeNet"], "MNIST"]

And here’s the minimal way I would send it to run on AWS Batch:

job = RemoteBatchSubmit
job = RemoteBatchSubmit[env, NetTrain[NetModel["LeNet"], "MNIST"]]

I get back an object that represents my remote batch job—that I can query to find out what’s happened with my job. At first it’ll just tell me that my job is “runnable”:

job
job["JobStatus"]

Later on, it’ll say that it’s “starting”, then “running”, then (if all goes well) “succeeded”. And once the job is finished, you can get back the result like this:

job
job["EvaluationResult"]

There’s lots of detail you can retrieve about what actually happened. Like here’s the beginning of the raw job log:

JobLog
job["JobLog"]

But the real point of running your computations remotely in a cloud is that they can potentially be bigger and crunchier than the ones you can run on your own machines. Here’s how we could run the same computation as above, but now requesting the use of a GPU:

RemoteBatchSubmit
RemoteBatchSubmit[env, 
 NetTrain[NetModel["LeNet"], "MNIST", TargetDevice -> "GPU"],
 RemoteProviderSettings -> <|"GPUCount" -> 1|>]

RemoteBatchSubmit can also handle parallel computations. If you request a multicore machine, you can immediately run ParallelMap etc. across its cores. But you can go even further with RemoteBatchMapSubmit—which automatically distributes your computation across a whole collection of separate machines in the cloud.

Here’s an example:

job = RemoteBatchMapSubmit
job = RemoteBatchMapSubmit[env, ImageIdentify, 
  WebImageSearch["happy", 100]]

While it’s running, we can get a dynamic display of the status of each part of the job:

job
job["DynamicStatusVisualization"]

About 5 minutes later, the job is finished:

job
job["JobStatus"]

And here are our results:

ReverseSort
ReverseSort[Counts[job["EvaluationResults"]]]

RemoteBatchSubmit and RemoteBatchMapSubmit give you high-level access to cloud compute services for general batch computation. But in Version 12.2 there is also a direct lower-level interface available, for example for AWS.

Connect to AWS:

aws = ServiceConnect
aws = ServiceConnect["AWS"]

Once you’ve authenticated, you can see all the services that are available:

aws
aws["Services"]

This gives a handle to the Amazon Translate service:

aws
aws["GetService", "Name" -> "Translate"]

Now you can use this to call the service:

%
%["TranslateText",
 "Text" -> "今日は良い一日だった",
 "SourceLanguageCode" -> "auto",
 "TargetLanguageCode" -> "en"
 ]

Of course, you can always do language translation directly through the Wolfram Language too:

TextTranslation
TextTranslation["今日は良い一日だった"]

 

Can You Make a 10-Dimensional Plot?

It’s straightforward to plot data that involves one, two or three dimensions. For a few dimensions above that, you can use colors or other styling. But by the time you’re dealing with ten dimensions, that breaks down. And if you’ve got a lot of data in 10D, for example, then you’re probably going to have to use something like DimensionReduce to try to tease out “interesting features”.

But if you’re just dealing with a few “data points”, there are other ways to visualize things like 10-dimensional data. And in Version 12.2 we’re introducing several functions for doing this.

As a first example, let’s look at ParallelAxisPlot. The idea here is that every “dimension” is plotted on a “separate axis”. For a single point it’s not that exciting:

ParallelAxisPlot
ParallelAxisPlot[{{10, 17, 19, 8, 7, 5, 17, 4, 8, 2}}, 
 PlotRange -> {0, 20}]

Here’s what happens if we plot three random “10D data points”:

ParallelAxisPlot
ParallelAxisPlot[RandomInteger[20, {3, 10}], PlotRange -> {0, 20}]

But one of the important features of ParallelAxisPlot is that by default it automatically determines the scale on each axis, so there’s no need for the axes to represent similar kinds of things. So, for example, here are 7 completely different quantities plotted for all the chemical elements:

ParallelAxisPlot
ParallelAxisPlot[
 EntityValue[
  "Element", {EntityProperty["Element", "AtomicMass"], 
   EntityProperty["Element", "AtomicRadius"], 
   EntityProperty["Element", "BoilingPoint"], 
   EntityProperty["Element", "ElectricalConductivity"], 
   EntityProperty["Element", "MeltingPoint"], 
   EntityProperty["Element", "NeutronCrossSection"], 
   EntityProperty["Element", "ThermalConductivity"]}]]

Different kinds of high-dimensional data do best on different kinds of plots. Another new type of plot in Version 12.2 is RadialAxisPlot. (This type of plot also goes by names like radar plot, spider plot and star plot.)

RadialAxisPlot plots each dimension in a different direction:

RadialAxisPlot
RadialAxisPlot[
 EntityValue[
  "Element", {EntityProperty["Element", "AtomicMass"], 
   EntityProperty["Element", "AtomicRadius"], 
   EntityProperty["Element", "BoilingPoint"], 
   EntityProperty["Element", "ElectricalConductivity"], 
   EntityProperty["Element", "MeltingPoint"], 
   EntityProperty["Element", "NeutronCrossSection"], 
   EntityProperty["Element", "ThermalConductivity"]}]]

It’s typically most informative when there aren’t too many data points:

RadialAxisPlot
RadialAxisPlot[
 EntityValue[{Entity["City", {"Chicago", "Illinois", "UnitedStates"}],
    Entity["City", {"Dallas", "Texas", "UnitedStates"}], 
   Entity["City", {"NewYork", "NewYork", "UnitedStates"}], 
   Entity["City", {"LosAngeles", "California", 
     "UnitedStates"}]}, {EntityProperty["City", 
    "MedianHomeSalePrice"], 
   EntityProperty["City", "TotalSalesTaxRate"], 
   EntityProperty["City", "MedianHouseholdIncome"], 
   EntityProperty["City", "Population"], 
   EntityProperty["City", "Area"]}, "EntityAssociation"], 
 PlotLegends -> Automatic]

 

3D Array Plots

Back in 1984 I used a Cray supercomputer to make 3D pictures of 2D cellular automata evolving in time (yes, captured on 35 mm slides):

Slides of cellular automata

I’ve been waiting for 36 years to have a really streamlined way to reproduce these. And now finally in Version 12.2 we have it: ArrayPlot3D. Already in 2012 we introduced Image3D to represent and display 3D images composed of 3D voxels with specified colors and opacities. But its emphasis is on “radiology-style” work, in which there’s a certain assumption of continuity between voxels. And if you’ve really got a discrete array of discrete data (as in cellular automata) that won’t lead to crisp results.

And here it is, for a slightly more elaborate case of a 3D cellular automaton:

Table
Table[ArrayPlot3D[
  CellularAutomaton[{14, {2, 1}, {1, 1, 1}}, {{{{1}}}, 
    0}, {{{t}}}]], {t, 20, 40, 10}]

Another new ArrayPlot-family function in 12.2 is ComplexArrayPlot, for plotting arrays of complex values.

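Here’s a rough, illustrative sketch, applying it to an array of values from Newton’s method (the grid spacing and the 20-step iteration count are arbitrary choices of mine, not the original example):

(* iterate Newton's method for z^3 == 1 over a grid of complex starting points,
   then plot the resulting array of complex values *)
ComplexArrayPlot[
 Table[Nest[# - (#^3 - 1)/(3 #^2) &, x + I y, 20],
  {y, -2.01, 2, 0.02}, {x, -2.01, 2, 0.02}]]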

 

Advancing the Computational Aesthetics of Visualization

One of our objectives in Wolfram Language is to have visualizations that just “automatically look good”—because they’ve got algorithms and heuristics that effectively implement good computational aesthetics. In Version 12.2 we’ve tuned up the computational aesthetics for a variety of types of visualization. For example, in 12.1 this is what a SliceVectorPlot3D looked like by default:

SliceVectorPlot3D
SliceVectorPlot3D[{y + x, z, -y}, {x, -2, 2}, {y, -2, 2}, {z, -2, 2}]

Now it looks like this:

Vector plot

Since Version 10, we’ve also been making increasing use of our PlotTheme option, to “bank switch” detailed options to make visualizations that are suitable for different purposes, and meet different aesthetic goals. So for example in Version 12.2 we’ve added plot themes to GeoRegionValuePlot. Here’s an example of the default (which has been updated, by the way):

GeoRegionValuePlot
GeoRegionValuePlot[CloudGet["https://wolfr.am/ROWDoxAw"] -> "GDP"]

And here it is with the "Marketing" plot theme:

GeoRegionValuePlot
GeoRegionValuePlot[CloudGet["https://wolfr.am/ROWDoxAw"] -> "GDP", 
 PlotTheme -> "Marketing"]

Another thing in Version 12.2 is the addition of new primitives and new “raw material” for creating aesthetic visual effects. In Version 12.1 we introduced things like HatchFilling for cross-hatching. In Version 12.2 we now also have LinearGradientFilling:

Graphics
Graphics[Style[Disk[], 
  LinearGradientFilling[{RGBColor[1., 0.71, 0.75], RGBColor[0.64, 
Rational[182, 255], 
Rational[244, 255]]}]]]

And we can now add this kind of effect to the filling in a plot:

Plot
Plot[2 Sin[x] + x, {x, 0, 15}, 
 FillingStyle -> LinearGradientFilling[{RGBColor[0.64, 
Rational[182, 255], 
Rational[244, 255]], RGBColor[1., 0.71, 0.75]}, Top], 
 Filling -> Bottom]

To be even more stylish, one can plot random points using the new ConicGradientFilling:

Graphics
Graphics[Table[
  Style[Disk[RandomReal[20, 2]], 
   ConicGradientFilling[RandomColor[3]]], 100]]

 

Making Code Just a Bit More Beautiful

A core goal of the Wolfram Language is to define a coherent computational language that can readily be understood by both computers and humans. We (and I in particular!) put a lot of effort into the design of the language, and into things like picking the right names for functions. But in making the language as easy to read as possible, it’s also important to streamline its “non-verbal” or syntactic aspects. For function names, we’re basically leveraging people’s understanding of words in natural language. For syntactic structure, we want to leverage people’s “ambient understanding”, for example, from areas like math.

More than a decade ago we introduced \[Function] as a way to specify functions, so instead of writing

Function
Function[x, x^2]

(or #^2 &) you could write:

x |-> x^2
x |-> x^2

But to enter it you had to type \[Function] or at least Esc fn Esc, which tended to feel “a bit difficult”.

Well, in Version 12.2, we’re “mainstreaming” by making it possible to type just as |->

x |-> x^2
x |-> x^2

You can also do things like

{x, y} |-> x + y
{x, y} |-> x + y

as well as things like:

SameTest
SameTest -> ({x, y} |-> Mod[x - y, 2] == 0)

In Version 12.2, there’s also another new piece of “short syntax”: //=

Imagine you’ve got a result, say called res. Now you want to apply a function to res, and then “update res”. The new function ApplyTo (written //=) makes it easy to do that:

res = 10
res = 10
res //= f
res //= f
res
res

We’re always on the lookout for repeated “lumps of computation” that we can “package” into functions with “easy-to-understand names”. And in Version 12.2 we have a couple of new such functions: FoldWhile and FoldWhileList. FoldList normally just takes a list and “folds” each successive element into the result it’s building up—until it gets to the end of the list:

FoldList
FoldList[f, {1, 2, 3, 4}]

But what if you want to “stop early”? FoldWhileList lets you do that. So here we’re successively dividing by 1, 2, 3, …, stopping when the result isn’t an integer anymore:

FoldWhileList
FoldWhileList[Divide, 5!, Range[10], IntegerQ]
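And FoldWhile works the same way, but keeps only the final result of the fold rather than the whole list; here’s a minimal sketch, reusing the computation above:

(* same fold as above, but returning just the final result instead of the list *)
FoldWhile[Divide, 5!, Range[10], IntegerQ]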

 

More Array Gymnastics: Column Operations and Their Generalizations

Let’s say you’ve got an array, like:

{{a, b, c, d}, {x, y, z, w}} // MatrixForm
{{a, b, c, d}, {x, y, z, w}} // MatrixForm

Map lets you map a function over the “rows” of this array:

Map
Map[f, {{a, b, c, d}, {x, y, z, w}}]

But what if you want to operate on the “columns” of the array, effectively “reducing out” the first dimension of the array? In Version 12.2 the function ArrayReduce lets you do this:

ArrayReduce
ArrayReduce[f, {{a, b, c, d}, {x, y, z, w}}, 1]

Here’s what happens if instead we tell ArrayReduce to “reduce out” the second dimension of the array:

ArrayReduce
ArrayReduce[f, {{a, b, c, d}, {x, y, z, w}}, 2]

What’s really going on here? The array has dimensions 2×4:

Dimensions
Dimensions[{{a, b, c, d}, {x, y, z, w}}]

ArrayReduce[f, ..., 1] “reduces out” the first dimension, leaving an array with dimensions {4}. ArrayReduce[f, ..., 2] reduces out the second dimension, leaving an array with dimensions {2}.

Let’s look at a slightly bigger case—a 2×3×4 array:

array = ArrayReshape
array = ArrayReshape[Range[24], {2, 3, 4}]

This now eliminates the “first dimension”, leaving a 3×4 array:

ArrayReduce
ArrayReduce[f, array, 1]
Dimensions
Dimensions[%]

This, on the other hand, eliminates the “second dimension”, leaving a 2×4 array:

ArrayReduce
ArrayReduce[f, array, 2]
Dimensions
Dimensions[%]

Why is this useful? One example is when you have arrays of data where different dimensions correspond to different attributes, and then you want to “ignore” a particular attribute, and aggregate the data with respect to it. Let’s say that the attribute you want to ignore is at level n in your array. Then all you do to “ignore” it is to use ArrayReduce[f, ..., n], where f is the function that aggregates values (often something like Total or Mean).
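For example, with the 2×3×4 array from above, this small sketch totals over the second dimension, aggregating that “attribute” away and leaving a 2×4 array:

ArrayReduce[Total, array, 2]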

You can achieve the same results as ArrayReduce by appropriate sequences of Transpose, Apply, etc. But it’s quite messy, and ArrayReduce provides an elegant “packaging” of these kinds of array operations.
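Here’s one such messier equivalent, as a sketch (assuming, as above, that the aggregation function gets the list of values along the reduced dimension): move that dimension innermost with Transpose, then Map the function at level 2:

ArrayReduce[Total, array, 2] === Map[Total, Transpose[array, {1, 3, 2}], {2}]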

ArrayReduce is quite general; it lets you not only “reduce out” single dimensions, but whole collections of dimensions:

ArrayReduce
ArrayReduce[f, array, {2, 3}]
ArrayReduce
ArrayReduce[f, array, {{2}, {3}}]

At the simplest level, ArrayReduce is a convenient way to apply functions “columnwise” on arrays. But in full generality it’s a way to apply functions to subarrays with arbitrary indices. And if you’re thinking in terms of tensors, ArrayReduce is a generalization of contraction, in which more than two indices can be involved, and elements can be “flattened” before the operation (which doesn’t have to be summation) is applied.

Watch Your Code Run: More in the Echo Family

It’s an old adage in debugging code: “put in a print statement”. But it’s more elegant in the Wolfram Language, thanks particularly to Echo. It’s a simple idea: Echo[expr] “echoes” (i.e. prints) the value of expr, but then returns that value. So the result is that you can put Echo anywhere into your code (often as Echo@…) without affecting what your code does.

In Version 12.2 there are some new functions that follow the “Echo” pattern. A first example is EchoLabel, which just adds a label to what’s echoed:

EchoLabel
EchoLabel["a"]@5! + EchoLabel["b"]@10!

Aficionados might wonder why EchoLabel is needed. After all, Echo itself allows a second argument that can specify a label. The answer—and yes, it’s a mildly subtle piece of language design—is that if one’s going to just insert Echo as a function to apply (say with @), then it can only have one argument, so no label. EchoLabel is set up to have the operator form EchoLabel[label] so that EchoLabel[label][expr] is equivalent to Echo[expr,label].

Another new “echo function” in 12.2 is EchoTiming, which displays the timing (in seconds) of whatever it evaluates:

Table
Table[Length[EchoTiming[Permutations[Range[n]]]], {n, 8, 10}]

It’s often helpful to use both Echo and EchoTiming:

Length
Length[EchoTiming[Permutations[Range[Echo@10]]]]

And, by the way, if you always want to print evaluation time (just like Mathematica 1.0 did by default 32 years ago) you can globally set $Pre = EchoTiming.

Another new “echo function” in 12.2 is EchoEvaluation which echoes the “before” and “after” for an evaluation:

EchoEvaluation
EchoEvaluation[2 + 2]

You might wonder what happens with nested EchoEvaluation’s. Here’s an example:

EchoEvaluation
EchoEvaluation[
 Accumulate[EchoEvaluation[Reverse[EchoEvaluation[Range[10]]]]]]

By the way, it’s quite common to want to use both EchoTiming and EchoEvaluation:

Table
Table[EchoTiming@EchoEvaluation@FactorInteger[2^(50 n) - 1], {n, 2}]

Finally, if you want to leave echo functions in your code, but want your code to “run quiet”, you can use the new QuietEcho to “quiet” all the echoes (like Quiet “quiets” messages):

QuietEcho@Table
QuietEcho@
 Table[EchoTiming@EchoEvaluation@FactorInteger[2^(50 n) - 1], {n, 2}]

 

Confirm/Enclose: Symbolic Exception Handling

Did something go wrong inside your program? And if so, what should the program do? It can be possible to write very elegant code if one ignores such things. But as soon as one starts to put in checks, and has logic for unwinding things if something goes wrong, it’s common for the code to get vastly more complicated, and vastly less readable.

What can one do about this? Well, in Version 12.2 we’ve developed a high-level symbolic mechanism for handling things going wrong in code. Basically the idea is that you insert Confirm (or related functions)—a bit like you might insert Echo—to “confirm” that something in your program is doing what it should. If the confirmation works, then your program just keeps going. But if it fails, then the program stops–and exits to the nearest enclosing Enclose. In a sense, Enclose “encloses” regions of your program, not letting anything that goes wrong inside immediately propagate out.

Let’s see how this works in a simple case. Here the Confirm successfully “confirms” y, just returning it, and the Enclose doesn’t really do anything:

Enclose
Enclose[f[x, Confirm[y], z]]

But now let’s put $Failed in place of y. $Failed is something that Confirm by default considers to be a problem. So when it sees $Failed, it stops, exiting to the Enclose—which in turn yields a Failure object:

Enclose
Enclose[f[x, Confirm[$Failed], z]]

If we put in some echoes, we’ll see that x is successfully reached, but z is not; as soon as the Confirm fails, it stops everything:

Enclose
Enclose[f[Echo[x], Confirm[$Failed], Echo[z]]]

A very common thing is to want to use Confirm/Enclose when you define a function:

addtwo
addtwo[x_] := Enclose[Confirm[x] + 2]

Use argument 5 and everything just works:

addtwo
addtwo[5]

But if we instead use Missing[]—which Confirm by default considers to be a problem—we get back a Failure object:

addtwo
addtwo[Missing[]]

We could achieve the same thing with If, Return, etc. But even in this very simple case, it wouldn’t look as nice.

Confirm has a certain default set of things that it considers “wrong” ($Failed, Failure[...], Missing[...] are examples). But there are related functions that allow you to specify particular tests. For example, ConfirmBy applies a function to test if an expression should be confirmed.

Here, ConfirmBy confirms that 2 is a number:

Enclose
Enclose[f[1, ConfirmBy[2, NumberQ], 3]]

But x is not considered so by NumberQ:

Enclose
Enclose[f[1, ConfirmBy[x, NumberQ], 3]]

OK, so let’s put these pieces together. Let’s define a function that’s supposed to operate on strings:

world
world[x_] := Enclose[ConfirmBy[x, StringQ] <> " world!"]

If we give it a string, all is well:

world
world["hello"]

But if we give it a number instead, the ConfirmBy fails:

world
world[4]

But here’s where really nice things start to happen. Let’s say we want to map world over a list, always confirming that it gets a good result. Here everything is OK:

Enclose
Enclose[Confirm[world[#]] & /@ {"a", "b", "c"}]

But now something has gone wrong:

Enclose
Enclose[Confirm[world[#]] & /@ {"a", "b", 3}]

The ConfirmBy inside the definition of world failed, causing its enclosing Enclose to produce a Failure object. Then this Failure object caused the Confirm inside the Map to fail, and the enclosing Enclose gave a Failure object for the whole thing. Once again, we could have achieved the same thing with If, Throw, Catch, etc. But Confirm/Enclose do it more robustly, and more elegantly.

These are all very small examples. But where Confirm/Enclose really show their value is in large programs, and in providing a clear, high-level framework for handling errors and exceptions, and defining their scope.

In addition to Confirm and ConfirmBy, there’s also ConfirmMatch, which confirms that an expression matches a specified pattern. Then there’s ConfirmQuiet, which confirms that the evaluation of an expression doesn’t generate any messages (or, at least, none that you told it to test for). There’s also ConfirmAssert, which simply takes an “assertion” (like p>0) and confirms that it’s true.
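As small sketches of those: ConfirmMatch checks an expression against a pattern, and ConfirmAssert checks a condition:

Enclose[Total[ConfirmMatch[{1, 2, 3}, {__Integer}]]]

Enclose[ConfirmAssert[2 > 3]; "never reached"]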

When a confirmation fails, the program always exits to the nearest enclosing Enclose, delivering to the Enclose a Failure object with information about the failure that occurred. When you set up the Enclose, you can tell it how to handle failure objects it receives—either just returning them (perhaps to enclosing Confirm’s and Enclose’s), or applying functions to their contents.

Confirm and Enclose are an elegant mechanism for handling errors, that are easy and clean to insert into programs. But—needless to say—there are definitely some tricky issues around them. Let me mention just one. The question is: which Confirm’s does a given Enclose really enclose? If you’ve written a piece of code that explicitly contains Enclose and Confirm, it’s pretty obvious. But what if there’s a Confirm that’s somehow generated—perhaps dynamically—deep inside some stack of functions? It’s similar to the situation with named variables. Module just looks for the variables directly (“lexically”) inside its body. Block looks for variables (“dynamically”) wherever they may occur. Well, Enclose by default works like Module, “lexically” looking for Confirm’s to enclose. But if you include tags in Confirm and Enclose, you can set them up to “find each other” even if they’re not explicitly “visible” in the same piece of code.

Function Robustification

Confirm/Enclose provide a good high-level way to handle the “flow” of things going wrong inside a program or a function. But what if there’s something wrong right at the get-go? In our built-in Wolfram Language functions, there’s a standard set of checks we apply. Are there the correct number of arguments? If there are options, are they allowed options, and are they in the correct place? In Version 12.2 we’ve added two functions that can perform these standard checks for functions you write.

This says that f should have two arguments, which here it doesn’t:

CheckArguments
CheckArguments[f[x, y, z], 2]

Here’s a way to make CheckArguments part of the basic definition of a function:

f
f[args___] := Null /; CheckArguments[f[args], 2] 

Give it the wrong number of arguments, and it’ll generate a message, and then return unevaluated, just like lots of built-in Wolfram Language functions do:

f
f[7]

ArgumentsOptions is another new function in Version 12.2—that separates “positional arguments” from options in a function. Set up options for a function:

Options
Options[f] = {opt -> Automatic};

This expects one positional argument, which it finds:

ArgumentsOptions
ArgumentsOptions[f[x, opt -> 7], 1]

If it doesn’t find exactly one positional argument, it generates a message:

ArgumentsOptions
ArgumentsOptions[f[x, y], 1]

 

Cleaning Up After Your Code

You run a piece of code and it does what it does—and typically you don’t want it to leave anything behind. Often you can use scoping constructs like Module, Block, BlockRandom, etc. to achieve this. But sometimes there’ll be something you set up that needs to be explicitly “cleaned up” when your code finishes.

For example, you might create a file in your piece of code, and want the file removed when that particular piece of code finishes. In Version 12.2 there’s a convenient new function for managing things like this: WithCleanup.

WithCleanup[expr, cleanup] evaluates expr, then cleanup—but returns the result from expr. Here’s a trivial example (which could really be achieved better with Block). You’re assigning a value to x, getting its square—then clearing x before returning the square:

WithCleanup
WithCleanup[x = 7; x^2, Clear[x]]

It’s already convenient just to have a construct that does cleanup while still returning the main expression you were evaluating. But an important detail of WithCleanup is that it also handles the situation where you abort the main evaluation you were doing. Normally, issuing an abort would cause everything to stop. But WithCleanup is set up to make sure that the cleanup happens even if there’s an abort. So if the cleanup involves, for example, deleting a file, the file gets deleted, even if the main operation is aborted.

WithCleanup also allows an initialization to be given. So here the initialization is done, as is the cleanup, but the main evaluation is aborted:

WithCleanup
WithCleanup[Echo[1], Abort[]; Echo[2], Echo[3]]

By the way, WithCleanup can also be used with Confirm/Enclose to ensure that even if a confirmation fails, certain cleanup will be done.

Dates—with 37 New Calendars

It’s December 16, 2020, today—at least according to the standard Gregorian calendar that’s usually used in the US. But there are many other calendar systems in use for various purposes around the world, and even more that have been used at one time or another historically.

In earlier versions of Wolfram Language we supported a few common calendar systems. But in Version 12.2 we’ve added very broad support for calendar systems—altogether 41 of them. One can think of calendar systems as being a bit like projections in geodesy or coordinate systems in geometry. You have a certain time: now you have to know how it is represented in whatever system you’re using. And much like GeoProjectionData, there’s now CalendarData which can give you a list of available calendar systems:

CalendarData["DateCalendar"]

So here’s the representation of “now” converted to different calendars:

CalendarConvert[Now, #] & /@ CalendarData["DateCalendar"]

There are many subtleties here. Some calendars are purely “arithmetic”; others rely on astronomical computations. And then there’s the matter of “leap variants”. With the Gregorian calendar, we’re used to just adding a February 29. But the Chinese calendar, for example, can add whole “leap months” within a year (so that, for example, there can be two “fourth months”). In the Wolfram Language, we now have a symbolic representation for such things, using LeapVariant:

DateObject[{72, 25, LeapVariant[4], 20}, CalendarType -> "Chinese"]

One reason to deal with different calendar systems is that they’re used to determine holidays and festivals in different cultures. (Another reason, particularly relevant to someone like me who studies history quite a bit, is the conversion of historical dates: Newton’s birthday was originally recorded as December 25, 1642, but converted to a Gregorian date it’s January 4, 1643.)
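Here’s a sketch of that particular conversion, treating the recorded date as a Julian-calendar date and converting it to the Gregorian calendar:

CalendarConvert[
 DateObject[{1642, 12, 25}, CalendarType -> "Julian"], "Gregorian"]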

Given a calendar, something one often wants to do is to select dates that satisfy a particular criterion. And in Version 12.2 we’ve introduced the function DateSelect to do this. So, for example, we can select dates within a particular interval that satisfy the criterion that they are Wednesdays:

DateSelect[DateInterval[{{{2020, 4, 1}, {2020, 4, 30}}}, "Day", 
  "Gregorian", -5.], #DayName == Wednesday &]

As a more complicated example, we can convert the current algorithm for selecting dates of US presidential elections to computable form, and then use it to determine dates for the next 50 years:

DateSelect[DateInterval[{{2020}, {2070}}, "Day"], 
 Divisible[#Year, 4] && #Month == 11 && #DayName == Tuesday && 
   Or[#DayNameInstanceInMonth == 1 && #Day =!= 
      1, #DayNameInstanceInMonth == 2 && #Day == 8] &]

 

New in Geo

By now, the Wolfram Language has strong capabilities in geo computation and geo visualization. But we’re continuing to expand our geo functionality. In Version 12.2 an important addition is spatial statistics (mentioned above)—which is fully integrated with geo. But there are also a couple of new geo primitives. One is GeoBoundary, which computes boundaries of things:

GeoBoundary[CloudGet["https://wolfr.am/ROWGPJ4I"]]
GeoLength[%]

There’s also GeoPolygon, which is a full geo generalization of ordinary polygons. One of the tricky issues GeoPolygon has to handle is what counts as the “interior” of a polygon on the Earth. Here it’s picking the larger area (i.e. the one that wraps around the globe):

GeoGraphics[
 GeoPolygon[{{-50, 70}, {30, -90}, {70, 50}}, "LargerArea"]]

GeoPolygon can also—like Polygon—handle holes, or in fact arbitrary levels of nesting:

GeoGraphics[
 GeoPolygon[
  Entity["AdministrativeDivision", {"Illinois", "UnitedStates"}] -> 
   Entity["AdministrativeDivision", {"ChampaignCounty", "Illinois", 
     "UnitedStates"}]]]

But the biggest “coming attraction” of geo is completely new rendering of geo graphics and maps. It’s still preliminary (and unfinished) in Version 12.2, but there’s at least experimental support for vector-based map rendering. The most obvious payoff from this is maps that look much crisper and sharper at all scales. But another payoff is our ability to introduce new styling for maps, and in Version 12.2 we’re including eight new map styles.

Here’s our “old-style” map:

GeoGraphics[Entity["Building", "EiffelTower::5h9w8"], 
 GeoRange -> Quantity[400, "Meters"]]

Here’s the new, vector version of this “classic” style:

GeoGraphics[Entity["Building", "EiffelTower::5h9w8"], 
 GeoBackground -> "VectorClassic", 
 GeoRange -> Quantity[400, "Meters"]]

Here’s a new (vector) style, intended for the web:

GeoGraphics[Entity["Building", "EiffelTower::5h9w8"], 
 GeoBackground -> "VectorWeb", GeoRange -> Quantity[400, "Meters"]]

And here’s a “dark” style, suitable for having information overlaid on it:

GeoGraphics[Entity["Building", "EiffelTower::5h9w8"], 
 GeoBackground -> "VectorDark", GeoRange -> Quantity[400, "Meters"]]

 

Importing PDF

Want to analyze a document that’s in PDF? We’ve been able to extract basic content from PDF files for well over a decade. But PDF is a highly complex (and evolving) format, and many documents “in the wild” have complicated structure. In Version 12.2, however, we’ve dramatically expanded our PDF import capabilities, so that it becomes realistic to, for example, take a random paper from arXiv, and import it:

Import["https://arxiv.org/pdf/2011.12174.pdf"]

By default, what you’ll get is a high-resolution image for each page (in this particular case, all 100 pages).

If you want the text, you can import that with "Plaintext":

Import["https://arxiv.org/pdf/2011.12174.pdf", "Plaintext"]

Now you can immediately make a word cloud of the words in the paper:

WordCloud[%]

This picks out all the images from the paper, and makes a collage of them:

ImageCollage[Import["https://arxiv.org/pdf/2011.12174.pdf", "Images"]]

You can get the URLs from each page:

Import["https://arxiv.org/pdf/2011.12174.pdf", "URLs"]

Now pick off the last two, and get images of those webpages:

WebImage /@ Take[Flatten[Values[%]], -2]

Depending on how they’re produced, PDFs can have all sorts of structure. "ContentsGraph" gives a graph representing the overall structure detected for a document:

Import["https://arxiv.org/pdf/2011.12174.pdf", "ContentsGraph"]

And, yes, it really is a graph:

Graph[EdgeList[%]]

For PDFs that are fillable forms, there’s more structure to import. Here I grabbed a random unfilled government form from the web. Import gives an association whose keys are the names of the fields—and if the form had been filled in, it would have given their values too, so you could immediately do analysis on them:

Import["https://www.fws.gov/forms/3-200-41.pdf", "FormFieldRules"]

 

The Latest in Industrial-Strength Convex Optimization

Starting in Version 12.0, we’ve been adding state-of-the-art capabilities for solving large-scale optimization problems. In Version 12.2 we’ve continued to round out these capabilities.

One new thing is the superfunction ConvexOptimization, which automatically handles the full spectrum of linear, linear-fractional, quadratic, semidefinite and conic optimization—giving both optimal solutions and their dual properties. In 12.1 we added support for integer variables (i.e. combinatorial optimization); in 12.2 we’re also adding support for complex variables.
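As a minimal sketch of the superfunction (the particular problem here is chosen just for illustration, not taken from the original post), ConvexOptimization takes an objective, constraints and variables, and returns the optimal variable values:

ConvexOptimization[(x - 1)^2 + (y - 2)^2, {x + 2 y <= 2, x >= 0}, {x, y}]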

But the biggest new things for optimization in 12.2 are the introduction of robust optimization and of parametric optimization. Robust optimization lets you find an optimum that’s valid across a whole range of values of some of the variables. Parametric optimization lets you get a parametric function that gives the optimum for any possible value of particular parameters. So for example this finds the optimum for x, y for any (positive) value of α:

ParametricConvexOptimization[(x - 1)^2 + 
  Abs[y], {(x + \[Alpha])^2 <= 1, x + y >= \[Alpha]}, {x, 
  y}, {\[Alpha]}]

Now evaluate the parametric function for a particular α:

%[.76]

As with everything in the Wolfram Language, we’ve put a lot of effort into making sure that convex optimization integrates seamlessly into the rest of the system—so you can set up models symbolically, and flow their results into other functions. We’ve also included some very powerful convex optimization solvers. But particularly if you’re doing mixed (i.e. real+integer) optimization, or you’re dealing with really huge (e.g. 10 million variables) problems, we’re also giving access to other, external solvers. So, for example, you can set up your problem using Wolfram Language as your “algebraic modeling language”, then (assuming you have the appropriate external licenses) just by setting Method to, say, “Gurobi” or “Mosek” you can immediately run your problem with an external solver. (And, by the way, we now have an open framework for adding more solvers.)
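Schematically (and assuming the relevant external license is installed), switching solvers is just a matter of an option setting; the problem below is purely illustrative:

ConvexOptimization[x + y, {x^2 + y^2 <= 1}, {x, y}, Method -> "Gurobi"]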

Supporting Combinators and Other Formal Building Blocks

One can say that the whole idea of symbolic expressions (and their transformations) on which we rely so much in the Wolfram Language originated with combinators—which just celebrated their centenary on December 7, 2020. The version of symbolic expressions that we have in Wolfram Language is in many ways vastly more advanced and usable than raw combinators. But in Version 12.2—partly by way of celebrating combinators—we wanted to add a framework for raw combinators.

So now for example we have CombinatorS, CombinatorK, etc., rendered appropriately:

CombinatorS[CombinatorK]

But how should we represent the application of one combinator to another? Today we write something like:

f@g@h@x

But in the early days of mathematical logic there was a different convention—left-associative application, in which applying functions to things “combinator style” was expected to generate “functions” rather than “values”. So in Version 12.2 we’re introducing a new “application operator” Application, displayed with its own operator glyph (and entered as \[Application] or with the alias ap):

Application[f, Application[g, Application[h, x]]]
Application[Application[Application[f, g], h], x]

And, by the way, I fully expect Application—as a new, basic “constructor”—to have a variety of uses (not to mention “applications”) in setting up general structures in the Wolfram Language.

The rules for combinators are trivial to specify using pattern transformations in the Wolfram Language:

{CombinatorS\[Application]x_\[Application]y_\[Application]z_ :> 
  x\[Application]z\[Application](y\[Application]z), 
 CombinatorK\[Application]x_\[Application]y_ :> x}
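As a small check (not in the original post), repeatedly applying these rules reduces the expression S K K a, written with the new application operator, down to a—reflecting the fact that S K K acts as the identity combinator:

rules = {CombinatorS\[Application]x_\[Application]y_\[Application]z_ :> 
    x\[Application]z\[Application](y\[Application]z), 
   CombinatorK\[Application]x_\[Application]y_ :> x};
CombinatorS\[Application]CombinatorK\[Application]CombinatorK\[Application]a //. rules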

But one can also think about combinators more “algebraically” as defining relations between expressions—and there’s now a theory in AxiomaticTheory for that.

And in 12.2 a few more theories have been added to AxiomaticTheory, as well as several new properties.

Euclidean Geometry Goes Interactive

One of the major advances in Version 12.0 was the introduction of a symbolic representation for Euclidean geometry: you specify a symbolic GeometricScene, giving a variety of objects and constraints, and the Wolfram Language can “solve” it, and draw a diagram of a random instance that satisfies the constraints. In Version 12.2 we’ve made this interactive, so you can move the points in the diagram around, and everything will (if possible) interactively be rearranged so as to maintain the constraints.

Here’s a random instance of a simple geometric scene:

RandomInstance[
 GeometricScene[{a, b, c, d}, {CircleThrough[{a, b, c}, d], 
   Triangle[{a, b, c}], d == Midpoint[{a, c}]}]]

If you move one of the points, the other points will interactively be rearranged so as to maintain the constraints defined in the symbolic representation of the geometric scene:

RandomInstance[
 GeometricScene[{a, b, c, d}, {CircleThrough[{a, b, c}, d], 
   Triangle[{a, b, c}], d == Midpoint[{a, c}]}]]

What’s really going on inside here? Basically, the geometry is getting converted to algebra. And if you want, you can get the algebraic formulation:

%["AlgebraicFormulation"]

And, needless to say, you can manipulate this using the many powerful algebraic computation capabilities of the Wolfram Language.

In addition to interactivity, another major new feature in 12.2 is the ability to handle not just complete geometric scenes, but also geometric constructions that involve building up a scene in multiple steps. Here’s an example—that happens to be taken directly from Euclid:

RandomInstance[GeometricScene[
  {{\[FormalCapitalA], \[FormalCapitalB], \[FormalCapitalC], \
\[FormalCapitalD], \[FormalCapitalE], \[FormalCapitalF]}, {}},
  {
   GeometricStep[{Line[{\[FormalCapitalA], \[FormalCapitalB]}], 
     Line[{\[FormalCapitalA], \[FormalCapitalC]}]}, 
    "Define an arbitrary angle BAC."],
   GeometricStep[{\[FormalCapitalD] \[Element] 
      Line[{\[FormalCapitalA], \[FormalCapitalB]}], \[FormalCapitalE] \
\[Element] Line[{\[FormalCapitalA], \[FormalCapitalC]}], 
     EuclideanDistance[\[FormalCapitalA], \[FormalCapitalD]] == 
      EuclideanDistance[\[FormalCapitalA], \[FormalCapitalE]]}, 
    "Put D and E on AB and AC equidistant from A."], 
   GeometricStep[{Line[{\[FormalCapitalD], \[FormalCapitalE]}], 
     GeometricAssertion[{\[FormalCapitalA], \[FormalCapitalF]}, \
{"OppositeSides", Line[{\[FormalCapitalD], \[FormalCapitalE]}]}], 
     GeometricAssertion[
      Triangle[{\[FormalCapitalE], \[FormalCapitalF], \
\[FormalCapitalD]}], "Equilateral"], 
     Line[{\[FormalCapitalA], \[FormalCapitalF]}]}, 
    "Construct an equilateral triangle on DE."]
   }
  ]]

The first image you get is basically the result of the construction. And—like all other geometric scenes—it’s now interactive. But if you mouse over it, you’ll get controls that allow you to move to earlier steps:

RandomInstance

Move a point at an earlier step, and you’ll see what consequences that has for later steps in the construction.

Euclid’s geometry is the very first axiomatic system for mathematics that we know about. So—2000+ years later—it’s exciting that we can finally make it computable. (And, yes, it will eventually connect up with AxiomaticTheory, FindEquationalProof, etc.)

But in recognition of the significance of Euclid’s original formulation of geometry, we’ve added computable versions of his propositions (as well as a bunch of other “famous geometric theorems”). The example above turns out to be proposition 9 in Euclid’s book 1. And now, for example, we can get his original statement of it in Greek:

Entity["GeometricScene", "EuclidBook1Proposition9"]["GreekStatement"]

And here it is in modern Wolfram Language—in a form that can be understood by both computers and humans:

Entity["GeometricScene", "EuclidBook1Proposition9"]["Scene"]

 

Yet More Kinds of Knowledge for the Knowledgebase

An important part of the story of Wolfram Language as a full-scale computational language is its access to our vast knowledgebase of data about the world. The knowledgebase is continually being updated and expanded, and indeed in the time since Version 12.1 essentially all domains have had data (and often a substantial amount) updated, or entities added or modified.

But as examples of what’s been done, let me mention a few additions. One area that’s received a lot of attention is food. By now we have data about more than half a million foods (by comparison, a typical large grocery store stocks perhaps 30,000 types of items). Pick a random food:

RandomEntity["Food"]

Now generate a nutrition label:

%["NutritionLabel"]

As another example, a new type of entity that’s been added is physical effects. Here are some random ones:

RandomEntity["PhysicalEffect", 10]

And as an example of something that can be done with all the data in this domain, here’s a histogram of the dates when these effects were discovered:

DateHistogram[EntityValue["PhysicalEffect", "DiscoveryDate"], "Year", 
 PlotRange -> {{DateObject[{1700}, "Year", "Gregorian", -5.`], 
    DateObject[{2000}, "Year", "Gregorian", -5.`]}, Automatic}]

As another sample of what we’ve been up to, there’s also now what one might (tongue-in-cheek) call a “heavy-lifting” domain—weight-training exercises:

Entity["WeightTrainingExercise", "BenchPress"]["Dataset"]

An important feature of the Wolfram Knowledgebase is that it contains symbolic objects, which can represent not only “plain data”—like numbers or strings—but full computational content. And as an example of this, Version 12.2 allows one to access the Wolfram Demonstrations Project—with all its active Wolfram Language code and notebooks—directly in the knowledgebase. Here are some random Demonstrations:

RandomEntity["WolframDemonstration", 5]

The values of properties can be dynamic interactive objects:

Entity["WolframDemonstration", "MooreSpiegelAttractor"]["Manipulate"]

And because everything is computable, one can for example immediately make an image collage of all Demonstrations on a particular topic:

ImageCollage[
 EntityValue[
  EntityClass["WolframDemonstration", "ChemicalEngineering"], 
  "Thumbnail"]]

 

The Continuing Story of Machine Learning

It’s been nearly 7 years since we first introduced Classify and Predict, and began the process of fully integrating neural networks into the Wolfram Language. There’ve been two major directions: the first is to develop “superfunctions”, like Classify and Predict, that—as automatically as possible—perform machine-learning-based operations. The second direction is to provide a powerful symbolic framework to take advantage of the latest advances with neural nets (notably through the Wolfram Neural Net Repository) and to allow flexible continued development and experimentation.

Version 12.2 has progress in both these areas. An example of a new superfunction is FaceRecognize. Give it a small number of tagged examples of faces, and it will try to identify them in images, videos, etc. Let’s get some training data from web searches (and, yes, it’s somewhat noisy):

faces = Image[#, ImageSize -> 30] & /@ AssociationMap[Flatten[
     FindFaces[#, "Image"] & /@ 
      WebImageSearch["star trek " <> #]] &, {"Jean-Luc Picard", 
    "William Riker", "Phillipa Louvois", "Data"}]

Now create a face recognizer with this training data:

recognizer = FaceRecognize[faces]

Now we can use this to find out who’s on screen in each frame of a video:

VideoMapList[recognizer[FindFaces[#Image, "Image"]] &, Video[URLDownload["https://ia802900.us.archive.org/7/items/2000-promo-for-star-trek-the-next-generation/2000%20promo%20for%20Star%20Trek%20-%20The%20Next%20Generation.ia.mp4"]]] /. 
 m_Missing \[RuleDelayed] "Other"

Now plot the results:

names = %;
ListPlot[
 Catenate[MapIndexed[{First[#2], #1} &, ArrayComponents[names], {2}]],
 ColorFunction -> ColorData["Rainbow"],
 Ticks -> {None, 
   Thread[{Range[Max[ArrayComponents[names]]], 
     DeleteDuplicates[Flatten[names]]}]}]

In the Wolfram Neural Net Repository there’s a regular stream of new networks being added. Since Version 12.1 about 20 new kinds of networks have been added—including many new transformer nets, as well as EfficientNet, and feature extractors like BioBERT and SciBERT that are specifically trained on text from scientific papers.

In each case, the networks are immediately accessible—and usable—through NetModel. Something that’s updated in Version 12.2 is the visual display of networks:

NetModel["ELMo Contextual Word Representations Trained on 1B Word \
Benchmark"]

There are lots of new icons, but there’s also now a clear convention that circles represent fixed elements of a net, while squares represent trainable ones. In addition, when there’s a thick border in an icon, it means there’s an additional network inside, that you can see by clicking.

Whether it’s a network that comes from NetModel or one you construct yourself (or a combination of those two), it’s often convenient to extract the “summary graphic” for the network, for example so you can put it in documentation or a publication. Information provides several levels of summary graphics:

Information[
 NetModel["CapsNet Trained on MNIST Data"], "SummaryGraphic"]

There are several important additions to our core neural net framework that broaden the range of neural net functionality we can access. The first is that in Version 12.2 we have native encoders for graphs and for time series. So, here, for example, we’re making a feature space plot of 20 random named graphs:

FeatureSpacePlot[GraphData /@ RandomSample[GraphData[], 20]]

Another enhancement to the framework has to do with diagnostics for models. We introduced PredictorMeasurements and ClassifierMeasurements many years ago to provide a symbolic representation for the performance of models. In Version 12.2—in response to many requests—we’ve made it possible to feed final predictions, rather than a model, to create a PredictorMeasurements object, and we’ve streamlined the appearance and operation of PredictorMeasurements objects:

PredictorMeasurements[{3.2, 3.5, 4.6, 5}, {3, 4, 5, 6}]
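The training and test data used in the next example aren’t defined in what’s shown here; a plausible stand-in, using one of the built-in example datasets, would be:

training = ExampleData[{"MachineLearning", "Titanic"}, "TrainingData"];
test = ExampleData[{"MachineLearning", "Titanic"}, "TestData"];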

An important new feature of ClassifierMeasurements is the ability to compute a calibration curve that compares the actual probabilities observed from sampling a test set with the predictions from the classifier. But what’s even more important is that Classify automatically calibrates its probabilities, in effect trying to “sculpt” the calibration curve:

Row[{
  First@ClassifierMeasurements[
    Classify[training, Method -> "RandomForest", 
     "Calibration" -> False], test, "CalibrationCurve"],
  "  \[LongRightArrow]  ",
  First@ClassifierMeasurements[
    Classify[training, Method -> "RandomForest", 
     "Calibration" -> True], test, "CalibrationCurve"]
  }]

Version 12.2 also has the beginning of a major update to the way neural networks can be constructed. The fundamental setup has always been to put together a certain collection of layers that expose what amount to array indices that are connected by explicit edges in a graph. Version 12.2 now introduces FunctionLayer, which allows you to give something much closer to ordinary Wolfram Language code. As an example, here’s a particular function layer:

FunctionLayer[
 2*(#v . #m . {0.25, 0.75}) . NetArray[<|"Array" -> {0.1, 0.9}|>] & ]

And here’s the representation of this function layer as an explicit NetGraph:

NetGraph[%]

v and m are named “input ports”. The NetArray—indicated by the square icons in the net graph—is a learnable array, here containing just two elements.

There are cases where it’s easier to use the “block-based” (or “graphical”) programming approach of just connecting together layers (and we’ve worked hard to ensure that the connections can be made as automatically as possible). But there are also cases where it’s easier to use the “functional” programming approach of FunctionLayer. For now, FunctionLayer supports only a subset of the constructs available in the Wolfram Language—though this already includes many standard array and functional programming operations, and more will be added in the future.

An important feature of FunctionLayer is that the neural net it produces will be as efficient as any other neural net, and can run on GPUs etc. But what can you do about Wolfram Language constructs that are not yet natively supported by FunctionLayer? In Version 12.2 we’re adding another new experimental function—CompiledLayer—that extends the range of Wolfram Language code that can be handled efficiently.

It’s perhaps worth explaining a bit about what’s happening inside. Our main neural net framework is essentially a symbolic layer that organizes things for optimized low-level implementation, currently using MXNet. FunctionLayer is effectively translating certain Wolfram Language constructs directly to MXNet. CompiledLayer is translating Wolfram Language to LLVM and then to machine code, and inserting this into the execution process within MXNet. CompiledLayer makes use of the new Wolfram Language compiler, and its extensive type inference and type declaration mechanisms.

OK, so let’s say one’s built a magnificent neural net in our Wolfram Language framework. Everything is set up so that the network can immediately be used in a whole range of Wolfram Language superfunctions (Classify, FeatureSpacePlot, AnomalyDetection, FindClusters, …). But what if one wants to use the network “standalone” in an external environment? In Version 12.2 we’re introducing the capability to export essentially any network in the recently developed ONNX standard representation.
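Schematically, the export is just an ordinary Export (the model name here is one from the Wolfram Neural Net Repository; the file name is arbitrary):

Export["lenet.onnx", NetModel["LeNet Trained on MNIST Data"], "ONNX"]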

And once one has a network in ONNX form, one can use the whole ecosystem of external tools to deploy it in a wide variety of environments. A notable example—that’s now a fairly streamlined process—is to take a full Wolfram Language–created neural net and run it in CoreML on an iPhone, so that it can for example directly be included in a mobile app.

Form Notebooks

What’s the best way to collect structured material? If you just want to get a few items, an ordinary form created with FormFunction (and for example deployed in the cloud) can work well. But what if you’re trying to collect longer, richer material?

For example, let’s say you’re creating a quiz where you want students to enter a whole sequence of complex responses. Or let’s say you’re creating a template for people to fill in documentation for something. What you need in these cases is a new concept that we’re introducing in Version 12.2: form notebooks.

A form notebook is basically a notebook that is set up to be used as a complex “form”, where the inputs in the form can be all the kinds of things that you’re used to having in a notebook.

The basic workflow for form notebooks is the following. First you author a form notebook, defining the various “form elements” (or areas) that you want the user of the form notebook to fill in. As part of the authoring process, you define what you want to have happen to the material the user of the form notebook enters when they use the form notebook (e.g. put the material in a Wolfram Data Drop databin, send the material to a cloud API, send the material as a symbolic expression by email, etc.).

After you’ve authored the form notebook, you then generate an active version that can be sent to whoever will be using the form notebook. Once someone has filled in their material in their copy of the deployed form notebook, they press a button, typically “Submit”, and their material is then sent as a structured symbolic expression to whatever destination the author of the form notebook specified.

It’s perhaps worth mentioning how form notebooks relate to something that sounds similar: template notebooks. In a sense, a template notebook is doing the reverse of a form notebook. A form notebook is about having a user enter material that will then be processed. A template notebook, on the other hand, is about having the computer generate material which will then be used to populate a notebook whose structure is defined by the template notebook.

OK, so how do you get started with form notebooks? Just go to File > New > Programmatic Notebook > Form Notebook Authoring:

Form notebooks

This is just a notebook, where you can enter whatever content you want—say an explanation of what you want people to do when they “fill out” the form notebook. But then there are special cells or sequences of cells in the form notebook that we call “form elements” and “editable notebook areas”. These are what the user of the form notebook “fills out” to enter their “responses”, and the material they provide is what gets sent when they press the “Submit” button (or whatever final action has been defined).

In the authoring notebook, the toolbar gives you a menu of possible form elements that you can insert:

Form notebooks

Let’s pick “Input Field” as an example:

Form notebooks

What does all this mean? Basically a form element is represented by a very flexible symbolic Wolfram Language expression, and this is giving you a way to specify the expression you want. You can give a label and a hint to put in the input field. But it’s with the Interpreter that you start to see the power of Wolfram Language. Because the Interpreter is what takes whatever the user of the form notebook enters in this input field, and interprets it as a computable object. The default is just to treat it as a string. But it could for example be a “Country” or a “MathExpression”. And with these choices, the material will automatically be interpreted as a country, math expression, etc., with the user typically being prompted if their input can’t be interpreted as specified.

There are lots of options about the details of how even an input field can work. Some of them are provided in the Add Action menu:

Form notebooks

But so what actually “is” this form element? Press the CODE tab on the left to see:

Form notebooks

What would a user of the form notebook see here? Press the PREVIEW tab to find out:

Form notebooks

Beyond input fields, there are lots of other possible form elements. There are things like checkboxes, radio buttons and sliders. And in general it’s possible to use any of the rich symbolic user interface constructs that exist in the Wolfram Language.

Once you’ve finished authoring, you press Generate to generate a form notebook that’s ready to be provided to users to fill in. The Settings define things like how the “submit” action should be specified, and what should be done when the form notebook is submitted:

Form notebooks

So what is the “result” of a submitted form notebook? Basically it’s an association that says what was filled into each area of the form notebook. (The areas are identified by keys in the association that were specified when the areas were first defined in the authoring notebook.)

Let’s see how this works in a simple case. Here’s the authoring notebook for a form notebook:

Form notebooks

Here’s the generated form notebook, ready to be filled in:
Form notebooks

Here’s a sample of how the form notebook might be filled in:
Form notebooks

And this is what “comes back” when Submit is pressed:

Form notebooks

For testing, you can just have this association placed interactively in a notebook. But in practice it’s more common to send the association to a databin, store it in a cloud object, or generally put it in a more “centralized” location.

Notice that at the end of this example we have an editable notebook area—where you can enter free-form notebook content (with cells, headings, code, output, etc.) that will all be captured when the form notebook is submitted.

Form notebooks are a very powerful idea, and you’ll see them used all over the place. As a first example, the various submission notebooks for the Wolfram Function Repository, Wolfram Demonstrations Project, etc. are becoming form notebooks. We’re also expecting a lot of use of form notebooks in educational settings. And as part of that, we’re building a system that leverages Wolfram Language for assessing responses in form notebooks (and elsewhere).

You can see the beginnings of this in Version 12.2 with the experimental function AssessmentFunction—which can be hooked into form notebooks somewhat like Interpreter. But even without the full capabilities planned for AssessmentFunction there’s still an incredible amount that can be done—in educational settings and otherwise—using form notebooks.

It’s worth understanding, by the way, that form notebooks are ultimately very simple to use in any particular case. Yes, they have a lot of depth that allows them to do a very wide range of things. And they’re basically only possible because of the whole symbolic structure of the Wolfram Language, and the fact that Wolfram Notebooks are ultimately represented as symbolic expressions. But when it comes to using them for a particular purpose they’re very streamlined and straightforward, and it’s completely realistic to create a useful form notebook in just a few minutes.

Yet More Notebookery

We invented notebooks—with all their basic features of hierarchical cells, etc.—back in 1987. But for a third of a century, we’ve been progressively polishing and streamlining how they work. And in Version 12.2 there are all sorts of useful and convenient new notebook features.

Click to Copy

It’s a very simple feature, but it’s very useful. You see something in a notebook, and all you really want to be able to do with it is copy it (or perhaps copy something related to it). Well, then just use ClickToCopy:

ClickToCopy[10!]

If you want to click-to-copy something unevaluated, use Defer:

ClickToCopy[Plot[Sin[x], {x, 0, 10}], Defer[Plot[Sin[x], {x, 0, 10}]]]

Streamlined Hyperlinking (and Hyperlink Editing)

Ctrl+Shift+H has inserted a hyperlink in a Wolfram Notebook since 1996. But in Version 12.2 there are two important new things with hyperlinks. First, automatic hyperlinking that handles a wide range of different situations. And second, a modernized and streamlined mechanism for hyperlink creation and editing.

Hyperlink creation and editing

Attached Cells

In Version 12.2 we’re exposing something that we’ve had internally for a while: the ability to attach a floating fully functional cell to any given cell (or box, or whole notebook). Accessing this feature needs symbolic notebook programming, but it lets you do very powerful things—particularly in introducing contextual and “just-in-time” interfaces. Here’s an example that attaches a dynamic counter to the bottom-right part of the cell bracket, tracking progress through a prime-testing loop:

obj = AttachCell[EvaluationCell[], Panel[Dynamic[i]], {"CellBracket", Bottom}, 0, {Right, Bottom}];
Do[PrimeQ[i], {i, 10^7}];
NotebookDelete[obj]

Template Box Infrastructure

Sometimes it’s useful for what you see not to be what you have. For example, you might want to display something in a notebook as J0(x) but have it really be BesselJ[0, x]. For many years, we’ve had Interpretation as a way to set this up for specific expressions. But we’ve also had a more general mechanism—TemplateBox—that lets you take expressions, and separately specify how they should be displayed, and interpreted.

In Version 12.2 we’ve further generalized—and streamlined—TemplateBox, allowing it to incorporate arbitrary user interface elements, as well as allowing it to specify things like copy behavior. Our new TeX input mechanism, for example, is basically just an application of the new TemplateBox.

In this case, "TeXAssistantTemplate" refers to a piece of functionality defined in the notebook stylesheet—whose parameters are specified by the association given in the TemplateBox:

RawBoxes[TemplateBox[<|
   "boxes" -> FormBox[FractionBox["1", "2"], TraditionalForm], 
   "errors" -> {}, "input" -> "\\frac{1}{2}", "state" -> "Boxes"|>, 
  "TeXAssistantTemplate"]]

 

The Desktop Interface to the Cloud

An important feature of Wolfram Notebooks is that they’re set up to operate both on the desktop and in the cloud. And even between versions of Wolfram Language there’s lots of continued enhancement in the way notebooks work in the cloud. But in Version 12.2 there’s been some particular streamlining of the interface for notebooks between desktop and cloud.

A particularly nice mechanism already available for a couple of years in any desktop notebook is the File > Publish to Cloud… menu item, which allows you to take the notebook and immediately make it available as a published cloud notebook that can be accessed by anyone with a web browser. In Version 12.2 we’ve streamlined the process of notebook publishing.

When I’m giving a presentation I’ll usually be creating a desktop notebook as I go (or perhaps using one that already exists). And at the end of the presentation, it’s become my practice to publish it to the cloud, so anyone in the audience can interact with it. But how can I give everyone the URL for the notebook? In a virtual setting, you can just use chat. But in an actual physical presentation, that’s not an option. And in Version 12.2 we’ve provided a convenient alternative: the result of Publish to Cloud includes a QR code that people can capture with their phones, then immediately go to the URL and interact with the notebook on their phones.

Publish to cloud

There’s one other notable new item visible in the result of Publish to Cloud: “Direct JavaScript Embedding”. This is a link to the Wolfram Notebook Embedder which allows cloud notebooks to be directly embedded through JavaScript onto webpages.

It’s always easy to use an iframe to embed one webpage on another. But iframes have many limitations, such as requiring their size to be defined in advance. The Wolfram Notebook Embedder allows full-function fluid embedding of cloud notebooks—as well as scriptable control of the notebooks from other elements of a webpage. And since the Wolfram Notebook Embedder is set up to use the oEmbed embedding standard, it can immediately be used in basically all standard web content management systems.

We’ve talked about sending notebooks from the desktop to the cloud. But another thing that’s new in Version 12.2 is faster and easier browsing of your cloud file system from the desktop—as accessed from File > Open from Cloud… and File > Save to Cloud…

Save to cloud

Cryptography & Security

One of the things we want to do with Wolfram Language is to make it as easy as possible to connect with pretty much any external system. And in modern times an important part of that is being able to conveniently handle cryptographic protocols. And ever since we started introducing cryptography directly into the Wolfram Language five years ago, I’ve been surprised at just how much the symbolic character of the Wolfram Language has allowed us to clarify and streamline things to do with cryptography.

A particularly dramatic example of this has been how we’ve been able to integrate blockchains into Wolfram Language (and Version 12.2 adds bloxberg with several more on the way). And in successive versions we’re handling different applications of cryptography. In Version 12.2 a major emphasis is symbolic capabilities for key management. Version 12.1 already introduced SystemCredential for dealing with local “keychain” key management (supporting, for example, “remember me” in authentication dialogs). In 12.2 we’re also dealing with PEM files.

If we import a PEM file containing a private key we get a nice, symbolic representation of the private key:

private = First[Import["ExampleData/privatesecp256k1.pem"]]

Now we can derive a public key:

public = PublicKey[%]

If we generate a digital signature for a message using the private key

GenerateDigitalSignature["Hello there", private]

then this verifies the signature using the public key we’ve derived:

VerifyDigitalSignature[{"Hello there", %}, public]

An important part of modern security infrastructure is the concept of a security certificate—a digital construct that allows a third party to attest to the authenticity of a particular public key. In Version 12.2 we now have a symbolic representation for security certificates—providing what’s needed for programs to establish secure communication channels with outside entities in the same kind of way that https does:

Import["ExampleData/client.pem"]

 

Just Type SQL

In Version 12.0 we introduced powerful functionality for querying relational databases symbolically within the Wolfram Language. Here’s how we connect to a database:

db = DatabaseReference[
  FindFile["ExampleData/ecommerce-database.sqlite"]]

Here’s how we connect the database so that its tables can be treated just like entity types from the built-in Wolfram Knowledgebase:

EntityRegister[EntityStore[RelationalDatabase[db]]]

Now we can for example ask for a list of entities of a given type:

EntityList["offices"]

What’s new in 12.2 is that we can conveniently go “under” this layer, to directly execute SQL queries against the underlying database, getting the complete database table as a Dataset expression:

ExternalEvaluate[db, "SELECT * FROM offices"]

These queries can not only read from the database, but also write to it. And to make things even more convenient, we can effectively treat SQL just like any other “external language” in a notebook.
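A write, for example, looks just like a read; the table here is the one from the example database, but the column names and values are purely illustrative:

ExternalEvaluate[db, "UPDATE offices SET city = 'Champaign' WHERE officeCode = '1'"]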

First we have to register our database, to say what we want our SQL to be run against:

RegisterExternalEvaluator["SQL", db]

And now we can just type SQL as input—and get back Wolfram Language output, directly in the notebook:

Type SQL as input

Microcontroller Support Goes 32 Bit

You’ve developed a control system or a signal processing algorithm in the Wolfram Language. Now how do you deploy it to a piece of standalone electronics? In Version 12.0 we introduced the Microcontroller Kit for compiling from symbolic Wolfram Language structures directly to microcontroller code.

We’ve had lots of feedback on this, asking us to expand the range of microcontrollers that we support. So in Version 12.2 I’m happy to say that we’re adding support for 36 new microcontrollers, particularly 32-bit ones:

Supported microcontrollers

Here’s an example in which we deploy a symbolically defined digital filter to a particular kind of microcontroller, showing the simplified C source code generated for that particular microcontroller:

Needs["MicrocontrollerKit`"]
ToDiscreteTimeModel[ButterworthFilterModel[{3, 2}], 0.6] // Chop
MicrocontrollerEmbedCode[%, <|"Target" -> "AdafruitGrandCentralM4", 
   "Inputs" -> 0 -> "Serial", "Outputs" -> 1 -> "Serial"|>, 
  "/dev/cu.usbmodem14101"]["SourceCode"]

 

WSTPServer: A New Deployment of Wolfram Engine

Our long-term goal is to make the Wolfram Language and the computational intelligence it provides as ubiquitous as possible. And part of doing this is to set up the Wolfram Engine which implements the language so that it can be deployed in as broad a range of computational infrastructure settings as possible.

Wolfram Desktop—as well as classic Mathematica—primarily provide a notebook interface to the Wolfram Engine, running on a local desktop system. It’s also possible to run Wolfram Engine directly—as a command-line program (e.g. through WolframScript)—on a local computer system. And, of course, one can run the Wolfram Engine in the cloud, either through the full Wolfram Cloud (public or private), or through more lightweight cloud and server offerings (both existing and forthcoming).

But with Version 12.2 there’s a new deployment of the Wolfram Engine: WSTPServer. If you use Wolfram Engine in the cloud, you’re typically communicating with it through http or related protocols. But for more than thirty years, the Wolfram Language has had its own dedicated protocol for transferring symbolic expressions and everything around them. Originally we called it MathLink, but in more recent years, as it’s progressively been extended, we’ve called it WSTP: the Wolfram Symbolic Transfer Protocol. What WSTPServer does, as its name suggests, is to give you a lightweight server that delivers Wolfram Engines and lets you communicate with them directly in native WSTP.

Why is this important? Basically because it gives you a way to manage pools of persistent Wolfram Language sessions that can operate as services for other applications. For example, normally each time you call WolframScript you get a new, fresh Wolfram Engine. But by using wolframscript -wstpserver with a particular “WSTP profile name” you can keep getting the same Wolfram Engine every time you call WolframScript. You can do this directly on your local machine—or on remote machines.

And an important use of WSTPServer is to expose pools of Wolfram Engines that can be accessed through the new RemoteEvaluate function in Version 12.2. It’s also possible to use WSTPServer to expose Wolfram Engines for use by ParallelMap, etc. And finally, since WSTP has (for nearly 30 years!) been the way the notebook front end communicates with the Wolfram Engine kernel, it’s now possible to use WSTPServer to set up a centralized kernel pool to which you can connect the notebook front end, allowing you, for example, to keep running a particular session (or even a particular computation) in the kernel even as you switch to a different notebook front end, on a different computer.

RemoteEvaluate: Compute Someplace Else…

Along the lines of “use Wolfram Language everywhere” another new function in Version 12.2 is RemoteEvaluate. We’ve got CloudEvaluate which does a computation in the Wolfram Cloud, or an Enterprise Private Cloud. We’ve got ParallelEvaluate which does computations on a predefined collection of parallel subkernels. And in Version 12.2 we’ve got RemoteBatchSubmit which submits batch computations to cloud computation providers.

RemoteEvaluate is a general, lightweight “evaluate now” function that lets you do a computation on any specified remote machine that has an accessible Wolfram Engine. You can connect to the remote machine using ssh or wstp (or http with a Wolfram Cloud endpoint).

RemoteEvaluate["ssh://byblis67.wolfram.com", 
 Labeled[Framed[$MachineName], Now]]

Sometimes you’ll want to use RemoteEvaluate to do things like system administration across a range of machines. Sometimes you might want to collect or send data to remote devices. For example, you might have a network of Raspberry Pi computers which all have Wolfram Engine—and then you can use RemoteEvaluate to do something like retrieve data from these machines. By the way, you can also use ParallelEvaluate from within RemoteEvaluate, so you’re having a remote machine be the master for a collection of parallel subkernels.

Sometimes you’ll want RemoteEvaluate to start a fresh instance of Wolfram Engine whenever you do an evaluation. But with WSTPServer you can also have it use a persistent Wolfram Language session. RemoteEvaluate and WSTPServer are the beginning of a general symbolic framework for representing running Wolfram Engine processes. Version 12.2 already has RemoteKernelObject and $DefaultRemoteKernel which provide symbolic ways to represent remote Wolfram Language instances.
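As a sketch (the host name here is a placeholder for a machine running WSTPServer), evaluating against a persistent WSTP-served kernel looks just like the ssh case:

RemoteEvaluate["wstp://compute.example.com", $MachineName]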

And Yet More (AKA “None of the Above”)

I’ve at least touched on many of the bigger new features of Version 12.2. But there’s a lot more. Additional functions, enhancements, fixes and general rounding out and polishing.

Like in computational geometry, ConvexHullRegion now deals with regions, not just points. And there are functions like CollinearPoints and CoplanarPoints that test for collinearity and coplanarity, or give conditions for achieving them.
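For example (a quick sketch), CollinearPoints tests explicit points, while CoplanarPoints given a symbolic point returns the condition for coplanarity:

CollinearPoints[{{1, 1}, {2, 2}, {3, 3}}]
CoplanarPoints[{{0, 0, 0}, {1, 0, 0}, {0, 1, 0}, {x, y, z}}]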

There are more import and export formats. Like there’s now support for the archive formats: “7z”, “ISO”, “RAR”, “ZSTD”. There’s also FileFormatQ and ByteArrayFormatQ for testing whether things correspond to particular formats.

In terms of core language, there are things like updates to the complicated-to-define ValueQ. There’s also RandomGeneratorState that gives a symbolic representation of random generator states.

In the desktop package (i.e. .wl file) editor, there’s a new (somewhat experimental) Format Cell button that reformats code—with a control for how “airy” it should be (i.e. how dense it should be in newlines).

In Wolfram|Alpha-mode notebooks (as used by default in Wolfram|Alpha Notebook Edition) there are other new features, like function documentation targeted for particular function usage.

There’s also more in TableView, as well as a large suite of new paclet authoring tools that are included on an experimental basis.

To me it’s rather amazing how much we’ve been able to bring together in Version 12.2, and, as always, I’m excited that it’s now out and available to everyone to use….

Tini Veltman (1931–2021): From Assembly Language to a Nobel Prize


It All Started with Feynman Diagrams

Any serious calculation in particle physics takes a lot of algebra. Maybe it doesn’t need to. But with the methods based on Feynman diagrams that we know so far, it does. And in fact it was these kinds of calculations that first led me to use computers for symbolic computation. That was in 1976, which by now is a long time ago. But actually the idea of doing Feynman diagram calculations by computer is even older.

So far as I know it all started from a single conversation on the terrace outside the cafeteria of the CERN particle physics lab near Geneva in 1962. Three physicists were involved. And out of that conversation there emerged three early systems for doing algebraic computation. One was written in Fortran. One was written in LISP. And one was written in assembly language.

I’ve told this story quite a few times, often adding “And which of those physicists do you suppose later won a Nobel Prize?” “Of course,” I explain, “it was the one who wrote their system in assembly language!”

That physicist was Martinus (Tini) Veltman, who died a couple of weeks ago, and who I knew for several decades. His system was called SCHOONSCHIP, and he wrote the first version of it in IBM 7000 series assembly language. A few years later he rewrote it in CDC 6000 series assembly language.

The emphasis was always on speed. And for many years SCHOONSCHIP was the main workhorse system for doing very large-scale Feynman diagram calculations—which could take months of computer time.

Back in the early 1960s when SCHOONSCHIP was first written, Feynman diagrams—and the quantum field theory from which they came—were out of fashion. Feynman diagrams had been invented in the 1940s for doing calculations in quantum electrodynamics (the quantum theory of electrons and photons)—and that had gone well. But attention had turned to the strong interactions which hold nuclei together, and the weak interactions responsible for nuclear beta decay, and in neither case did Feynman diagrams seem terribly useful.

There was, however, a theory of the weak interactions that involved as-yet-unobserved “intermediate vector bosons” (that were precursors of what we now call W particles). And in 1961—as part of his PhD thesis—Tini Veltman took on the problem of computing how photons would interact with intermediate vector bosons. And for this he needed elaborate Feynman diagram calculations.

I’m not sure if Tini already knew how to program, or whether he learned it for the purpose of creating SCHOONSCHIP—though I do know that he’d been an electronics buff since childhood.

I think I was first exposed to SCHOONSCHIP in 1976, and I used it for a few calculations. In my archives now, I can find only a single example of running it: a sample calculation someone did for me, probably in 1978, in connection with something I was writing (though never published):

Click to enlarge

By modern standards it looks a bit obscure. But it’s a fairly typical “old-style line printer output”. There’s a version of the input at the top. Then some diagnostics in the middle. And then the result appears at the bottom. And the system reports that this took .12 seconds to generate.

This particular result is for a very simple Feynman diagram involving the interaction of a photon and an electron—and involves just 9 terms. But SCHOONSCHIP could handle results involving millions of terms too (which allowed computations in QED to be done to 8-digit precision).

1979

Within days after finishing my PhD in theoretical physics at Caltech in November 1979, I flew to Geneva, Switzerland, to visit CERN for a couple of weeks. And it was during that visit that I started designing SMP (“Symbolic Manipulation Program”)—the system that would be the forerunner of Mathematica and the Wolfram Language.

And when I mentioned what I was doing to people at CERN they said “You should talk to Tini Veltman”.

And so it was that in December 1979 I flew to Amsterdam, and went to see Tini Veltman. The first thing that struck me was how incongruous the name “Tini” (pronounced “teeny”) seemed. (At the time, I didn’t even know why he was called Tini; I’d only seen his name as “M. Veltman”, and didn’t know “Tini” was short for “Martinus”.) But Tini was a large man, with a large beard—not “teeny” at all. He reminded me of pictures of European academics of old—and, for some reason, particularly of Ludwig Boltzmann.

He was 48 years old; I was 20. He was definitely a bit curious about the “newfangled computer ideas” I was espousing. But generally he took on the mantle of an elder statesman who was letting me in on the secrets of how to build a computer system like SCHOONSCHIP.

I pronounced SCHOONSCHIP “scoon-ship”. He got a twinkle in his eye, and corrected it to a very guttural “scohwn-scchhip” (IPA: [sxon][sxɪp]), explaining that, yes, he’d given it a Dutch name that was hard for non-Dutch people to say. (The Dutch word “schoonschip” means, roughly, “shipshape”—like SCHOONSCHIP was supposed to make one’s algebraic expressions.) Everything in SCHOONSCHIP was built for efficiency. The commands were short. Tini was particularly pleased with YEP for generating intermediate results, which, he said, was mnemonic in Dutch (yes, SCHOONSCHIP may be the only computer system with keywords derived from Dutch).

If you look at the sample SCHOONSCHIP output above, you might notice something a little strange in it. Every number in the (exact) algebraic expression result that’s generated has a decimal point after it. And there’s even a +0. at the end. What’s going on there? Well, that was one of the big secrets Tini was very keen to tell me.

“Floating-point computation is so much faster than integer”, he said. “You should do everything you can in floating point. Only convert it back to exact numbers at the end.” And, yes, it was true that the scientific computers of the time—like the CDC machines he used—had very much been optimized for floating-point arithmetic. He quoted instruction times. He explained that if you do all your arithmetic in “fast floating point”, and then get a 64-bit floating point number out at the end, you can always reverse engineer what rational number it was supposed to be—and that it’s much faster to do this than to keep track of rational numbers exactly through the computation.
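In today’s Wolfram Language terms, by the way, that final “reverse engineering” step is essentially what Rationalize does, recovering an exact rational from a nearby machine number:

Rationalize[0.3333333333333333, 10^-12]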

It was a neat hack. And I bought it. And in fact when we implemented SMP its default “exact” arithmetic worked exactly this way. I’m not sure if it really made computations more efficient. In the end, it got quite tangled up with the rather abstract and general design of SMP, and became something of a millstone. But actually we use somewhat similar ideas in modern Wolfram Language (albeit now with formally verified interval arithmetic) for doing exact computations with things like algebraic numbers.

Tini as Physicist

I’m not sure I ever talked much to Tini about the content of physics; somehow we always ended up discussing computers (or the physics community) instead. But I certainly made use of Tini’s efforts to streamline not just the computer implementation but also the underlying theory of Feynman diagram calculation. I expect—as I have seen so often—that his efforts to streamline the underlying theory were driven by thinking about things in computational terms. But I made particular use of the “Diagrammar” he produced in 1972:

Click to enlarge

One of the principal methods that was introduced here was what’s called dimensional regularization: the concept of formally computing results in d-dimensional space (with d a continuous variable), then taking the limit d → 4 at the end. It’s an elegant approach, and when I was doing particle physics in the late 1970s, I became quite an enthusiast of it. (In fact, I even came up with an extension of it—based on looking at the angular structure of Gegenbauer functions as d-dimensional spherical functions—that was further developed by Tony Terrano, who worked with me on Feynman diagram computation, and later on SMP.)

Back in the 1970s, “continuing to d dimensions” was just thought of as a formal trick. But, curiously enough, in our Physics Project, where dimension is an emergent property, one’s interested in “genuinely d-dimensional” space. And, quite possibly, there are experimentally observable signatures of d ≠ 3 dimensions of space. And in thinking about that, quite independent of Tini, I was just about to pull out my copy of “Diagrammar” again.

Tini’s original motivation for writing SCHOONSCHIP had been a specific calculation involving the interaction of photons and putative “intermediate vector bosons”. But by the late 1960s, there was an actual theoretical candidate for what the “intermediate vector boson” might be: a “gauge boson” basically associated with the “gauge symmetry group” SU(2)—and given mass through “spontaneous symmetry breaking” and the “Higgs mechanism”.

But what would happen if one did Feynman diagram calculations in a “gauge theory” like this? Would they have the renormalizability property that Richard Feynman had identified in QED, and that allowed one to not worry about infinities that were nominally generated in calculations? Tini Veltman wanted to figure this out, and soon suggested the problem to his student Gerard ’t Hooft (his coauthor on “Diagrammar”).

Tini had defined the problem and formalized what was needed, but it was ’t Hooft who figured out the math and in the end presented a rather elaborate proof of renormalizability of gauge theories in his 1972 PhD thesis. It was a major and much-heralded result—providing what was seen as key theoretical validation for the first part of what became the Standard Model of particle physics. And it launched ’t Hooft’s career.

Tini Veltman always gave me the impression of someone who wanted to interact—and collaborate—with people. Gerard ’t Hooft has always struck me as being more in the “lone wolf” model of doing physics. I’ve interacted with Gerard from time to time for years (in fact I first met him several years before I met Tini). And it’s been very impressive to see him invent a long sequence of some of the most creative ideas in physics over the past half century. And though it’s not my focus here, I should mention that Gerard got interested in cellular automata in the late 1980s, and a few years ago even wrote a book called The Cellular Automaton Interpretation of Quantum Mechanics. I’d never quite understood what he was talking about, and I suspected that—despite his use of Mathematica—he’d never explored the computational universe enough to develop a true intuition for what goes on there. But actually, quite recently, it looks as if there’s a limiting case of our Physics Project that may just correspond to what Gerard has been talking about—which would be very cool…

But I digress. Starting in 1966 Tini was a professor at Utrecht. And in 1974 Gerard became a professor there too. And even by the time I met Tini in 1979 there were already rumors of a falling-out. Gerard was reported as saying that Tini didn’t understand stuff. Tini was reported as saying that Gerard was “a monster”. And then there was the matter of the Nobel Prize.

As the Standard Model gained momentum, and was increasingly validated by experiments, the proof of renormalizability of gauge theories started to seem more and more like it would earn a Nobel Prize. But who would actually get the prize? Gerard was clear. But what about Tini? There were rumors of letters arguing one way and the other, and stories of scurrilous campaigning.

Prizes are always a complicated matter. They’re usually created to incentivize something, though realistically they’re often as much as anything for the benefit of the giver. But if they’re successful, they tend to come to represent objectives in themselves. Years ago I remember the wife of a well-known physicist advising me to “do something you can win a prize for”. It didn’t make sense to me, and then I realized why. “I want to do things”, I said, “for which nobody’s thought to invent a prize yet”.

Well, the good news is that in 1999, the Nobel Committee decided to award the Nobel Prize to both Gerard and Tini. “Thank goodness” was the general sentiment.

Tini Goes to Michigan

When I visited Tini in Utrecht in 1979 I got the impression that he and his family were very deeply rooted in the Netherlands and would always be there. I knew that Tini had spent time at CERN, and I think I vaguely knew that he’d been quite involved with the neutrino experiments there. But I didn’t know that SCHOONSCHIP wasn’t originally written when Tini was at Utrecht or at CERN: despite the version in the CERN Program Library saying it was “Written in 1967 by M. Veltman at CERN”, the first version was actually written in 1963, right in the heart of what would become Silicon Valley, during the time Tini worked at the then-very-new Stanford Linear Accelerator Center.

He’d gone there along with John Bell (of Bell’s inequalities fame), whose “day job” was working on theoretical aspects of neutrino experiments. (Thinking about the foundations of quantum mechanics was not well respected by other physicists at the time.) Curiously, another person at Stanford at the time was Tony Hearn, who was one of the physicists in the discussion on the terrace at CERN. But unlike Tini, he fell into the computer science and John McCarthy orbit at Stanford, and wrote his REDUCE program in LISP.

By the way, in piecing together the story of Tini’s life and times, I just discovered another “small world” detail. It turns out back in 1961 an early version of the intermediate boson calculations that Tini was interested in had been done by two famous physicists, T. D. Lee and C. N. Yang—with the aid of a computer. And they’d been helped by a certain Peter Markstein at IBM—who, along with his wife Vicky Markstein, would be instrumental nearly 30 years later in getting Mathematica to run on IBM RISC systems. But in any case, back in 1961, Lee and Yang apparently wouldn’t give Tini access to the programs Peter Markstein had created—which was why Tini decided to make his own, and to write SCHOONSCHIP to do it.

But back to the main story. I suspect it was a result of the rift with Gerard ’t Hooft. But in 1980, at the age of 50, Tini transplanted himself and his family from Utrecht to the University of Michigan in Ann Arbor, Michigan. He spent quite a bit of his time at Fermilab near Chicago, in and around neutrino experiments.

But it was in Michigan that I had my next major interaction with Tini. I had started building SMP right after I saw Tini in 1979—and after a somewhat tortuous effort to choose between CERN and Caltech, I had accepted a faculty position at Caltech. In early 1981 Version 1.0 of SMP was released. And in the effort to figure out how to develop it further—with the initial encouragement of Caltech—I ended up starting my first company. But soon (through a chain of events I’ve described elsewhere) Caltech had a change of heart, and in June 1982 I decided I was going to quit Caltech.

I wrote to Tini—and, somewhat to my surprise, he quickly began to aggressively try to recruit me to Michigan. He wrote me asking what it would take to get me there. Third on his list was the type of position, “research or faculty”. Second was “salary”. But first was “computers”—adding parenthetically “I understand you want a VAX; this needs some detailing”. The University of Michigan did indeed offer me a nice professorship, but—choosing among several possibilities—I ended up going to the Institute for Advanced Study in Princeton, with the result that I never had the chance to interact with Tini at close quarters.

A few years later, I was working on Mathematica, and what would become the Wolfram Language. And, no, I didn’t use a floating-point representation for algebraic coefficients again. Mathematica 1.0 was released in 1988, and shortly after that Tini told me he was writing a new version of SCHOONSCHIP, in a different language. “What language?”, I asked. “68000 assembler”, he said. “You can’t be serious!” I said. But he was, and soon thereafter a new version of SCHOONSCHIP appeared, written in 68000 assembler.

I think Tini somehow never really fully trusted anything higher level than assembler—proudly telling me things he could do by writing right down “at the metal”. I talked about portability. I talked about compiler optimizers. But he wasn’t convinced. And at the time, perhaps he was still correct. But just last week, for example, I got the latest results from benchmarking the symbolic compiler that we have under development for the Wolfram Language: the compiled versions of some pieces of top-level code run 30x faster than custom-written C code. Yes, the machine is probably now smarter even than Tini at being able to create fast code.

At Michigan, alongside his more directly experimentally related work (which, as I now notice, even included a paper related to a particle physics result of mine from 1978), Tini continued his longtime interest in Feynman diagrams. In 1989, he wrote a paper called “Gammatrica”, about the Dirac gamma matrix computations that are the core of many Feynman diagram calculations. And then in 1994 a textbook called Diagrammatica—kind of like “Diagrammar” but with a Mathematica-rhyming ending.

Tini didn’t publish all that many papers but spent quite a bit of time helping set directions for the US particle physics community. Looking at his list of publications, though, one that stands out is a 1991 paper written in collaboration with his daughter Hélène, who had just got her physics PhD at Berkeley (she subsequently went into quantitative finance): “On the Possibility of Resonances in Longitudinally Polarized Vector Boson Scattering”. It’s a nice paper, charmingly “resonant” with things Tini was thinking about in 1961, even comparing the interactions of W particles with interactions between pions of the kind that were all the rage in 1961.

The Later Tini

Tini retired from Michigan in 1996, returned to the Netherlands and set about building a house. The long-awaited Nobel Prize arrived in 1999.

In 2003 Tini published a book, Facts and Mysteries in Elementary Particle Physics, presenting particle physics and its history for a general audience. Interspersed through the book are one-page summaries of various physicists—often with charming little “gossip” tidbits that Tini knew from personal experience, or picked up from his time in the physics community.

One such page describing some experimental physicists ends:

“The CERN terrace, where you can see the Mont Blanc on the horizon, is very popular among high-energy physicists. You can meet there just about everybody in the business. Many initiatives were started there, and many ideas were born in that environment. So far you can still smoke a cigar there.”

The page has a picture, taken in June 1962—that I rather imagine must mirror what that “symbolic computation origin discussion” looked like (yes, physicists wore ties back then):

“Symbolic computation origin discussion”

Just after Tini won the Nobel Prize, he ran into Rolf Mertig, who was continuing Tini’s tradition of Feynman diagram computation by creating the FeynCalc system for the Wolfram Language. Tini apparently explained that had he not gone into physics, he would have gone into “business”.

I’m not sure if it was before Mathematica 1.0 or after, but I remember Tini telling me that he thought that maybe he should get into the software business. I think Tini felt in some ways frustrated with physics. I remember when I first met him back in 1979 he spent several hours telling me about issues at CERN. One of the important predictions of what became the Standard Model was the existence of so-called neutral currents (associated with the Z boson). In the end, neutral currents were discovered in 1973. But Tini explained that many years earlier he had started telling people at CERN that they should be able to see neutral currents in their experiments. But for years they didn’t listen to him, and when they finally did, it turned out that—expensive as their earlier experiments had been—they’d thrown out the bubble chamber film that had been produced, and on which neutral currents should have been visible perhaps 15 years earlier.

When Tini won his Nobel Prize, I sent him a congratulations card. He sent a slightly stiff letter in response:


When Tini met me in 1979, I’m not sure he expected me to just take off and ultimately build something like Mathematica. But his input—and encouragement—back in 1979 was important in giving me the confidence to start down that road. So, thanks Tini for all you did for me, and for the advice—even though your PS advice I think I still haven’t taken….


Multiway Turing Machines


Wolfram Physics Bulletin

Informal updates and commentary on progress in the Wolfram Physics Project


Over the years I’ve studied the simplest ordinary Turing machines quite a bit, but I’ve barely looked at multiway Turing machines (also known as nondeterministic Turing machines or NDTMs). Recently, though, I realized that multiway Turing machines can be thought of as “maximally minimal” models both of concurrent computing and of the way we think about quantum mechanics in our Physics Project. So now this piece is my attempt to “do the obvious explorations” of multiway Turing machines. And as I’ve found so often in the computational universe, even cases with some of the very simplest possible rules yield some significant surprises....

Ordinary vs. Multiway Turing Machines

An ordinary Turing machine has a rule such as

RulePlot[TuringMachine[2506]]

that specifies a unique successor for each configuration of the system (here shown going down the page starting from an initial condition consisting of a blank tape):

RulePlot[TuringMachine[2506], {{1, 6}, Table[0, 10]}, 20, 
 Mesh -> True, Frame -> False]

After 100 Years, Can We Finally Crack Post’s Problem of Tag? A Story of Computational Irreducibility, and More


“[Despite] Considerable Effort… [It Proved] Intractable”

In the early years of the twentieth century it looked as if—if only the right approach could be found—all of mathematics might somehow systematically be solved. In 1910 Whitehead and Russell had published their monumental Principia Mathematica showing (rather awkwardly) how all sorts of mathematics could be represented in terms of logic. But Emil Post wanted to go further. In what seems now like a rather modern idea (with certain similarities to the core structure of the Wolfram Language, and very much like the string multiway systems in our Physics Project), he wanted to represent the logic expressions of Principia Mathematica as strings of characters, and then have possible operations correspond to transformations on these strings.

In the summer of 1920 it was all going rather well, and Emil Post as a freshly minted math PhD from Columbia arrived in Princeton to take up a prestigious fellowship. But there was one final problem. Having converted everything to string transformations, Post needed to have a theory of what such transformations could do.

He progressively simplified things, until he reached what he called the problem of “tag”. Take a string of 0s and 1s. Drop its first ν elements. Look at the first dropped element. If it’s a 0, add a certain block of elements at the end of the string; if it’s a 1, add another block. Post solved several cases of this problem.
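In modern Wolfram Language terms, a single step of such a tag system can be sketched as follows (the function name and the association form are mine, just for illustration; ν = 3 with blocks 00 and 1101 is the case that is about to appear):

(* sketch: one tag-system step with deletion number nu and appended blocks keyed by the first element *)
tagSystemStep[nu_Integer, blocks_Association][s_List] /; Length[s] >= nu :=
 Join[Drop[s, nu], blocks[First[s]]]

tagSystemStep[3, <|0 -> {0, 0}, 1 -> {1, 1, 0, 1}|>][{1, 0, 0, 1, 0}]
(* {1, 0, 1, 1, 0, 1} *)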

But then he came across the one he described as 0 → 00, 1 → 1101, with ν = 3. Here’s an example of its behavior:

Style[Text[
  Column[Row /@ 
    NestList[
     Replace[#, {{0, _, _, s___} -> {s, 0, 0}, {1, _, _, s___} -> {s, 
          1, 1, 0, 1}}] &, IntegerDigits[5, 2, 3], 10], 
   Spacings -> .2]], FontFamily -> "Roboto"]

After a few steps it just ends up in a simple loop, alternating forever between two strings. Here’s another example, starting now from a different string:

Style[Text[
  Column[Row /@ 
    NestList[
     Replace[#, {{0, _, _, s___} -> {s, 0, 0}, {1, _, _, s___} -> {s, 
          1, 1, 0, 1}}] &, IntegerDigits[18, 2, 5], 30]]], 
 FontFamily -> "Roboto"]

Again this ends up in a loop, now involving 6 possible strings.

But what happens in general? To Post, solving this problem was a seemingly simple stepping stone to his program of solving all of mathematics. And he began on it in the early summer of 1921, no doubt expecting that such a simple-to-state problem would have a correspondingly simple solution.

But rather than finding a simple solution, he instead discovered that he could make little real progress. And after months of work he finally decided that the problem was in fact, as he later said, “hopeless”—and as a result, he concluded, so was his whole approach to “solving mathematics”.

What had happened? Well, Post had seen a glimpse of a completely unanticipated but fundamental feature of what we now call computation. A decade later what was going on became a little clearer when Kurt Gödel discovered Gödel’s theorem and undecidability. (As Post later put it: “I would have discovered Gödel’s theorem in 1921—if I had been Gödel.”) Then as the years went by, and Turing machines and other kinds of computational systems were introduced, tag systems began to seem more about computation than about mathematics, and in 1961 Marvin Minsky proved that in fact a suitably constructed tag system could be made to do any computation that any Turing machine could do.

But what about Post’s particular, very simple tag system? It still seemed very surprising that something so simple could behave in such complicated ways. But sixty years after Post’s work, when I started to systematically explore the computational universe of simple programs, it began to seem a lot less surprising. For—as my Principle of Computational Equivalence implies—throughout the computational universe, above some very low threshold, even in systems with very simple rules, I was seeing the phenomenon of computational irreducibility, and great complexity of behavior.

But now a century has passed since Emil Post battled with his tag system. So armed with all our discoveries—and all our modern tools and technology—what can we now say about it? Can we finally crack Post’s problem of tag? Or—simple as it is—will it use the force of computational irreducibility to resist all our efforts?

This is the story of my recent efforts to wage my own battle against Post’s tag system.

The Basic Setup

The Wolfram Language can be seen in part as a descendent of Post’s idea of representing everything in terms of transformation rules (though for symbolic expressions rather than strings). So it’s no surprise that Post’s problem of tag is very simple to set up in the Wolfram Language:

NestList[Replace[{
    {0, _, _, s___} -> {s, 0, 0},
    {1, _, _, s___} -> {s, 1, 1, 0, 1}
    }], {1, 0, 0, 1, 0}, 10] // Column


Given the initial string, the complete behavior is always determined. But what can happen? In the examples above, what we saw is that after some “transient” the system falls into a cycle which repeats forever.
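For following along without the packaged functions used for the plots below, here is a minimal standalone sketch (the names are mine) that evolves the system and measures the transient and cycle lengths with the built-in FindTransientRepeat:

(* minimal sketch of one step of the 0 -> 00, 1 -> 1101 tag system; strings shorter than 3 just stay put *)
tagStep[s_List] := If[Length[s] < 3, s,
  Join[Drop[s, 3], If[First[s] == 1, {1, 1, 0, 1}, {0, 0}]]]

(* lengths of the transient and of the final cycle, searching up to tmax steps *)
transientCycleLengths[init_List, tmax_ : 1000] :=
 Length /@ FindTransientRepeat[NestList[tagStep, init, tmax], 4]

transientCycleLengths[{1, 0, 0, 1, 0}]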

Here’s a plot for all possible initial strings up to length 7. In each case there’s a transient and a cycle, with lengths shown in the plot (with cycle length stacked on top of transient length):

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						With[{list = Catenate[Table[Tuples[{0, 1}, n], {n, 7}]]}, 
 ListStepPlot[
  Transpose[((Length /@ 
        FindTransientRepeat[TSDirectEvolveList[#, 1000], 4]) & /@ 
     list)], Center, PlotRange -> {0, 28}, 
  PlotStyle -> {Hue[0.1, 1, 1], Hue[0.02, 0.92, 0.8200000000000001]}, 
  PlotLayout -> "Stacked", Joined -> True, Filling -> Automatic, 
  Frame -> True, AspectRatio -> 1/5, 
  FrameTicks -> {{Automatic, 
     None}, {Extract[
      MapThread[
       List[#1, 
         Rotate[Style[StringJoin[ToString /@ #2], 
           FontFamily -> "Roboto", Small], 90 Degree]] &, {Range[0, 
         253], list}], 
      Position[list, 
       Alternatives @@ 
        Select[list, 
         IntegerExponent[FromDigits[#, 2], 2] > Length[#]/2 && 
           Length[#] > 1 &]]], None}}]]

(Note that if the system reaches 00—or another string with fewer than 3 characters—one can either say that it has a cycle of length 1, or that it stops completely, effectively with a cycle of length 0.) For initial strings up to length 7, the nontrivial cycles observed are of lengths 2 and 6.

Starting from 10010 as above, we can show the behavior directly—or we can try to compensate for the removal of elements from the front at each step by rotating at each step:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						MapIndexed[
 With[{func = #1, ind = #2}, 
   ArrayPlot[
    MapIndexed[func, 
     PadRight[TSDirectEvolveList[{1, 0, 0, 1, 0}, 40], 
      If[First[ind] == 1, {Automatic, 17}, Automatic], .25]], 
    Mesh -> True, MeshStyle -> GrayLevel[0.75, 0.75], Frame -> False, 
    ImageSize -> {Automatic, 240}]] &, {# &, 
  RotateLeft[#, 3 (First[#2] - 1)] &}]

We can also show only successive “generations” in which the rule has effectively “gone through the whole string”:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ArrayPlot[
 PadRight[TSGenerationEvolveList[{1, 0, 0, 1, 0}, 30], {Automatic, 
   17}, .25], Mesh -> True, MeshStyle -> GrayLevel[0.75, .75], 
 Frame -> False, ImageSize -> {100, Automatic}]

Let’s continue to longer initial sequences. Here are the lengths of transients and cycles for initial sequences up to length 12:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						With[{list = Catenate[Table[Tuples[{0, 1}, n], {n, 12}]]}, 
 ListStepPlot[
  Transpose[((Length /@ 
        FindTransientRepeat[TSDirectEvolveList[#, 1000], 4]) & /@ 
     list)], Center, PlotRange -> All, 
  PlotStyle -> {Hue[0.1, 1, 1], Hue[0.02, 0.92, 0.8200000000000001]}, 
  PlotLayout -> "Stacked", Joined -> True, Filling -> Automatic, 
  Frame -> True, AspectRatio -> 1/6, 
  FrameTicks -> {{Automatic, 
     None}, {Extract[
      MapThread[
       List[#1, 
         Rotate[Style[StringJoin[ToString /@ #2], 
           FontFamily -> "Roboto", Small], 90 Degree]] &, {Range[0, 
         8189], list}], 
      Position[list, 
       Alternatives @@ 
        Select[list, 
         IntegerExponent[FromDigits[#, 2], 2] > Length[#]/1.3 && 
           Length[#] > 7 &]]], None}}]]

All the cycles are quite short—in fact they’re all of lengths 0, 2, 4, 6 or 10. And for initial strings up to length 11, the transients (which we can think of as “halting times”) are at most of length 28. But at length 12 the string 100100100000 suddenly gives a transient of length 419, before finally evolving to the string 00.

Here’s a plot of the sequence of lengths of intermediate strings produced in this case (the maximum length is 56):

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ListStepPlot[
 Length /@ 
  TSDirectEvolveList[{1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0}, 501], 
 Filling -> Axis, Frame -> True, AspectRatio -> 1/3, 
 PlotStyle -> Hue[0.07, 1, 1]]

And, by the way, this gives an indication of why Post called this the “problem of tag” (at the suggestion of his colleague Bennington Gill). Elements keep on getting removed from the “head” of the string, and added to its “tail”. But will the head catch up with the tail? When it does, it’s like someone winning a game of tag, by being able to “reach the last person”.

Here’s a picture of the detailed behavior in the case above:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						(Row[ArrayPlot[#, ImageSize -> {100, Automatic}] & /@ 
     Partition[
      MapIndexed[#, 
       PadRight[
        TSDirectEvolveList[{1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0}, 501],
         Automatic, .25]], UpTo[210]]]) & /@ {# &, 
  RotateLeft[#, 3 (First[#2] - 1)] &}

And here’s the “generational” plot, now flipped around to go from left to right:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ArrayPlot[
 Reverse@Transpose@
   PadRight[
    TSGenerationEvolveList[{1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0}, 
     50], {Automatic, 58}, .25], Mesh -> True, 
 MeshStyle -> GrayLevel[0.75, .75], Frame -> False]

By the way, we can represent the complete history of the tag system just by concatenating the original string with all the blocks of elements that are added to it, never removing blocks of elements at the beginning. In this case this is the length-1260 string we get:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						Style[StringJoin[
  ToString /@ 
   TSDirectEvolveSequence[{1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0}, 440]],
  FontFamily -> "Roboto", 8]

Plotting the “walk” obtained by going up at each 1 and down at each 0 we get (and not surprisingly, this is basically the same curve as the sequence of total string lengths above):

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ListLinePlot[
 Accumulate[
  2 TSDirectEvolveSequence[{1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0}, 
     440] - 1], Frame -> True, AspectRatio -> 1/3, 
 PlotStyle -> Hue[0.07, 1, 1]]

How “random” is the sequence of 0s and 1s? There are a total of 615 1s and 645 0s in the whole sequence—so roughly equal. For length-2 blocks, there are only about 80% as many 01s and 10s as 00s and 11s. For length-3 blocks, the disparities are larger, with only 30% as many 001 blocks occurring as 000 blocks.
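Counts like these are easy to reproduce (a sketch; blockCounts is my name, and seq stands for the length-1260 sequence shown above):

(* sketch: counts of overlapping length-k blocks in a list seq of 0s and 1s *)
blockCounts[seq_List, k_Integer] := Counts[Partition[seq, k, 1]]
(* e.g. blockCounts[seq, 1], blockCounts[seq, 2], blockCounts[seq, 4] *)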

And then at length 4, there is something new: none of the blocks

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						Text[Row /@ 
  Complement[Tuples[{1, 0}, 4], 
   Union[Partition[
     TSDirectEvolveSequence[{1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0}, 
      450], 4, 1]]]]

ever appear at all, and 0010 appears only twice, both at the beginning of the sequence. Looking at the rule, it’s easy to see why, for example, 1111 can never occur—because no sequence of the 00s and 1101s inserted by the rule can ever produce it. (We’ll discuss block occurrences more below.)

OK, so we’ve found some fairly complicated behavior even with initial strings of length 12. But what about longer strings? What can happen with them? Before exploring this, it’s useful to look in a little more detail at the structure of the underlying problem.

The Space of Possible States

To find out what can happen in our tag system, we’ve enumerated all possible initial strings up to certain lengths. But it turns out that there’s a lot of redundancy in this—as our plots of “halting times” above might suggest. And the reason is that the way the tag system operates, only every third element in the initial string actually ever matters. As far as the rule is concerned we can just fill in _ for the other elements:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						Style[Text[
  Column[Row /@ 
    NestList[
     Join[Drop[#, 3], {{0, 0}, {1, 1, 0, 1}}[[
        1 + First[#]]]] &, {0, _, _, 1, _, _, 1, _, _, 1, _, _, 
      1, _, _}, 10]]], FontFamily -> "Roboto"]

The _’s will steadily be “eaten up”, and whether they were originally filled in with 0s or 1s will never matter. So given this, we don’t lose any information by using a compressed representation of the strings, in which we specify only every third element:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						Style[Text[
  Grid[Transpose@{Row /@ (MapAt[
          Style[#1, Bold] &, #, {1 ;; -1 ;; 3}] & /@ 
        NestList[
         Join[Drop[#, 3], {{0, 0}, {1, 1, 0, 1}}[[
            1 + First[#]]]] &, {0, 0, 1, 1, 1, 0, 1, 1, 0, 1, 0, 0, 1,
           0, 0}, 10]), 
     Row[{Style[Row[Take[#, 1 ;; -1 ;; 3]], Bold], 
         Style[Row[{Style[":", Gray], Mod[Length[#], 3]}], 
          Small]}] & /@ 
      NestList[
       Join[Drop[#, 3], {{0, 0}, {1, 1, 0, 1}}[[1 + First[#]]]] &, {0,
         0, 1, 1, 1, 0, 1, 1, 0, 1, 0, 0, 1, 0, 0}, 10]}, 
   Dividers -> Center, FrameStyle -> LightGray, Alignment -> Left]], 
 FontFamily -> "Roboto"]

But actually this isn’t quite enough. We also need to say the “phase” of the end of the string: the number of trailing elements after the last block of 3 elements (i.e. the length of the original string mod 3).
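As a small sketch of this compression (the function name is mine): keep every third element, starting from the first, and record the length mod 3 as the phase:

(* sketch: convert an uncompressed string to the {phase, elements} form used below *)
toCompressed[s_List] := {Mod[Length[s], 3], s[[1 ;; -1 ;; 3]]}

toCompressed[{1, 0, 0, 1, 0}]    (* {2, {1, 1}}, i.e. the compressed form 11:2 *)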

So now we can start enumerating non-redundant possible initial strings, specifying them in the compressed representation:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						Grid[Transpose@
  Partition[
   Text[Style[#, FontFamily -> "Roboto"]] & /@ 
    PhasedStringForm /@ EnumerateInits[3], 6], Spacings -> {1.5, .2}]

Given a string in compressed form, we can explicitly compute its evolution. The effective rules are a little more complicated than for the underlying uncompressed string, but for example the following will apply one step of evolution to any compressed string (represented in the form {phase, elements}):

Replace[
 {{0, {0, s___}} -> {2, {s, 0}}, {0, {1, s___}} -> {1, {s, 1, 1}},
  {1, {0, s___}} -> {0, {s}}, {1, {1, s___}} -> {2, {s, 0}},
  {2, {0, s___}} -> {1, {s, 0}}, {2, {1, s___}} -> {0, {s, 1}}}]
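As a usage sketch, one can give this operator a name and iterate it, say starting from {2, {1, 1}}, the compressed form 11:2 of the string 10010:

(* usage sketch: name the one-step operator above and iterate it on a compressed state *)
step = Replace[
   {{0, {0, s___}} -> {2, {s, 0}}, {0, {1, s___}} -> {1, {s, 1, 1}},
    {1, {0, s___}} -> {0, {s}}, {1, {1, s___}} -> {2, {s, 0}},
    {2, {0, s___}} -> {1, {s, 0}}, {2, {1, s___}} -> {0, {s, 1}}}];
NestList[step, {2, {1, 1}}, 10]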

Can we reconstruct an uncompressed string from a compressed one? Well, no, not uniquely. Because the “intermediate” elements that will be ignored by the rule aren’t specified in the compressed form. Given, say, the compressed string 10:2 we know the uncompressed string must be of the form 1__0_ but the _’s aren’t determined. However, if we actually run the rule, we get

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						Style[Text[
  Column[Row /@ 
    TSDirectEvolveList[FromPhaseForm[{2, {1, 0}}, _], 3]]], 
 FontFamily -> "Roboto"]

so that the blanks in effect quickly resolve. (By the way, given a compressed string with phase 0, the uncompressed string ends with two unspecified elements after the last specified one; for phase 1 it ends with the last specified element itself; and for phase 2 it ends with a single unspecified element, with the uncompressed string length mod 3 being equal to the phase.)

So taking all compressed strings up to length 4 here is the sequence of transient and cycle lengths obtained:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ListStepPlot[
 Transpose[((Length /@ 
       FindTransientRepeat[TSDirectEvolveList[#, 1000], 4]) & /@ 
    Catenate[Table[DistinctInits[i], {i, 4}]])], Center, 
 PlotRange -> {0, Automatic}, PlotLayout -> "Stacked", 
 PlotStyle -> {Hue[0.1, 1, 1], Hue[0.02, 0.92, 0.8200000000000001]}, 
 Joined -> True, Filling -> Automatic, Frame -> True, 
 AspectRatio -> 1/5]

The first case that is cut off in the plot has halting time 419; it corresponds to the compressed string 1110:0.

We can think of compressed strings as corresponding to possible non-redundant “states” of the tag system. And then we can represent the global evolution of the system by constructing a state transition graph that connects each state to its successor in the evolution. Here is the result starting from distinct length-3 strings (here shown in uncompressed form; the size of each node reflects the length of the string):

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						With[{g = 
   VertexDelete[NestGraph[TSStep, {{0, 0, 0}, {1, 0, 0}}, 400], 0]}, 
 HighlightGraph[
  Graph[g, VertexSize -> (# -> .1 Sqrt[Length[#]] & /@ VertexList[g]),
    VertexStyle -> Hue[0.58, 0.65, 1], 
   EdgeStyle -> Hue[0.58, 1, 0.7000000000000001], 
   VertexLabels -> (# -> Placed[Row[#], Above] & /@ 
      VertexList[g])], {Style[
    Subgraph[g, FindCycle[g, {1, Infinity}, All]], Thick, Hue[
    0.02, 0.92, 0.8200000000000001]], 
   Pick[VertexList[g], VertexOutDegree[g], 0]}]]

There is a length-2 cycle, indicated in red, and also a “terminating state” indicated in yellow. Here’s the state transition graph starting with all length-1 compressed strings (i.e. non-redundant uncompressed strings with lengths between 3 and 5)—with nodes now labeled just with the (uncompressed) length of the string that they represent:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						With[{g = VertexDelete[NestGraph[TSStep, DistinctInits[1], 400], 0]}, 
 HighlightGraph[
  Graph[g, VertexSize -> (# -> .1 Sqrt[Length[#]] & /@ VertexList[g]),
    VertexStyle -> Hue[0.58, 0.65, 1], 
   EdgeStyle -> Hue[0.58, 1, 0.7000000000000001], 
   VertexLabels -> (# -> Placed[Length[#], Above] & /@ 
      VertexList[g])], {Style[
    Subgraph[g, FindCycle[g, {1, Infinity}, All]], Thick, Hue[
    0.02, 0.92, 0.8200000000000001]], 
   Pick[VertexList[g], VertexOutDegree[g], 0]}]]

We see the same length-2 cycle and terminating state as we saw before. But now there is also a length-6 cycle. The original “feeder” for this length-6 cycle is the string 10010 (compressed: 11:2), which takes 16 steps to reach the cycle.

Here are the corresponding results for compressed initial strings up to successively greater lengths n, with the lengths of cycles labeled:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						GraphicsRow[
 Table[Labeled[
   Framed[Show[
     With[{g = 
        VertexDelete[
         NestGraph[TSStep, Catenate[Table[DistinctInits[i], {i, n}]], 
          700], 0]}, 
      With[{c = FindCycle[g, {1, Infinity}, All]}, 
       HighlightGraph[
        Graph[g, 
         VertexLabels -> 
          Join[(#[[1, 1]] -> 
               Placed[Style[Length[#], 11, 
                 Darker[Hue[
                  0.02, 0.92, 0.8200000000000001], .2]], {Before, 
                 Below}] & /@ 
             c), # -> Style[1, 11, Darker[Yellow, .4]] & /@ 
            Pick[VertexList[g], VertexOutDegree[g], 0]], 
         VertexSize -> (# -> .3 Sqrt[Length[#]] & /@ VertexList[g]), 
         VertexStyle -> Hue[0.58, 0.65, 1], 
         EdgeStyle -> Hue[0.58, 1, 0.7000000000000001]], {Style[
          Subgraph[g, c], Thick, Hue[0.02, 0.92, 0.8200000000000001]],
          Pick[VertexList[g], VertexOutDegree[g], 0]}]]], 
     ImageSize -> {UpTo[250], UpTo[250]}], FrameStyle -> LightGray], 
   Style[Text[
     Row[{Style["n", Italic], " \[LessEqual] ", ToString[n]}]], 
    10]], {n, 2, 3}]]
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						GraphicsColumn[
 Table[Labeled[
   Framed[Show[
     With[{g = 
        VertexDelete[
         NestGraph[TSStep, Catenate[Table[DistinctInits[i], {i, n}]], 
          700], 0]}, 
      With[{c = FindCycle[g, {1, Infinity}, All]}, 
       HighlightGraph[
        Graph[g, VertexStyle -> Hue[0.58, 0.65, 1], 
         EdgeStyle -> Hue[0.58, 1, 0.7000000000000001], 
         VertexLabels -> 
          Join[(#[[1, 1]] -> 
               Placed[Style[Length[#], 11, 
                 Darker[Hue[
                  0.02, 0.92, 0.8200000000000001], .2]], {After, 
                 Above}] & /@ 
             c), # -> Style[1, 11, Darker[Yellow, .4]] & /@ 
            Pick[VertexList[g], VertexOutDegree[g], 0]], 
         VertexSize -> (# -> .6 Sqrt[Length[#]] & /@ VertexList[g]), 
         GraphStyle -> "Default"], {Style[Subgraph[g, c], Thick, Red],
          Pick[VertexList[g], VertexOutDegree[g], 0]}]]], 
     ImageSize -> {UpTo[500], UpTo[200]}], FrameStyle -> LightGray], 
   Style[Text[
     Row[{Style["n", Italic], " \[LessEqual] ", ToString[n]}]], 
    10]], {n, 4, 5}], ImageSize -> {550, Automatic}]

A notable feature of these graphs is that at compressed length 4, a long “highway” appears that goes for about 400 steps. The highway basically represents the long transient first seen for the initial string 1110:0. There is one “on-ramp” for this string, but then there is also a tree of other states that enter the same highway.

Why is there a “highway” in the first place? Basically because the length-419 transient involves strings that are long compared to any we are starting from—so nothing can feed into it after the beginning, and it basically just has to “work itself through” until it reaches whatever cycle it ends up in.

When we allow initial strings with compressed length up to 6 a new highway appears, dwarfing the previous one (by the way, most of the wiggliness we see is an artifact of the graph layout):

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						With[{n = 6}, 
 Labeled[Framed[
   With[{g = 
      VertexDelete[
       NestGraph[TSStep, Catenate[Table[DistinctInits[i], {i, n}]], 
        20000], 0]}, 
    With[{c = FindCycle[g, {1, Infinity}, All]}, 
     HighlightGraph[
      Graph[g, VertexStyle -> Hue[0.58, 0.65, 1], 
       EdgeStyle -> Hue[0.58, 1, 0.7000000000000001], 
       VertexLabels -> 
        Join[(#[[1, 1]] -> 
             Placed[Style[Length[#], 11, 
               Darker[Hue[
                0.02, 0.92, 0.8200000000000001], .2]], {Before, 
               Above}] & /@ 
           c), # -> Style[1, 11, Darker[Yellow, .4]] & /@ 
          Pick[VertexList[g], VertexOutDegree[g], 0]], 
       VertexSize -> (# -> .6 Sqrt[Length[#]] & /@ VertexList[g]), 
       GraphStyle -> "Default"], {Style[Subgraph[g, c], Thick, Red], 
       Pick[VertexList[g], VertexOutDegree[g], 0]}]]], 
   FrameStyle -> LightGray], 
  Style[Text[
    Row[{Style["n", Italic], " \[LessEqual] ", ToString[n]}]], 10]]]

The first initial state to reach this highway is 111010:0 (uncompressed: 100100100000100000)—which after 2141 steps evolves to a cycle of length 28. Here are the lengths of the intermediate strings along this highway (note the cycle at the end):

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ListStepPlot[
 Length /@ 
  TSDirectEvolveList[{1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 
    0, 0}, 2300], Filling -> Axis, Frame -> True, AspectRatio -> 1/3, 
 PlotStyle -> Hue[0.07, 1, 1]]

And here are the “generational states” reached (note that looking only at generations makes the final 28-cycle show up as a 1-cycle):

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ArrayPlot[
 Reverse@Transpose@
   PadRight[
    TSGenerationEvolveList[{1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 
      0, 0, 0, 0}, 80], {Automatic, 180}, .25], Frame -> False]

Or looking at “compressed strings” (i.e. including only every third element of each string):

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ArrayPlot[
 Reverse@Transpose@
   PadRight[
    Last /@ TSGenerationPhaseEvolveList[{1, 0, 0, 1, 0, 0, 1, 0, 0, 0,
        0, 0, 1, 0, 0, 0, 0, 0}, 80], {Automatic, 70}, .25], 
 Frame -> False]

If we consider all initial strings up to compressed length 6, we get the following transient+cycle lengths:


And what we see is that there are particular lengths of transients—corresponding to the highways in the state transition graph above—to which certain strings evolve. If we plot the distribution of halting (i.e. transient) times for all the strings we find, then, as expected, it peaks around the lengths of the main highways:


So given a particular “on-ramp to a highway”—or, for that matter, a state on a cycle—what states will evolve to it? In general there’ll be a tree of states in the state transition graph that are the “predecessors” of a given state—in effect forming its “basin of attraction”.

For any particular string the rule gives a unique successor. But we can also imagine “running the rule backwards”. And if we do this, it turns out that any given compressed string can have 0, 1 or 2 immediate predecessors. For example, 000:0 has the unique predecessor 0000:1. But 001:0 has both 0001:1 and 100:2 as predecessors. And for example 001:1 has no predecessors. (For uncompressed strings, there are always either 0 or 4 immediate predecessors.)
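For the uncompressed case just mentioned, running the rule backwards is easy to sketch explicitly (the function name is mine): a predecessor must have appended the block 00 or 1101 that now sits at the end of the string, and must have had three more elements at the front, the first fixed by the block and the other two arbitrary; that is where the 4 choices come from.

(* sketch: immediate predecessors of an uncompressed string under 0 -> 00, 1 -> 1101 *)
tagPredecessors[s_List] := Join[
  If[Length[s] >= 2 && Take[s, -2] === {0, 0},
   Catenate@Table[Join[{0, x, y}, Drop[s, -2]], {x, 0, 1}, {y, 0, 1}], {}],
  If[Length[s] >= 4 && Take[s, -4] === {1, 1, 0, 1},
   Catenate@Table[Join[{1, x, y}, Drop[s, -4]], {x, 0, 1}, {y, 0, 1}], {}]]

tagPredecessors[{1, 0, 0, 1, 1, 0, 1}]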

Any state that has no predecessors can occur only as the initial string; it can never be generated in the evolution. (There are similar results for substrings, as we’ll discuss later.)

And if we start from a state that does have at least one predecessor, we can in general construct a whole tree of “successively further back” predecessors. Here, for example, is the 10-step tree for 000:2:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						With[{g = 
   Graph[# -> TSPhaseStep[#] & /@ 
     Union[Flatten[
       NestList[
        Flatten[PhaseStepBackwards[#] & /@ #, 1] &, {{0, {0, 0, 0}}}, 
        10], 1]]]}, 
 Graph[g, VertexLabels -> (# -> PhasedStringForm[#] & /@ 
     VertexList[g]), AspectRatio -> 1, 
  VertexStyle -> Hue[0.58, 0.65, 1], 
  EdgeStyle -> Hue[0.58, 1, 0.7000000000000001]]]

Here it is after 30 steps, in two different renderings:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						With[{g = 
   Graph[# -> TSPhaseStep[#] & /@ 
     Union[Flatten[
       NestList[
        Flatten[PhaseStepBackwards[#] & /@ #, 1] &, {{0, {0, 0, 0}}}, 
        30], 1]]]}, 
 Graph[g, VertexStyle -> Hue[0.58, 0.65, 1], 
  EdgeStyle -> Hue[0.58, 1, 0.7000000000000001], 
  GraphLayout -> "LayeredDigraphEmbedding", AspectRatio -> 1/2]]
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						With[{g = 
   Graph[# -> TSPhaseStep[#] & /@ 
     Union[Flatten[
       NestList[
        Flatten[PhaseStepBackwards[#] & /@ #, 1] &, {{0, {0, 0, 0}}}, 
        30], 1]]]}, 
 Graph[g, VertexStyle -> Hue[0.58, 0.65, 1], 
  EdgeStyle -> Hue[0.58, 1, 0.7000000000000001]]]

If we continue this particular tree we’ll basically get a state transition graph for all states that eventually terminate. Not surprisingly, there’s considerable complexity in this tree—though the number of states after t steps does grow roughly exponentially (apparently like ):

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ListStepPlot[
 Length /@ 
  NestList[Flatten[PhaseStepBackwards[#] & /@ #, 
     1] &, {{0, {0, 0, 0}}}, 100], Center, Frame -> True, 
 Filling -> Axis, ScalingFunctions -> "Log", AspectRatio -> 1/3, 
 PlotStyle -> Hue[0.07, 1, 1]]

By the way, there are plenty of states that have finite predecessor trees. For example 1100:0 yields a tree which grows only for 21 steps, then stops:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						Rotate[With[{g = 
    Graph[# -> TSPhaseStep[#] & /@ 
      Union[Flatten[
        NestList[
         Flatten[PhaseStepBackwards[#] & /@ #, 
           1] &, {{0, {1, 1, 0, 0}}}, 21], 1]]]}, 
  Graph[g, GraphLayout -> "LayeredDigraphEmbedding", 
   VertexStyle -> Hue[0.58, 0.65, 1], 
   EdgeStyle -> Hue[0.58, 1, 0.7000000000000001]]], 90 Degree]

The Cycle Structure

At least in all the cases we’ve seen so far, our tag system always evolves to a cycle (or terminates in a trivial state). But what cycles are possible? In effect any cycle state S must be a solution to a “tag eigenvalue equation” of the form T^p[S] = S for some p, where T is the “tag evolution operator”.
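One brute-force way to look for such solutions (a sketch, with names of my own) is simply to test whether a string ever recurs in its own evolution, which for a deterministic rule is exactly the condition for lying on a cycle:

(* sketch: S lies on a cycle iff it recurs in its own evolution, i.e. T^p[S] == S for some p >= 1 *)
tagStep[s_List] := If[Length[s] < 3, s,
  Join[Drop[s, 3], If[First[s] == 1, {1, 1, 0, 1}, {0, 0}]]]
onCycleQ[s_List, tmax_ : 1000] := MemberQ[Rest[NestList[tagStep, s, tmax]], s]

Select[Tuples[{0, 1}, 6], onCycleQ]    (* length-6 uncompressed strings that lie on cycles *)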

Starting with compressed strings of length 1, only one cycle can ever be reached:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						First[With[{n = 1}, 
  With[{g = Graph[DirectedEdge @@@ Partition[#, 2, 1, 1]]}, 
     Graph[g, 
      VertexLabels -> (# -> Placed[Row[#], Above] & /@ 
         VertexList[g]), VertexStyle -> Hue[0.58, 0.65, 1], 
      EdgeStyle -> Hue[0.58, 1, 0.7000000000000001]]] & /@ (Last[
       FindTransientRepeat[TSDirectEvolveList[FromPhaseForm[#], 1000],
         3]] & /@ (First /@ 
       With[{v = PostTagSystem[AllInits[n]]}, 
        Map[v["State", #] &, 
         Map[First] /@ 
          FindCycle[v["StateGraph"], {1, Infinity}, All], {2}]]))]]

Starting with compressed strings of length 2 a 6-cycle appears (here shown labeled respectively with uncompressed and with compressed strings):

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						{Last[With[{n = 2}, 
   With[{g = Graph[DirectedEdge @@@ Partition[#, 2, 1, 1]]}, 
      Graph[g, 
       VertexLabels -> (# -> Placed[Row[#], Above] & /@ 
          VertexList[g]), VertexStyle -> Hue[0.58, 0.65, 1], 
       EdgeStyle -> Hue[0.58, 1, 0.7000000000000001]]] & /@ (Last[
        FindTransientRepeat[
         TSDirectEvolveList[FromPhaseForm[#], 1000], 3]] & /@ (First /@
         With[{v = PostTagSystem[AllInits[n]]}, 
         Map[v["State", #] &, 
          Map[First] /@ 
           FindCycle[v["StateGraph"], {1, Infinity}, All], {2}]]))]], 
 Last[With[{n = 2}, 
   With[{g = Graph[DirectedEdge @@@ Partition[#, 2, 1, 1]]}, 
      Graph[g, VertexStyle -> Hue[0.58, 0.65, 1], 
       EdgeStyle -> Hue[0.58, 1, 0.7000000000000001], 
       VertexLabels -> (# -> 
            Placed[PhasedStringForm[ToPhaseForm[#]], Above] & /@ 
          VertexList[g])]] & /@ (Last[
        FindTransientRepeat[
         TSDirectEvolveList[FromPhaseForm[#], 1000], 3]] & /@ (First /@
         With[{v = PostTagSystem[AllInits[n]]}, 
         Map[v["State", #] &, 
          Map[First] /@ 
           FindCycle[v["StateGraph"], {1, Infinity}, All], {2}]]))]]}

No new cycles appear until one has initial strings of compressed length 4, but then one gets (where now the states are labeled with their uncompressed lengths):

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						With[{n = 4}, 
 Framed[GraphicsRow[
   Sort[With[{g = Graph[DirectedEdge @@@ Partition[#, 2, 1, 1]]}, 
       Graph[g, VertexStyle -> Hue[0.58, 0.65, 1], 
        EdgeStyle -> Hue[0.58, 1, 0.7000000000000001], 
        VertexLabels -> (# -> Placed[Length[#], Above] & /@ 
           VertexList[g])]] & /@ (Last[
         FindTransientRepeat[
          TSDirectEvolveList[FromPhaseForm[#], 1000], 
          3]] & /@ (First /@ 
         With[{v = PostTagSystem[AllInits[n]]}, 
          Map[v["State", #] &, 
           Map[First] /@ 
            FindCycle[v["StateGraph"], {1, Infinity}, All], {2}]]))]],
   FrameStyle -> LightGray]]

The actual cycles are as follows

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						With[{n = 4}, 
 ArrayPlot[PadRight[#, Automatic, .25], Mesh -> True, 
    MeshStyle -> GrayLevel[0.75, 0.75], 
    ImageSize -> {Automatic, Length[#] 11}] & /@ 
  Sort[(ResourceFunction["CanonicalListRotation"][
       Last[FindTransientRepeat[
         TSDirectEvolveList[FromPhaseForm[#], 1000], 
         3]]] & /@ (First /@ 
       With[{v = PostTagSystem[AllInits[n]]}, 
        Map[v["State", #] &, 
         Map[First] /@ 
          FindCycle[v["StateGraph"], {1, Infinity}, All], {2}]]))]]

while the ones from length-5 initial strings are:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						With[{n = 5}, 
 ArrayPlot[PadRight[#, Automatic, .25], Mesh -> True, 
    MeshStyle -> GrayLevel[0.75, 0.75], 
    ImageSize -> {Automatic, Length[#] 7}] & /@ 
  Sort[(ResourceFunction["CanonicalListRotation"][
       Last[FindTransientRepeat[
         TSDirectEvolveList[FromPhaseForm[#], 1000], 
         3]]] & /@ (First /@ 
       With[{v = PostTagSystem[AllInits[n]]}, 
        Map[v["State", #] &, 
         Map[First] /@ 
          FindCycle[v["StateGraph"], {1, Infinity}, All], {2}]]))]]

What larger cycles can occur? It is fairly easy to see that a compressed string consisting of any sequence of the blocks 01 and 1100 will yield a state on a cycle. To find out about uncompressed strings on cycles, we can just apply the rule 0 → 00, 1 → 1101, with the result that we conclude that any sequence of the length-6 and length-12 blocks 001101 and 110111010000 will give a state on a cycle.
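As a quick check of where those uncompressed blocks come from, one can expand the compressed blocks with the rule directly (the same substitution used in the code below):

Flatten[{0, 1} /. {1 -> {1, 1, 0, 1}, 0 -> {0, 0}}]          (* {0, 0, 1, 1, 0, 1} *)
Flatten[{1, 1, 0, 0} /. {1 -> {1, 1, 0, 1}, 0 -> {0, 0}}]    (* {1, 1, 0, 1, 1, 1, 0, 1, 0, 0, 0, 0} *)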

If we plot the periods of cycles against the lengths of their “seed” strings, we get:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ListPlot[Style[
  Catenate[Table[{Length[Flatten[#]], 
       Length[FindRepeat[TSDirectEvolveList[Flatten[#], 1000]]]} & /@ 
     Tuples[{{0, 0, 1, 1, 0, 1}, {1, 1, 0, 1, 1, 1, 0, 1, 0, 0, 0, 
        0}}, n], {n, 10}]], Hue[0.02, 0.92, 0.8200000000000001]], 
 Frame -> True, PlotStyle -> PointSize[.02]]

If we generate cycles from sequences of, say, b of our 01, 1100 blocks, how many of the cycles we get will be distinct? Here are the periods of the distinct cycles for successive b:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						Text[Grid[
  Table[{b, 
    Length /@ 
     Union[ResourceFunction["CanonicalListRotation"][
         FindRepeat[TSDirectEvolveList[Flatten[#], 1000]]] & /@ 
       Tuples[{{0, 0, 1, 1, 0, 1}, {1, 1, 0, 1, 1, 1, 0, 1, 0, 0, 0, 
          0}}, b]]}, {b, 6}], Frame -> All, FrameStyle -> Gray]]

The total number of cycles turns out to be:

DivisorSum[n, k |-> EulerPhi[k] 2^(n/k)]/n
Table[DivisorSum[n, k |-> EulerPhi[k] 2^(n/k)]/n, {n, 15}]

We can also ask an inverse question: of all 2^n (uncompressed) strings of length n, how many of them lie on cycles of the kind we have identified? The answer is the same as the number of distinct “cyclic necklaces” with n beads, each 0 or 1, with no pair of 0s adjacent:

DivisorSum[n, k |-> EulerPhi[n/k] LucasL[k]]/n
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
					Table[DivisorSum[n, k |-> EulerPhi[n/k] LucasL[k]]/n, {n, 20}]

Asymptotically this is about φ^n/n (with φ the golden ratio)—implying that of all 2^n strings of length n only an exponentially small fraction will be on cycles, so that for large n the overwhelming majority of strings will not be on cycles, at least of this kind.

But are there other kinds of cycles? It turns out there are, though they do not seem to be common or plentiful. One family—always of period 6—is seeded by compressed strings of the form 00111 followed by m repeats of 000111 (with uncompressed length 16 + 18m):

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ArrayPlot[PadRight[#, Automatic, .25], Mesh -> True, 
   MeshStyle -> GrayLevel[0.75, 0.75], 
   ImageSize -> {Automatic, Length[#] 8}] & /@ 
 Table[FindRepeat[
   TSDirectEvolveList[
    Flatten[Flatten[{{0, 0, 1, 1, 1}, 
        Table[{0, 0, 0, 1, 1, 1}, m]}] /. {1 -> {1, 1, 0, 1}, 
       0 -> {0, 0}}], 100]], {m, 3}]

But there are other cases too. The first example appears with initial compressed strings of length 9. The length-13 compressed string 0011111110100 (with uncompressed length 39) yields the period-40 cycle (with uncompressed string lengths between 37 and 44):

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ArrayPlot[PadRight[#, Automatic, .25], Mesh -> True, 
   MeshStyle -> GrayLevel[0.75, 0.75], 
   ImageSize -> {Automatic, Length[#] 4}] &[
 FindRepeat[
  TSDirectEvolveList[{0, 0, 0, 0, 0, 1, 1, 0, 1, 1, 1, 0, 1, 0, 0, 1, 
    1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 1, 0, 1, 1, 1, 0, 1, 0, 0, 0, 0, 0, 
    0}, 400]]]

The next example occurs with an initial compressed string of length 15, and a compressed “seed” of length 24—and has period 282:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ArrayPlot[PadRight[#, Automatic, .25], Frame -> False] &[
 FindRepeat[
  TSDirectEvolveList[
   Flatten[{0, 1, 1, 1, 0, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 1, 1, 1, 1, 
      1, 1, 1, 0, 0} /. {1 -> {1, 1, 0, 1}, 0 -> {0, 0}}], 1000]]]

And I’ve found one more example (arising from an initial compressed string of length 18) that has period 66:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ArrayPlot[PadRight[#, Automatic, .25], Frame -> False] &[
 FindRepeat[
  TSDirectEvolveList[
   Flatten[{0, 1, 1, 0, 1, 1, 1, 0, 0, 0, 1, 1, 1, 0, 0, 1, 1, 0, 0, 
      0, 0, 1, 1} /. {1 -> {1, 1, 0, 1}, 0 -> {0, 0}}], 1000]]]

If we look at these cycles in “generational” terms, they are of lengths 3, 11 and 14, respectively (note that the second two pictures above start with “incomplete generations”):

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ArrayPlot[PadRight[#, Automatic, .25], Frame -> False, 
     ImageSize -> {Automatic, 140}] &[
   TSGenerationEvolveList[#, 
    60]] & /@ ((First[Last[#]] &@
      FindTransientRepeat[TSGenerationEvolveList[#, 100], 
       3]) & /@ {{0, 0, 0, 0, 0, 1, 1, 0, 1, 1, 1, 0, 1, 0, 0, 1, 1, 
     0, 1, 0, 0, 1, 1, 0, 1, 1, 1, 0, 1, 1, 1, 0, 1, 0, 0, 0, 0, 0, 
     0}, Flatten[{0, 1, 1, 1, 0, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 1, 1, 
       1, 1, 1, 1, 1, 0, 0} /. {1 -> {1, 1, 0, 1}, 0 -> {0, 0}}], 
    Flatten[{0, 1, 1, 0, 1, 1, 1, 0, 0, 0, 1, 1, 1, 0, 0, 1, 1, 0, 0, 
       0, 0, 1, 1} /. {1 -> {1, 1, 0, 1}, 0 -> {0, 0}}]})

Exploring Further

I don’t know how far Emil Post got in exploring his tag system by hand a century ago. And I rather suspect that we’ve already gone a lot further here than he ever did. But what we’ve seen has just deepened the mystery of what tag systems can do. So far, every initial string we’ve tried has evolved to a cycle (or just terminated). But will this always happen? And how long can it take?

So far, the longest transient we’ve seen is 2141 steps—from the length-6 compressed string 111010:0. Length-7 and length-8 strings at most just “follow the same highway” in the state transition graph, and don’t give longer transients. But at length 9 something different happens: 111111010:0 takes 24,552 steps to evolve to a 6-cycle (with string length 12), with the lengths of intermediate (compressed) strings being:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ListStepPlot[
 Quotient[Length /@ 
   TSDirectEvolveList[PuffOut[{1, 1, 1, 1, 1, 1, 0, 1, 0}], 25300], 
  3], Center, Frame -> True, Filling -> Axis, AspectRatio -> 1/3, 
 PlotStyle -> Hue[0.07, 1, 1], MaxPlotPoints -> 4000]

Plotting (from left to right) the actual elements in compressed strings in each “generation” this shows in more detail what’s “going on inside”:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ArrayPlot[
 Reverse@Transpose[
   PadRight[
    Last /@ TSGenerationPhaseEvolveList[
      PuffOut[{1, 1, 1, 1, 1, 1, 0, 1, 0}], 400], {Automatic, 
     230}, .25]], Frame -> False]

In systematically exploring what can happen in tag systems, it’s convenient to specify initial compressed strings by converting their sequences of 1s and 0s to decimal numbers—but because our strings can have leading 0s we have to include the length, say as a prefix. So with this setup our length-9 “halting time winner” 111111010:0 becomes 9:506:0.
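The conversion back and forth is immediate (a small sketch):

(* the "decimal name" of a compressed string, and the string recovered from it *)
FromDigits[{1, 1, 1, 1, 1, 1, 0, 1, 0}, 2]    (* 506 *)
IntegerDigits[506, 2, 9]                      (* {1, 1, 1, 1, 1, 1, 0, 1, 0}; the 9 restores leading 0s *)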

The next “winner” is 12:3962:0, which takes 253,456 steps to evolve to a 6-cycle:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ListStepPlot[{First[#], Length[#[[2, 2]]]} & /@ 
  With[{re = 
     PostTagSystem[{0, {1, 1, 1, 1, 0, 1, 1, 1, 1, 0, 1, 0}}]}, 
   Table[{i, re["State", i]}, {i, 1, 253456 + 100, 100}]], Center, 
 Frame -> True, Filling -> Axis, AspectRatio -> 1/3, 
 PlotStyle -> Hue[0.07, 1, 1]]

In generational form the explicit evolution in this case is:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ArrayPlot[
 Reverse@Transpose[
   PadRight[
    Last /@ TSGenerationPhaseEvolveList[
      PuffOut[{1, 1, 1, 1, 0, 1, 1, 1, 1, 0, 1, 0}], 950], 
    Automatic, .25]], Frame -> False]

The first case to take over a million steps is 15:30166:0—which terminates after 20,858,103 steps:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						Show[LengthsPlotDecimal[{0, 30166}, 15, 20858103, 4000, 10^6], 
 FrameTicks -> {{Automatic, 
    None}, {Thread[{Range[0, 20][[1 ;; -1 ;; 5]], 
      Append[Range[0, 15][[1 ;; -1 ;; 5]], "20 million"]}], None}}]

The first case to take over a billion steps is 20:718458:0—which leads to a 6-cycle after 2,586,944,112 steps:

CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						Show[LengthsPlotDecimal[{0, 718458}, 20, 2586944112, 1000000], 
 FrameTicks -> {{Automatic, 
    None}, {Thread[{Range[0, 2500][[1 ;; -1 ;; 500]], 
      Append[Range[0, 2000][[1 ;; -1 ;; 500]], "2500 million"]}], 
    None}}]

Here’s a table of all the “longest-so-far” winners through compressed initial length-28 strings (i.e. covering all ≈ 2 × 10^25 ordinary initial strings up to length 84):

Text
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						Text[Grid[
  Prepend[{DecimalStringForm[{First[#], #[[2, 1]]}], #[[2, 2, 1]], 
      If[# == 0, Style[#, Gray], #] &[#[[2, 2, 2]]]} & /@ {{4, {0, 
        14} -> {419, 0}}, {6, {0, 58} -> {2141, 28}}, {9, {0, 
        506} -> {24552, 6}}, {12, {0, 3962} -> {253456, 6}}, {13, {0, 
        5854} -> {341992, 6}}, {15, {0, 16346} -> {20858069, 
        0}}, {15, {0, 30074} -> {357007576, 6}}, {20, {0, 
        703870} -> {2586944104, 6}}, {22, {0, 3929706} -> {2910925472,
         6}}, {24, {0, 12410874} -> {50048859310, 0}}, {25, {0, 
        33217774} -> {202880696061, 
        6, {0, {0, 1, 1, 1, 0, 0}}}}, {27, {0, 
        125823210} -> {259447574536, 
        6, {0, {0, 0, 1, 1, 1, 0, 0, 0, 1, 1, 1}}}}, {28, {2, 
        264107671} -> {643158954877, 
        10, {0, {0, 1, 1, 1, 0, 0, 1, 1, 0, 0}}}}}, 
   Style[#, Italic] & /@ {"initial state", "steps", "cycle length"}], 
  Frame -> All, Alignment -> {{Left, Right, Right}}, 
  FrameStyle -> GrayLevel[.7], Background -> {None, {GrayLevel[.9]}}]]

And here are their “size traces”:

GraphicsGrid
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						GraphicsGrid[
 Partition[
  ParallelMap[
   Show[If[#[[1]] < 9, 
      LengthsPlotDecimalSmall[#[[2, 1]], #[[1]], #[[2, 2, 1]]], 
      LengthsPlotDecimal[#[[2, 1]], #[[1]], #[[2, 2, 1]], 
       8 Quotient[#[[2, 2, 1]], 8000]]], 
     FrameTicks -> 
      None] &, {{4, {0, 14} -> {419, 0}}, {6, {0, 58} -> {2141, 
       28}}, {9, {0, 506} -> {24552, 6}}, {12, {0, 3962} -> {253456, 
       6}}, {13, {0, 5854} -> {341992, 6}}, {15, {0, 
       16346} -> {20858069, 0}}, {15, {0, 30074} -> {357007576, 
       6}}, {20, {0, 703870} -> {2586944104, 6}}, {22, {0, 
       3929706} -> {2910925472, 6}}, {24, {0, 
       12410874} -> {50048859310, 0}}, {25, {0, 
       33217774} -> {202880696061, 6}}, {27, {0, 
       125823210} -> {259447574536, 6}}, {28, {2, 
       264107671} -> {643158954877, 10}}}], UpTo[3]]]

One notable thing here—that we’ll come back to—is that after the first few cases, it’s very difficult to tell the overall scale of these pictures. On the first row, the longest x axis is about 20,000 steps; on the last row it is about 600 billion.

But probably the most remarkable thing is that we now know that for all (uncompressed) initial strings up to length 75, the system always eventually evolves to a cycle (or terminates).

Are They Like Random Walks?

Could the sequences of lengths in our tag system be like random walks? Obviously they can’t strictly be random walks because given an initial string, each entire “walk” is completely determined, and nothing probabilistic or random is introduced.

But what if we look at a large collection of initial conditions? Could the ensemble of observed walks somehow statistically be like random walks? From the basic construction of the tag system we know that at each step the (uncompressed) string either increases or decreases in length by one element depending on whether its first element is 1 or 0.
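
As a minimal sketch of a single (uncompressed) step (postStep is a hypothetical helper name):

(* Post's 00, 1101 tag system: delete 3 elements; append 00 if the first was 0, 1101 if it was 1 *)
postStep[s_List] := Join[Drop[s, 3], If[First[s] == 0, {0, 0}, {1, 1, 0, 1}]];
Differences[Length /@ NestList[postStep, {1, 0, 1, 0, 0, 1, 0, 1, 1, 0}, 8]]
(* each entry is +1 or -1, depending on whether the deleted first element was 1 or 0 *)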

But if we just picked increase or decrease at random here are two typical examples of ordinary random walks we’d get:

(SeedRandom
&#10005
(SeedRandom[#]; 
   ListStepPlot[Accumulate[RandomChoice[{-1, 1}, 2000]], 
    Frame -> True, Filling -> Axis, AspectRatio -> 1/3, 
    ImageSize -> 300, PlotStyle -> Hue[0.07, 1, 1]]) & /@ {3442, 3447}

One very obvious difference from our tag system case is these walks can go below 0, whereas in the tag system case once one’s reached something at least close to 0 (corresponding to a cycle), the walk stops. (In a market analogy, the time series ends if there’s “bankruptcy” where the price hits 0.)

An important fact about random walks (at least in one dimension) is that with probability 1 they always eventually reach any particular value, like 0. So if our tag system behaved enough like a random walk, we might have an argument that it must “terminate with probability 1” (whatever that might mean given its discrete set of possible initial conditions).

But how similar can the sequence generated by a tag system actually be to an ordinary random walk? An important fact is that—beyond its initial condition—any tag system sequence must always consist purely of concatenations of the blocks 00 and 1101, or in other words, the sequence must be defined by a path through the finite automaton:

And from this we can see that—while all 2-grams and 3-grams can occur—the 4-grams 1111, 1100, 0101 and 0010 can never occur. In addition, if we assume that 0s and 1s occur with equal probability at the beginning of the string, then the blocks 00 and 1101 occur with equal probability, but the 3-grams 000, 011 occur with double the probabilities of the others.
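
One way to check which 4-grams can occur is to enumerate them directly from concatenations of the two blocks:

(* all 4-grams that occur in concatenations of the blocks 00 and 1101 *)
blocks = {{0, 0}, {1, 1, 0, 1}};
present = Union[Flatten[Partition[Flatten[#], 4, 1] & /@ Tuples[blocks, 4], 1]];
Complement[Tuples[{0, 1}, 4], present]
(* -> {{0,0,1,0}, {0,1,0,1}, {1,1,0,0}, {1,1,1,1}}: exactly 0010, 0101, 1100 and 1111 never occur *)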

In general the numbers of possible m-grams for successive m are 2, 4, 8, 12, 15, 20, 25, 33, 41, … or for all m ≥ 3:

5 + \!\(\*UnderoverscriptBox
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						\!\(
\*UnderoverscriptBox[\(\[Sum]\), \(i\), \(m\)]\(Fibonacci[Ceiling[
\*FractionBox[\(i\), \(2\)] + 2]]\)\) + 5 == 
 If[EvenQ[m], 2 Fibonacci[m/2 + 4], Fibonacci[(m + 11)/2]] - 1

Asymptotically this grows like φ^(m/2) (φ being the golden ratio), implying a limiting set entropy of (1/2) log2 φ ≈ 0.35 per element. The relative frequencies of m-grams that appear (other than 0000…) are always of the form . The following lists for each m the number of m-grams that appear at given multiplicities (as obtained from Flatten[DeBruijnSequence[{{0,0},{1,1,0,1}},m]]):

&#10005

(This implies a “p log p” measure entropy of below 0.1.)
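
A direct way to get such counts (a sketch using the same De Bruijn construction mentioned above):

(* count distinct m-grams in cyclic concatenations of the blocks 00 and 1101 *)
blocks = {{0, 0}, {1, 1, 0, 1}};
Table[Length[Union[Partition[Flatten[DeBruijnSequence[blocks, m]], m, 1, 1]]], {m, 1, 9}]
(* should reproduce the counts 2, 4, 8, 12, 15, 20, 25, 33, 41 quoted above *)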

So what happens in actual tag system sequences? Once clear of the initial conditions, they seem to quite accurately follow these probabilistic (“mean-field theory”) estimates, though with various fluctuations. In general, the results are quite different from a pure ordinary random walk with every element independent, but in agreement with the estimates for a “00, 1101 random walk”.

Another difference from an ordinary random walk is that our walks end whenever they reach a cycle—and we saw above that there are an infinite number of cycles, of progressively greater sizes. But the density of such “trap” states is small: among all size-n strings, only a small fraction lie on cycles.

The standard theory of random walks says, however, that in the limit of infinitely large strings and long walks, if there is indeed a random process underneath, these things will not matter: we’ll have something that is in the same universality class as the ordinary ±1 random walk, with the same large-scale statistical properties.

But what about our tag systems that survive billions of steps before hitting 0? Could genuine random walks plausibly survive that long? The standard theory of first passage times (or “stopping times”) tells us that the probability for a random walk starting at 0 to first reach x (or, equivalently, for a walk starting at x to reach 0) at time t is:

P(t) = (x exp(-(x^2/(2 t))))/Sqrt
&#10005
P(t) = (x exp(-(x^2/(2 t))))/Sqrt[2 \[Pi] t^3]

This shows the probability of starting from x and first reaching 0 as a function of the number of steps:

Off
&#10005
Off[General::munfl]; Plot[

 Evaluate[Table[
   If[x < 4, Callout, #1 &][(E^(-(x^2/(2 t))) x)/(
    Sqrt[2 \[Pi]] Sqrt[t^3]), x], {x, 5}]], {t, 0, 1000}, 
 ScalingFunctions -> {"Log", "Log"}, AspectRatio -> 1/3, 
 Frame -> True, Axes -> False]

The most likely stopping time is t = x^2/3, but there is a long tail, and the probability of surviving for a time longer than t is:

erf(x/Sqrt
&#10005
erf(x/Sqrt[2 t]) \[TildeTilde] Sqrt[2/(\[Pi] t)] x
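
As a quick symbolic check of the mode and the large-t tail of this first-passage distribution:

(* the density p(t) peaks at t = x^2/3 ... *)
p[t_, x_] := (x Exp[-x^2/(2 t)])/Sqrt[2 Pi t^3];
Solve[D[p[t, x], t] == 0, t]  (* -> t -> x^2/3 *)
(* ... and for large t the survival probability behaves like Sqrt[2/Pi] x/Sqrt[t] *)
Limit[Sqrt[t] Erf[x/Sqrt[2 t]], t -> Infinity, Assumptions -> x > 0]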

How does this potentially apply to our systems? Assume we start from a string of (compressed) length n. This implies that the probability to survive for t steps (before “reaching x = 0”) is about Sqrt[2/(Pi t)] n. But there are 3 × 2^n possible compressed strings of length n. So we can roughly estimate that one of them might survive for about 18 n^2 4^n/Pi steps, or at least a number of steps that increases roughly exponentially with n.
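
Under that rough counting (assuming 3 × 2^n compressed strings of length n, each behaving like an independent walk started at height n), a minimal sketch of the estimate is:

(* set (number of strings) x (single-walk survival probability) equal to 1 and solve for t:
   3 2^n Sqrt[2/(Pi t)] n == 1  gives  t == 18 n^2 4^n/Pi *)
longestSurvivalEstimate[n_] := 18 n^2 4^n/Pi;
N[longestSurvivalEstimate /@ {9, 15, 20, 25}]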

And our results for “longest-so-far winners” above do in fact show roughly exponential increase with n (the dotted line is 4^(0.75 n)):

Show
&#10005
Show[ListPlot[{{4, 419}, {6, 2141}, {9, 24552}, {12, 253456}, {13, 
    341992}, {15, 20858069}, {15, 357007576}, {20, 2586944104}, {22, 
    2910925472}, {24, 50048859310}, {25, 202880696061}}, 
  ScalingFunctions -> "Log", Frame -> True], 
 Plot[4^(.75 n), {n, 1, 25}, ScalingFunctions -> "Log", 
  PlotStyle -> Directive[LightGray, Dotted]]]

We can do a more detailed comparison with random walks by looking at the complete distribution of halting (AKA stopping) times for tag systems. Here are the results for all initial strings with n = 15 and n = 25:

GraphicsRow
&#10005

Plotting these on a log scale we get

GraphicsRow
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						GraphicsRow[
 ListStepPlot[Transpose[{Most[#1], #2} & @@ #2], Frame -> True, 
    Filling -> Axis, ScalingFunctions -> "Log", 
    PlotRange -> {{1, #1}, Automatic}, PlotStyle -> Hue[0.07, 1, 1], 
    FrameTicks -> {{None, 
       None}, {Thread[{Range[2, 10, 
          2], {"\!\(\*SuperscriptBox[\(10\), \(2\)]\)", 
          "\!\(\*SuperscriptBox[\(10\), \(4\)]\)", 
          "\!\(\*SuperscriptBox[\(10\), \(6\)]\)", 
          "\!\(\*SuperscriptBox[\(10\), \(8\)]\)",  
          "\!\(\*SuperscriptBox[\(10\), \(10\)]\)"}}], 
       None}}] & @@@ {{5, 
    hist15}, {9, {{
     9/10, 23/25, 47/50, 24/25, 49/50, 1, 51/50, 26/25, 53/50, 27/25, 
      11/10, 28/25, 57/50, 29/25, 59/50, 6/5, 61/50, 31/25, 63/50, 
      32/25, 13/10, 33/25, 67/50, 34/25, 69/50, 7/5, 71/50, 36/25, 
      73/50, 37/25, 3/2, 38/25, 77/50, 39/25, 79/50, 8/5, 81/50, 
      41/25, 83/50, 42/25, 17/10, 43/25, 87/50, 44/25, 89/50, 9/5, 
      91/50, 46/25, 93/50, 47/25, 19/10, 48/25, 97/50, 49/25, 99/50, 
      2, 101/50, 51/25, 103/50, 52/25, 21/10, 53/25, 107/50, 54/25, 
      109/50, 11/5, 111/50, 56/25, 113/50, 57/25, 23/10, 58/25, 
      117/50, 59/25, 119/50, 12/5, 121/50, 61/25, 123/50, 62/25, 5/2, 
      63/25, 127/50, 64/25, 129/50, 13/5, 131/50, 66/25, 133/50, 
      67/25, 27/10, 68/25, 137/50, 69/25, 139/50, 14/5, 141/50, 71/25,
       143/50, 72/25, 29/10, 73/25, 147/50, 74/25, 149/50, 3, 151/50, 
      76/25, 153/50, 77/25, 31/10, 78/25, 157/50, 79/25, 159/50, 16/5,
       161/50, 81/25, 163/50, 82/25, 33/10, 83/25, 167/50, 84/25, 
      169/50, 17/5, 171/50, 86/25, 173/50, 87/25, 7/2, 88/25, 177/50, 
      89/25, 179/50, 18/5, 181/50, 91/25, 183/50, 92/25, 37/10, 93/25,
       187/50, 94/25, 189/50, 19/5, 191/50, 96/25, 193/50, 97/25, 
      39/10, 98/25, 197/50, 99/25, 199/50, 4, 201/50, 101/25, 203/50, 
      102/25, 41/10, 103/25, 207/50, 104/25, 209/50, 21/5, 211/50, 
      106/25, 213/50, 107/25, 43/10, 108/25, 217/50, 109/25, 219/50, 
      22/5, 221/50, 111/25, 223/50, 112/25, 9/2, 113/25, 227/50, 
      114/25, 229/50, 23/5, 231/50, 116/25, 233/50, 117/25, 47/10, 
      118/25, 237/50, 119/25, 239/50, 24/5, 241/50, 121/25, 243/50, 
      122/25, 49/10, 123/25, 247/50, 124/25, 249/50, 5, 251/50, 
      126/25, 253/50, 127/25, 51/10, 128/25, 257/50, 129/25, 259/50, 
      26/5, 261/50, 131/25, 263/50, 132/25, 53/10, 133/25, 267/50, 
      134/25, 269/50, 27/5, 271/50, 136/25, 273/50, 137/25, 11/2, 
      138/25, 277/50, 139/25, 279/50, 28/5, 281/50, 141/25, 283/50, 
      142/25, 57/10, 143/25, 287/50, 144/25, 289/50, 29/5, 291/50, 
      146/25, 293/50, 147/25, 59/10, 148/25, 297/50, 149/25, 299/50, 
      6, 301/50, 151/25, 303/50, 152/25, 61/10, 153/25, 307/50, 
      154/25, 309/50, 31/5, 311/50, 156/25, 313/50, 157/25, 63/10, 
      158/25, 317/50, 159/25, 319/50, 32/5, 321/50, 161/25, 323/50, 
      162/25, 13/2, 163/25, 327/50, 164/25, 329/50, 33/5, 331/50, 
      166/25, 333/50, 167/25, 67/10, 168/25, 337/50, 169/25, 339/50, 
      34/5, 341/50, 171/25, 343/50, 172/25, 69/10, 173/25, 347/50, 
      174/25, 349/50, 7, 351/50, 176/25, 353/50, 177/25, 71/10, 
      178/25, 357/50, 179/25, 359/50, 36/5, 361/50, 181/25, 363/50, 
      182/25, 73/10, 183/25, 367/50, 184/25, 369/50, 37/5, 371/50, 
      186/25, 373/50, 187/25, 15/2, 188/25, 377/50, 189/25, 379/50, 
      38/5, 381/50, 191/25, 383/50, 192/25, 77/10, 193/25, 387/50, 
      194/25, 389/50, 39/5, 391/50, 196/25, 393/50, 197/25, 79/10, 
      198/25, 397/50, 199/25, 399/50, 8, 401/50, 201/25, 403/50, 
      202/25, 81/10, 203/25, 407/50, 204/25, 409/50, 41/5, 411/50, 
      206/25, 413/50, 207/25, 83/10, 208/25, 417/50, 209/25, 419/50, 
      42/5, 421/50, 211/25, 423/50, 212/25, 17/2, 213/25, 427/50, 
      214/25, 429/50, 43/5, 431/50, 216/25, 433/50, 217/25, 87/10, 
      218/25, 437/50, 219/25, 439/50, 44/5, 441/50, 221/25, 443/50, 
      222/25, 89/10, 223/25, 447/50, 224/25, 449/50, 9, 451/50, 
      226/25, 453/50, 227/25, 91/10, 228/25, 457/50, 229/25, 459/50, 
      46/5, 461/50, 231/25, 463/50, 232/25, 93/10, 233/25, 467/50, 
      234/25, 469/50, 47/5, 471/50, 236/25, 473/50, 237/25, 19/2, 
      238/25, 477/50, 239/25, 479/50, 48/5, 481/50, 241/25, 483/50, 
      242/25, 97/10, 243/25, 487/50, 244/25, 489/50, 49/5, 491/50, 
      246/25, 493/50, 247/25, 99/10, 248/25, 497/50, 249/25, 499/50, 
      10, 501/50, 251/25, 503/50, 252/25, 101/10, 253/25, 507/50, 
      254/25, 509/50, 51/5, 511/50, 256/25, 513/50, 257/25, 103/10, 
      258/25, 517/50, 259/25, 519/50, 52/5, 521/50, 261/25, 523/50, 
      262/25, 21/2, 263/25, 527/50, 264/25, 529/50, 53/5, 531/50, 
      266/25, 533/50, 267/25, 107/10, 268/25, 537/50, 269/25, 539/50, 
      54/5, 541/50, 271/25, 543/50, 272/25, 109/10, 273/25, 547/50, 
      274/25, 549/50, 11, 551/50, 276/25, 553/50, 277/25, 111/10, 
      278/25, 557/50, 279/25, 559/50, 56/5, 561/50, 281/25, 563/50, 
      282/25, 113/10, 283/25}, CompressedData["
1:eJy1lA9M1VUUxz+/3+893ns8eLwHjz8PQ0IQSERE/mhRSZIbkoBCjAmYoEli
yr/+MCij1Yp0hlRzmYvVzCZOoiFDUsM2/2xRNDbX2qxgZRTFmkZJw0zWeTwI
yIJq82xn53vPPfecc8+554ZsLMsqVQCTCksN/G8KD5jdZlPu3+s/bbpRV7Nv
+vq7wUmcskf5E9fnq3iILN2v8YHcYYM71C3TcT4WzHfpGXpE5JNu3J1g4K2t
CtVXDHS+ZuRCiEpvi4ltI+5cjDRT0KFhHzUzt8qT/Lct7Kjwou6YFa3Cxqmi
AD5aaGVJhJWj7RayVD9GVlloeMyH5DPehP/swaNtPszts+OGnUPXfLi804Yj
w0ZmsC/2Z228UOLLyVgbxa/7M/iSP4nzAjjSEkBJYhBDvg5sNYFkmQI4/oQf
jjw7gf2+LG21US74jbAQIrIdRHTYKWidQ/AhB8832Sh8z8b2TE+CVptxHDST
0+SBbchMaqMHC1oE/24i189A9eNGmqwGFg8YCb9sZt8KMzsq3Xg63sCZAQc1
70Sxck4woSkOBmK9eE4z0RdgJKhfx6VXNKqbVCri3DjRYySpS8fhKCN8pmev
WePooMbnpSpfr9BIeEoj6rRGfafGlVA9frUqKQMKZbUKqWtViqIVfjQoeI/K
8SAdbiUW3vzWnTWvqjTnKVgrTVikn/pl4i9cYXWoQrf07NY6hej3pbdtCpYN
0H1WZbgTrq9SuTRfY1GmRu1v0HhSeuylYItT+GodbI2BvoMKOYIr71RJ74Fr
aQodD8OvF+D0ehjNgOYcUOMhOx+WLwEfPzgufg4sgC+3Qbmc/zBRoTZP7OdD
tEyMPhUiz8F2k4LPYYWuEDhxj9gHwSJ5g01V4icJPjZDifi86gZpVvhlWEdB
FOR6wu5I2Fyl0L5Q8rLLDIbBQ1U6NkncbxYrxEgMPy/IkLyPXdXxiZwZkae/
fjOs84Y7TJAsrUgrBIfE2r8Rzop9sSbrPdAgMRPErqQXvpcZGZJ1lgVWip9z
kmOgc67k/oE6KBS/2emwU2aoWfSSFveLbJW5bhdcL7hyDfSL75dvYWzmtoht
ieydF5wjuXzh/Adk3SicKnF2yf5u4SQ99Eq8taIXNQekJvLExmie6EKdMYXf
lZxvFxkn+lPCRcIXpSaFkl+5YIHcJxwzPvvXbXBEzncJ/kFkmOTlL7YvCt4r
OgnLxE+hjeOJdYOwhGKX8E+yuUVkj861myU88RUmC0ubaBd+RjjR3aV3Wjrz
DFNccYqd9V3u8ukkueJYjZx2Vib1U76xaeSsg/uEYyZjTOTuJP9xee/4nq/w
bX/xUzYu5QlhnKJ/8B/iOik+ehIbp6dwAw1rrjw8p+geUFz3ktKTPsPhNFw9
jJ3B/1TSjUv9FF2bxyTuVpmVZrrLfyVtdpObTv+2djeDnLX8A7Tp63c=
"]}}}]

showing at least a rough approximation to the behavior expected for a random walk.

In making distributions like these, we’re putting together all the initial strings of length n, and asking about the statistical properties of this ensemble. But we can also imagine seeing whether initial strings with particular properties consistently behave differently from others. This shows the distribution of halting times as a function of the number of 1s in the initial string; no strong correlations are seen (here for n = 20), even though at least at the beginning the presence of 1s leads to growth:

data20 = Import
&#10005

Analogies & Expectations

How should we think about what we’re seeing? To me it in many ways just seems a typical manifestation of the ubiquitous phenomenon of computational irreducibility. Plenty of systems show what seems like random walk behavior. Even in rule 30, for example, the dividing line between regularity and randomness appears to follow a (biased) random walk:

data = CloudGet
&#10005

ListStepPlot
&#10005

If we changed the initial conditions, we’d get a different random walk. But in all cases, we can think of the evolution of rule 30 as intrinsically generating apparent randomness, “seeded” by its initial conditions.

Even more directly analogous to our tag system are cellular automata whose boundaries show apparent randomness. An example is the k = 2, r = 3/2 rule 7076:

ArrayPlot
&#10005
ArrayPlot[CellularAutomaton[{7076, 2, 3/2}, {{1}, 0}, #], 
   ImageSize -> 300] & /@ {100, 400}
LHEdge
&#10005

Will this pattern go on growing forever, or will it eventually become very narrow, and either enter a cycle or terminate entirely? This is analogous to asking whether our tag system will halt.

There are other cellular automata that show even more obvious examples of these kinds of questions. Consider the k = 3, r = 1 totalistic code 1329 cellular automaton. Here is its behavior for a sequence of simple initial conditions. In some cases the pattern dies out (“it halts”); in some cases it evolves to a (rather elaborate) period-78 cycle. And in one case here it evolves to a period-7 cycle:

GraphicsRow
&#10005
GraphicsRow[
 Table[ArrayPlot[
   CellularAutomaton[{1329, {3, 1}, 1}, {IntegerDigits[i, 3], 
     0}, {220, {-13, 13}}], 
   ColorRules -> {0 -> White, 1 -> Hue[.03, .9, 1], 
     2 -> Hue[.6, .9, .7]}], {i, 1, 64, 3}]]

But is this basically all that can happen? No. Here are the various persistent structures that occur with the first 10,000 initial conditions—and we see that in addition to getting ordinary “cycles”, we also get “shift cycles”:

Row
&#10005
Row[ArrayPlot[CellularAutomaton[{1329, {3, 1}, 1}, {#Cells, 0}, 200], 
    ImageSize -> {Automatic, 250}, 
    ColorRules -> {0 -> White, 1 -> Hue[.03, .9, 1], 
      2 -> Hue[.6, .9, .7]}] & /@ 
  Normal[Take[ResourceData["728d1c07-8892-4673-bab3-d889cc6c4623"], 
    7]], Spacer[7]]

But if we go a little further, there’s another surprise: initial condition 54,889 leads to a structure that just keeps growing forever—while initial condition 97,439 also does this, but in a much more trivial way:

GraphicsRow
&#10005
GraphicsRow[
 ArrayPlot[
    CellularAutomaton[{1329, {3, 1}, 1}, {IntegerDigits[#, 3], 0}, 
     1000], ColorRules -> {0 -> White, 1 -> Hue[.03, .9, 1], 
      2 -> Hue[.6, .9, .7]}, 
    ImageSize -> {Automatic, 400}] & /@ {54889, 97439}]

In our tag system, the analog of these might be particular strings that produce patterns that “obviously grow forever”.

One might think that there could be a fundamental difference between a cellular automaton and a tag system. In a cellular automaton the rules operate in parallel, in effect connecting a whole grid of neighboring cells, while in a tag system the rules only specifically operate on the very beginning and end of each string.

But to see a closer analogy we can consider every update in the tag system as an “event”, then draw a causal graph that shows the relationships between these events. Here is a simple case:

With
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						With[{evol = TSDirectEvolveList[IntegerDigits[18464, 2, 15], 25]}, 
 Show[ArrayPlot[PadRight[evol, Automatic, .1], Mesh -> True, 
   Frame -> False, MeshStyle -> GrayLevel[0.9, 0.9], 
   ColorRules -> {0 -> White, 1 -> GrayLevel[.5]}], 
  Graphics[{Hue[0, 1, 0.56], Opacity[0.2], 
    Rectangle[{0, 0}, {3, Length[evol]}]}], 
  MapIndexed[Graphics[{FaceForm[Opacity[0]], EdgeForm[
Hue[0.11, 1, 0.97]], 
      Rectangle[{0, First[Length[evol] - #2 + 1]}, {1, 
        First[Length[evol] - #2]}]}] &, evol],
  Rest[MapIndexed[
    Graphics[{FaceForm[Opacity[0]], 
       EdgeForm[Directive[Thick, Hue[0.11, 1, 0.97]]], 
       Rectangle[{If[Quiet[First[First[evol[[#2 - 1]]]] == 0], 
          Length[#1] - 2, Length[#1] - 4], 
         First[Length[evol] - #2 + 1]}, {Length[#1], 
         First[Length[evol] - #2]}]}] &, evol]],
  MapIndexed[
   Graphics[{Hue[0, 1, 0.56], Thick, Arrowheads[Small], 
      Arrow[BezierCurve[{{0, First[Length[evol] + 0.5 - #2]}, {-1, 
          First[Length[evol] - #2]}, {0, 
          First[Length[evol] - 0.5 - #2]}}]]}] &, Most[evol]],
  Module[{quo, rem, src},
   {quo, rem} = 
    Transpose[QuotientRemainder[(Length[#] - 3), 3] & /@ evol];
   MapIndexed[
    If[First[#1] === 1, 
      Switch[First[rem[[#2]]], 0, 
       Graphics[{Hue[0, 1, 0.56], Thick, Arrowheads[Small], 
         If[First[Length[evol] - (#2 + 1) + .5 - quo[[#2]]] > 0, 
          Arrow[{{Length[#1] - 3, 
             First[Length[evol] - (#2 + 1) + 0.5]}, {1, 
             First[Length[evol] - (#2 + 1) + 0.5 - quo[[#2]]]}}], 
          Nothing], 
         If[First[Length[evol] - (#2 + 1) - 0.5 - quo[[#2]]] > 0, 
          Arrow[{{Length[#1] - 3, 
             First[Length[evol] - (#2 + 1) + 0.5]}, {1, 
             First[Length[evol] - (#2 + 1) - 0.5 - quo[[#2]]]}}], 
          Nothing]}], 1 | 2, 
       Graphics[{Hue[0, 1, 0.56], Thick, Arrowheads[Small], 
         If[First[Length[evol] - (#2 + 1) - 0.5 - quo[[#2]]] > 0, 
          Arrow[{{Length[#1] - 3, 
             First[Length[evol] - (#2 + 1) + 0.5]}, {1, 
             First[Length[evol] - (#2 + 1) - 0.5 - quo[[#2]]]}}], 
          Nothing]}]], 
      Switch[First[rem[[#2]]], 0, 
       If[First[Length[evol] - (#2 + 1) - 0.5 - quo[[#2]]] > 0, 
        Graphics[{Hue[0, 1, 0.56], Thick, Arrowheads[Small], 
          Arrow[{{Length[#1] - 3, 
             First[Length[evol] - (#2 + 1) + 0.5]}, {1, 
             First[Length[evol] - (#2 + 1) + 0.5 - quo[[#2]]]}}]}], 
        Nothing], 1, Nothing, 2, 
       Graphics[{Hue[0, 1, 0.56], Thick, Arrowheads[Small], 
         If[First[Length[evol] - (#2 + 1) - 0.5 - quo[[#2]]] > 0, 
          Arrow[{{Length[#1] - 3, 
             First[Length[evol] - (#2 + 1) + 0.5]}, {1, 
             First[Length[evol] - (#2 + 1) - 0.5 - quo[[#2]]]}}], 
          Nothing]}]]] &, evol]], 
  Drop[MapIndexed[
    Graphics[{Hue[0, 1, 0.56], Thick, Arrowheads[Small], 
       Arrow[{{1, 
          First[Length[evol] + 0.5 - #2]}, {If[
           Quiet[First[evol[[#2 - 1]]] == 0], Length[#1] - 1, 
           Length[#1] - 3], First[Length[evol] - 0.5 - #2]}}]}] &, 
    Most[evol]], -1]
  ]]

Extracting the pure causal graph we get:

Graph
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						Graph[TagSystemCausalGraph @ With[{
        system = PostTagSystem[{0, {1, 1, 0, 1, 0}}]},
     system["State", #] & /@ Range[system["StateCount"]]
   ], AspectRatio -> 1.4]

For the string 4:14:0 which takes 419 steps to terminate, the causal graph is:

ggg = TagSystemCausalGraph @ With
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ggg = TagSystemCausalGraph @ 
  With[{ system = PostTagSystem[{0, {1, 1, 1, 0}}]}, 
   system["State", #] & /@ Range[system["StateCount"]]]

Or laid out differently, and marking expansion (11101) and contraction (000) events with red and blue:

Graph
&#10005

Here is the causal graph for the 2141-step evolution of 6:58:0

ggg = TagSystemCausalGraph @ With
&#10005

and what is notable is that despite the “spatial localization” of the underlying operation of the tag system, the causal graph in effect connects events in something closer to a uniform mesh.

Connecting to Number Theory

When Emil Post was first studying tag systems a hundred years ago he saw them as the last hurdle in finding a systematic way to “solve all of mathematics”, and in particular to solve all problems in number theory. Of course, they turned out to be a very big hurdle. But having now seen how complex tag systems can be, it’s interesting to go back and connect again with number theory.

It’s straightforward to convert a tag system into something more obviously number theoretical. For example, if one represents each string of length n by a pair of integers {n,i} in which the binary digits of i give the elements of the string, then each step in the evolution can be obtained from:

TagStep
&#10005
TagStep[{n_, i_}] :=
 
 With[{j = 2^(n - 1) FractionalPart[(8 i)/2^n]}, 
  If[i < 2^(n - 1), {n - 1, j}, {n + 1, 4 j + 13}]]

Starting from the 4:14:0 initial condition (here represented in uncompressed form by {12, 2336}) the first few steps are then:

NestList
&#10005
NestList[TagStep, {12, 2336}, 10]

For compressed strings, the corresponding form is:

TagStep
&#10005
TagStep[{n_, i_, p_}] := 
 With[{j = 2^n FractionalPart[i/2^(n - 1)]}, 
  If[i < 2^(
     n - 1), {{n, j, 2}, {n - 1, j/2, 0}, {n, j, 1}}, {{n + 1, 
      2 j + 3, 1}, {n, j, 2}, {n, j + 1, 0}}][[p + 1]]]

There are different number theoretical formulations one can imagine, but a core feature is that at each step the tag system is making a choice between two arithmetic forms, based on some essentially arithmetic property of the number obtained so far. (Note that the type of condition we have given here can be further “compiled” into “pure arithmetic” by extracting it as a solution to a Diophantine equation.)

A widely studied system similar to this is the Collatz or 3n + 1 problem, which generates successive integers by applying the function:

n |-> If
&#10005
n |-> If[EvenQ[n], n/2, 3 n + 1]

Starting, say, from 27, the sequence of numbers obtained is 27, 82, 41, 124, 62, 31, …

ListStepPlot
&#10005
ListStepPlot[
 NestList[n |-> If[EvenQ[n], n/2, 3 n + 1], 27, 120], Center, 
 Frame -> True, AspectRatio -> 1/3, Filling -> Axis, 
 PlotRange -> All, PlotStyle -> Hue[0.07, 1, 1]]

where after 110 steps the system reaches the cycle 4, 2, 1, 4, 2, 1, …. As a closer analog to the plots for tag systems that we made above we can instead plot the lengths of the successive integers, represented in base 2:

ListStepPlot
&#10005
ListStepPlot[
 IntegerLength[#, 2] & /@ 
  NestList[n |-> If[EvenQ[n], n/2, 3 n + 1], 27, 130], Center, 
 Frame -> True, AspectRatio -> 1/3, Filling -> Axis, 
 PlotRange -> All, PlotStyle -> Hue[0.07, 1, 1]]

The state transition graph starting from integers up to 10 is

With
&#10005
With[{g = NestGraph[n |-> If[EvenQ[n], n/2, 3 n + 1], Range[10], 50]},
  HighlightGraph[
  Graph[g, VertexLabels -> Automatic, 
   VertexStyle -> Hue[0.58, 0.65, 1], 
   EdgeStyle -> Hue[0.58, 1, 0.7000000000000001]], {Style[
    Subgraph[g, FindCycle[g, {1, Infinity}, All]], Thick, Hue[
    0.02, 0.92, 0.8200000000000001]], 
   Pick[VertexList[g], VertexOutDegree[g], 0]}]]

and up to 1000 it is:

With
&#10005
With[{g = 
   NestGraph[n |-> If[EvenQ[n], n/2, 3 n + 1], Range[1000], 10000, 
    VertexStyle -> Hue[0.58, 0.65, 1], 
    EdgeStyle -> Hue[0.58, 1, 0.7000000000000001]]}, 
 HighlightGraph[
  g, {Style[Subgraph[g, FindCycle[g, {1, Infinity}, All]], 
    Thickness[.01], Hue[0.02, 0.92, 0.8200000000000001]], 
   Pick[VertexList[g], VertexOutDegree[g], 0]}]]

Unlike for Post’s tag system, there is only one connected component (and one final cycle), and the “highways” are much shorter. For example, among the first billion initial conditions, the longest transient is just 986 steps. It occurs for the initial integer 670617279—which yields the following sequence of integer lengths:

ListStepPlot
&#10005
ListStepPlot[
 IntegerLength[#, 2] & /@ 
  NestList[n |-> If[EvenQ[n], n/2, 3 n + 1], 670617279, 1100], Center,
  Frame -> True, AspectRatio -> 1/3, Filling -> Axis, 
 PlotRange -> All, PlotStyle -> Hue[0.07, 1, 1]]

Despite a fair amount of investigation since the 1930s, it’s still not known whether the 3n + 1 problem always terminates on its standard cycle—though this is known to be the case for all integers up to .

For Post’s tag system the most obvious probabilistic estimate suggests that the sequence of string lengths should follow an unbiased random walk. For the 3n + 1 problem, a similar analysis suggests a random walk with an average bias of about –0.14 binary digits per step, as illustrated by this collection of walks from initial conditions 10^8 + k:

ListStepPlot
&#10005
ListStepPlot[
 Table[IntegerLength[#, 2] & /@ 
   NestList[n |-> If[EvenQ[n], n/2, 3 n + 1], 10^8 + i, 200], {i, 0, 
   40}], Center, Frame -> True, AspectRatio -> 1/3, PlotRange -> All]
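
The –0.14 figure (and the +0.11 for the 5n + 1 rule discussed next) comes from a simple mean-field estimate: since k n + 1 is even for odd n, a multiplication step is always followed by a halving step, so about 1/3 of steps multiply and 2/3 halve. A minimal sketch, with bitsPerStep a hypothetical helper:

(* average change in binary length per step for n -> If[EvenQ[n], n/2, k n + 1]:
   about 1/3 of steps add Log2[k] bits, about 2/3 remove 1 bit *)
bitsPerStep[k_] := (1/3) Log2[k] - 2/3;
N[bitsPerStep /@ {3, 5}]
(* -> roughly -0.14 bits/step for 3n + 1 and +0.11 bits/step for 5n + 1 *)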

The rule (discussed in A New Kind of Science)

n |-> If
&#10005
n |-> If[EvenQ[n], n/2, 5 n + 1]

instead implies a bias of +0.11 digits per step, and indeed most initial conditions lead to growth:

Function
&#10005
Function[{i}, 
  ListStepPlot[
   IntegerLength[#, 2] & /@ 
    NestList[n |-> If[EvenQ[n], n/2, 5 n + 1], i, 200], Center, 
   Frame -> True, AspectRatio -> 1/3, Filling -> Axis, 
   PlotRange -> All, Epilog -> Inset[i, Scaled[{.1, .8}]], 
   PlotStyle -> Hue[0.07, 1, 1]]] /@ {7, 37}

But there are still some that—even though they grow for a while—have “fluctuations” that cause them to “crash” and end up in cycles:

Function
&#10005
Function[{i}, 
  ListStepPlot[
   IntegerLength[#, 2] & /@ 
    NestList[n |-> If[EvenQ[n], n/2, 5 n + 1], i, 100], Center, 
   Frame -> True, AspectRatio -> .45, Filling -> Axis, 
   PlotRange -> All, Epilog -> Inset[i, Scaled[{.9, .8}]], 
   PlotStyle -> Hue[0.07, 1, 1]]] /@ {181, 613, 9818}

What is the “most unbiased” a n + b system? If we consider mod 3 instead of mod 2, we have systems like:

n |-> \!\(\*SubscriptBox
&#10005
n |-> 
\!\(\*SubscriptBox[\({n, 
\*SubscriptBox[\(a\), \(1\)] n + 
\*SubscriptBox[\(b\), \(1\)], 
\*SubscriptBox[\(a\), \(2\)] n + 
\*SubscriptBox[\(b\), \(2\)]}\), \(\([\)\(\([\)\(Mod[n, 3] + 
    1\)\(]\)\)\(]\)\)]\)/3

We need a_i n + b_i to be divisible by 3 when n = i mod 3. In our approximation, the bias is closest to zero (with value +0.05) when the a_i are 4 and 7. An example of a possible iteration is then:

n |-><br />
\!\(\*SubscriptBox
&#10005
n |-> 
\!\(\*SubscriptBox[\({n, 4  n + 2, 
    7  n + 1}\), \(\([\)\(\([\)\(Mod[n, 3] + 1\)\(]\)\)\(]\)\)]\)/3

Starting from a sequence of initial conditions this clearly shows less bias than the 3n + 1 case:

ListStepPlot
&#10005
ListStepPlot[Table[IntegerLength[#, 2] & /@ NestList[n |-> 
\!\(\*SubscriptBox[\({n, 4  n + 2, 
        7  n + 1}\), \(\([\)\(\([\)\(Mod[n, 3] + 1\)\(]\)\)\(]\)\)]\)/
      3, 10^8 + i, 100], {i, 0, 40}], Center, Frame -> True, 
 AspectRatio -> 1/3, PlotRange -> All]

Here are the halting times for initial conditions up to 1000:

ListStepPlot
&#10005
ListStepPlot[
 Transpose[
  ParallelTable[Length /@ FindTransientRepeat[NestList[n |-> 
\!\(\*SubscriptBox[\({n, 4  n + 2, 
          7  n + 1}\), \(\([\)\(\([\)\(Mod[n, 3] + 
          1\)\(]\)\)\(]\)\)]\)/3, i, 5000], 3], {i, 1000}]], Center, 
 PlotRange -> {0, 4000}, PlotLayout -> "Stacked", Joined -> True, 
 Filling -> Automatic, Frame -> True, AspectRatio -> 1/4, 
 PlotStyle -> Hue[0.1, 1, 1]]

Most initial conditions quickly evolve to cycles of length 5 or 20. But initial condition 101 takes 2604 steps to reach the 20-cycle:

Function
&#10005
Function[{i}, ListStepPlot[IntegerLength[#, 2] & /@ NestList[n |-> 
\!\(\*SubscriptBox[\({n, 4  n + 2, 
         7  n + 1}\), \(\([\)\(\([\)\(Mod[n, 3] + 
         1\)\(]\)\)\(]\)\)]\)/3, i, 3000], Center, Frame -> True, 
   AspectRatio -> 1/3, Filling -> Axis, PlotRange -> All, 
   Epilog -> Inset[i, Scaled[{.06, .9}]], 
   PlotStyle -> Hue[0.07, 1, 1]]] /@ {101, 469}

And initial condition 469 does not appear to reach a cycle at all—and instead appears to systematically grow at about 0.018 bits per step:

ListStepPlot
&#10005
ListStepPlot[
 MapIndexed[{1 + (First[#2] - 1)*1000, #} &, (IntegerLength[#, 2] & /@
     NestList[Nest[n |-> 
\!\(\*SubscriptBox[\({n, 4  n + 2, 
           7  n + 1}\), \(\([\)\(\([\)\(Mod[n, 3] + 
           1\)\(]\)\)\(]\)\)]\)/3, #, 1000] &, 469, 1000])], Center, 
 Frame -> True, AspectRatio -> 1/3, Filling -> Axis, 
 PlotRange -> All, PlotStyle -> Hue[0.07, 1, 1]]

In other words, unlike the 3n + 1 problem—or our tag system—this iteration usually leads to a cycle, but just sometimes appears to “escape” and continue to increase, presumably forever.

(In general, for modulus m, the minimum bias will typically be small, and the “smoothest” iterations will be ones whose multipliers involve similar-sized factors of numbers close to m^m. For m = 4, for example, {n, 3n – 3, 5n – 2, 17n + 1} is the best.)
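
As a small check of this m = 4 example (note 3·5·17 = 255, close to 4^4 = 256), here’s a sketch verifying that the iteration always yields integers (step4 is a hypothetical helper):

(* the m = 4 iteration: an integer for every n, by construction of the offsets *)
step4[n_] := {n, 3 n - 3, 5 n - 2, 17 n + 1}[[Mod[n, 4] + 1]]/4;
And @@ Table[IntegerQ[step4[n]], {n, 1, 10^4}]
(* -> True *)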

One might wonder how similar our tag system—or the 3n + 1 problem—is to classic unsolved problems in number theory, like the Riemann Hypothesis. In essence the Riemann Hypothesis is an assertion about the statistical randomness of primes, normally stated in terms of complex zeroes of the Riemann zeta function, or equivalently, that all the maxima of RiemannSiegelZ[t] (for any value of t) lie above the axis:

Plot
&#10005
Plot[RiemannSiegelZ[t], {t, 0, 400}, Frame -> True, 
 AspectRatio -> 1/6, PlotPoints -> 500, PlotStyle -> Hue[0.07, 1, 1]]

But it’s known (thanks to extensive work by Yuri Matiyasevich) that an equivalent—much more obviously integer-related—statement is that

(2 n + 3)!!/15 - ((2 n - 2)!! PrimePi
&#10005
(2 n + 3)!!/15 - ((2 n - 2)!! PrimePi[n]^2)/
  2 ((BitLength[Fold[LCM, Range[n]]] - 1) \!\(
\*UnderoverscriptBox[\(\[Sum]\), \(k\), \(n - 1\)]
\*FractionBox[
SuperscriptBox[\((\(-1\))\), \(k + 1\)], \(k\)]\) - 2 n)

is positive for all positive n. And this then turns out to be equivalent to the surprisingly simple statement that the iteration

NestWhile
&#10005
NestWhile[x |-> {
\!\(\*SubscriptBox[\(x\), \(\(\[LeftDoubleBracket]\)\(1\)\(\
\[RightDoubleBracket]\)\)]\) + 1, If[GCD[
\!\(\*SubscriptBox[\(x\), \(\(\[LeftDoubleBracket]\)\(1\)\(\
\[RightDoubleBracket]\)\)]\) + 1, 
\!\(\*SubscriptBox[\(x\), \(\(\[LeftDoubleBracket]\)\(3\)\(\
\[RightDoubleBracket]\)\)]\)] == 1, 
\!\(\*SubscriptBox[\(x\), \(\(\[LeftDoubleBracket]\)\(2\)\(\
\[RightDoubleBracket]\)\)]\) + 1, 
\!\(\*SubscriptBox[\(x\), \(\(\[LeftDoubleBracket]\)\(2\)\(\
\[RightDoubleBracket]\)\)]\)], (
\!\(\*SubscriptBox[\(x\), \(\(\[LeftDoubleBracket]\)\(1\)\(\
\[RightDoubleBracket]\)\)]\) + 1) Quotient[
\!\(\*SubscriptBox[\(x\), \(\(\[LeftDoubleBracket]\)\(3\)\(\
\[RightDoubleBracket]\)\)]\), GCD[
\!\(\*SubscriptBox[\(x\), \(\(\[LeftDoubleBracket]\)\(1\)\(\
\[RightDoubleBracket]\)\)]\) + 1, 
\!\(\*SubscriptBox[\(x\), \(\(\[LeftDoubleBracket]\)\(3\)\(\
\[RightDoubleBracket]\)\)]\)]], 2 
\!\(\*SubscriptBox[\(x\), \(\(\[LeftDoubleBracket]\)\(1\)\(\
\[RightDoubleBracket]\)\)]\) 
\!\(\*SubscriptBox[\(x\), \(\(\[LeftDoubleBracket]\)\(4\)\(\
\[RightDoubleBracket]\)\)]\) - (-1)^x[[1]] 
\!\(\*SubscriptBox[\(x\), \(\(\[LeftDoubleBracket]\)\(6\)\(\
\[RightDoubleBracket]\)\)]\), 2 (
\!\(\*SubscriptBox[\(x\), \(\(\[LeftDoubleBracket]\)\(1\)\(\
\[RightDoubleBracket]\)\)]\) + 1) 
\!\(\*SubscriptBox[\(x\), \(\(\[LeftDoubleBracket]\)\(5\)\(\
\[RightDoubleBracket]\)\)]\), 2 
\!\(\*SubscriptBox[\(x\), \(\(\[LeftDoubleBracket]\)\(5\)\(\
\[RightDoubleBracket]\)\)]\), (2 
\!\(\*SubscriptBox[\(x\), \(\(\[LeftDoubleBracket]\)\(1\)\(\
\[RightDoubleBracket]\)\)]\) + 5) 
\!\(\*SubscriptBox[\(x\), \(\(\[LeftDoubleBracket]\)\(7\)\(\
\[RightDoubleBracket]\)\)]\)}, {1, 0, 1, 0, 1, 1, 1}, x |-> 
\!\(\*SubscriptBox[\(x\), \(\(\[LeftDoubleBracket]\)\(7\)\(\
\[RightDoubleBracket]\)\)]\) > 
\!\(\*SubscriptBox[\(x\), \(\(\[LeftDoubleBracket]\)\(2\)\(\
\[RightDoubleBracket]\)\)]\)^2 (
\!\(\*SubscriptBox[\(x\), \(\(\[LeftDoubleBracket]\)\(4\)\(\
\[RightDoubleBracket]\)\)]\) (BitLength[
\!\(\*SubscriptBox[\(x\), \(\(\[LeftDoubleBracket]\)\(3\)\(\
\[RightDoubleBracket]\)\)]\)] - 1) - 
\!\(\*SubscriptBox[\(x\), \(\(\[LeftDoubleBracket]\)\(5\)\(\
\[RightDoubleBracket]\)\)]\))]

will never terminate.

For successive n the quantity above is given by:

Table
&#10005
Table[(2 n + 3)!!/
  15 - ((2 n - 2)!! PrimePi[n]^2)/
   2 ((IntegerLength[Fold[LCM, Range[n]], 2] - 1) \!\(
\*UnderoverscriptBox[\(\[Sum]\), \(k\), \(n - 1\)]
\*FractionBox[
SuperscriptBox[\((\(-1\))\), \(k + 1\)], \(k\)]\) - 2 n), {n, 10}]

At least at the beginning the numbers are definitely positive, as the Riemann Hypothesis would suggest. But if we ask about the long-term behavior we can see something of the complexity involved by looking at the differences in successive ratios:

GraphicsRow
&#10005
GraphicsRow[
 ListStepPlot[
    Differences[
     Ratios[Table[(2 n + 3)!!/5!! - 
        PrimePi[n]^2 ((2 n - 2)!! Sum[(-1)^(k + 1)/k, {k, n - 1}]/
             2 Floor[Log2[ Apply[LCM, Range[n] ]]] - (2 n)!!/
            2!!), {n, #}]]], Frame -> True, 
    PlotStyle -> Hue[0.07, 1, 1], AspectRatio -> 1/3] & /@ {100, 
   1000}]

The Riemann Hypothesis effectively says that there aren’t too many negative differences here.

Other Tag Systems

So far we’ve been talking specifically about Emil Post’s particular 00, 1101 tag system. But as Post himself observed, one can define plenty of other tag systems—including ones that involve not just 0 and 1 but any number of possible elements (Post called the number of possible elements μ, but I’ll call it k), and delete not just 3 but any number of elements at each step (Post called this ν, but I’ll call it r).

It’s easy to see that rules which delete only one element at each step (r = 1) cannot involve real “communication” (or causal connections) between different parts of the string, and must be equivalent to neighbor-independent substitution systems—so that they either have trivial behavior, or grow without bound to produce at most highly regular nested sequences. (001, 110 will generate the Thue–Morse string, while 001, 10 will generate the Fibonacci string.)
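
Here’s a minimal sketch of that equivalence for the 001, 110 rule, starting, say, from 0 (tagStep is a hypothetical helper; ThueMorse is the built-in sequence):

(* the r = 1 tag system 0 -> 01, 1 -> 10: delete one element, append the corresponding block *)
tagStep[s_] := Join[Rest[s], If[First[s] == 0, {0, 1}, {1, 0}]];
Nest[tagStep, {0}, 2^4 - 1] == ThueMorse[Range[0, 2^4 - 1]]
(* -> True: after 2^k - 1 steps the string is the length-2^k Thue-Morse prefix *)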

Things immediately get more complicated when two elements are deleted at each step (r = 2). Post correctly observed that with just 0 and 1 (k = 2) there are no rules that show the kind of sometimes-expanding, sometimes-contracting behavior of his 00, 1101 rule. But back in 2007—as part of a live experiment at our annual Summer School—I looked at the r = 2 rule 01, 1110. Here’s what it does starting with 10:

ArrayPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ArrayPlot[
 PadRight[TSGDirectEvolveList[{2, {{1}, {1, 1, 0}}}, {1, 0}, 25], 
  Automatic, .25], Mesh -> True, MeshStyle -> GrayLevel[0.75, 0.75]]

And here’s how the sequence of string lengths behaves:

ListStepPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ListStepPlot[
 TagLengthFunction[{2, {{1}, {1, 1, 0}}}][{1, 0}, 60], Center, 
 AspectRatio -> 1/3, Filling -> Axis, Frame -> True, 
 PlotStyle -> Hue[0.07, 1, 1]]

If we assume that 0 and 1 appear randomly with certain probabilities, then a simple calculation shows that 1 should occur about √2 + 1 ≈ 2.4 times as often as 0, and the string should grow by an average of √2 – 1 ≈ 0.41 elements at each step. So “detrending” by this, we get:

ListStepPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ListStepPlot[
 MapIndexed[# - (Sqrt[2] - 1) First[#2] &, 
  TagLengthFunction[{2, {{1}, {1, 1, 0}}}][{1, 0}, 300]], Center, 
 AspectRatio -> 1/4, Filling -> Axis, Frame -> True, 
 PlotStyle -> Hue[0.07, 1, 1]]
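
The detrending constant Sqrt[2] – 1 used above comes from a simple mean-field balance: if q is the long-run fraction of 1s, the elements appended per step (two 1s and a 0 after a 1; a single 1 after a 0) must reproduce that same fraction. A minimal sketch:

(* composition balance for the rule 0 -> 1, 1 -> 110 (deleting 2 per step) *)
Solve[q == (q + 1)/(2 q + 1) && 0 < q < 1, q]
(* -> q = 1/Sqrt[2]; the average length change per step is then 2 q - 1 = Sqrt[2] - 1 *)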

Continuing for more steps we see a close approximation to a random walk:

ListStepPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ListStepPlot[
 MapIndexed[# - (Sqrt[2] - 1) First[#2] &, 
  TagLengthFunction[{2, {{1}, {1, 1, 0}}}][{1, 0}, 10000]], Center, 
 AspectRatio -> 1/4, Filling -> Axis, Frame -> True, 
 PlotStyle -> Hue[0.07, 1, 1]]

So just like with Post’s 00, 1101 rule—and, of course, with rule 30 and all sorts of other systems in the computational universe—we have here a completely deterministic system that generates what seems like randomness. And indeed among tag systems of the type we’re discussing here this appears to be the very simplest rule that shows this kind of behavior.

But does this rule show the same kind of growth from all initial conditions? It can show different random sequences, for example here for initial conditions 5:17 and 7:80:

ListStepPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ListStepPlot[
   MapIndexed[# - (Sqrt[2] - 1) First[#2] &, 
    TagLengthFunction[{2, {{1}, {1, 1, 0}}}][#, 300]], Center, 
   AspectRatio -> 1/4, Filling -> Axis, Frame -> True, 
   PlotStyle -> Hue[0.07, 1, 1]] & /@ {IntegerDigits[17, 2, 5], 
  IntegerDigits[80, 2, 7]}

And sometimes it just immediately enters a cycle. But it has some “surprises” too. Like with initial condition 9:511 (i.e. 111111111) it grows, but not linearly (shown here without any detrending):

ListStepPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ListStepPlot[
 TagLengthFunction[{2, {{1}, {1, 1, 0}}}][{1, 1, 1, 1, 1, 1, 1, 1, 1},
   150], Center, AspectRatio -> 1/4, Filling -> Axis, Frame -> True, 
 PlotStyle -> Hue[0.07, 1, 1]]

But what about a tag system that doesn’t seem to “typically grow forever”? When I was working on A New Kind of Science I studied generalized tag systems that don’t just look at their first elements, but instead use the whole block of elements they’re deleting to determine what elements to add at the end (and so work in a somewhat more “cellular-automaton-style” way).

One particular rule that I showed in A New Kind of Science (as case (c) on page 94) is:

Text
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						Text[Map[Row, {{0, 0} -> {0}, {1, 0} -> {1, 0, 1}, {0, 1} -> {0, 0, 
     0}, {1, 1} -> {0, 1, 1}}, {2}]]

Starting with 11 this rule gives

ArrayPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ArrayPlot[
 PadRight[GCSSEvolveList[{2, {{0, 0} -> {0}, {1, 0} -> {1, 0, 1}, {0, 
       1} -> {0, 0, 0}, {1, 1} -> {0, 1, 1}}}, {1, 1}, 25], 
  Automatic, .25], Mesh -> True, MeshStyle -> GrayLevel[0.75, 0.75]]

and grows for a while—but then terminates after 289 steps:

ListStepPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ListStepPlot[
 Length /@ 
  GCSSEvolveList[{2, {{0, 0} -> {0}, {1, 0} -> {1, 0, 1}, {0, 
       1} -> {0, 0, 0}, {1, 1} -> {0, 1, 1}}}, {1, 1}, 300], Center, 
 AspectRatio -> 1/4, Filling -> Axis, Frame -> True, 
 PlotStyle -> Hue[0.07, 1, 1]]

The corresponding generational evolution is:

ArrayPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ArrayPlot[
 Reverse@Transpose[
   PadRight[
    GCSSGenerationEvolveList[{2, {{0, 0} -> {0}, {1, 0} -> {1, 0, 
         1}, {0, 1} -> {0, 0, 0}, {1, 1} -> {0, 1, 1}}}, {1, 1}, 
     35], {Automatic, 38}, .25]], Mesh -> True, 
 MeshStyle -> GrayLevel[.75, .75], Frame -> False]

(Note that the kind of “phase decomposition” that we did for Post’s tag system doesn’t make sense for a block tag system like this.)

Here are the lengths of the transients+cycles for possible initial conditions up to size 7:

With
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						With[{list = Catenate[Table[Tuples[{0, 1}, n], {n, 7}]]}, 
 ListStepPlot[
  Transpose[((Length /@ 
        FindTransientRepeat[
         GCSSEvolveList[{2, {{0, 0} -> {0}, {1, 0} -> {1, 0, 1}, {0, 
              1} -> {0, 0, 0}, {1, 1} -> {0, 1, 1}}}, #, 1000], 
         4]) & /@ list)], Center, 
  PlotStyle -> {Hue[0.1, 1, 1], Hue[0.02, 0.92, 0.8200000000000001]}, 
  PlotRange -> {0, 800}, PlotLayout -> "Stacked", Joined -> True, 
  Filling -> Automatic, Frame -> True, AspectRatio -> 1/5, 
  FrameTicks -> {{Automatic, 
     None}, {Extract[
      MapThread[
       List[#1, 
         Rotate[Style[StringJoin[ToString /@ #2], 
           FontFamily -> "Roboto", Small], 90 Degree]] &, {Range[0, 
         253], list}], 
      Position[list, 
       Alternatives @@ 
        Select[list, 
         IntegerExponent[FromDigits[#, 2], 2] > Length[#]/2 && 
           Length[#] > 1 &]]], None}}]]

This looks more irregular—and “livelier”—than the corresponding plot for Post’s tag system, but not fundamentally different. At size 5 the initial string 11010 (denoted 5:12) yields

ListStepPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ListStepPlot[
 Length /@ 
  GCSSEvolveList[{2, {{0, 0} -> {0}, {1, 0} -> {1, 0, 1}, {0, 
       1} -> {0, 0, 0}, {1, 1} -> {0, 1, 1}}}, {1, 1, 0, 1, 0}, 
   800], Center, AspectRatio -> 1/4, Filling -> Axis, Frame -> True, 
 PlotStyle -> Hue[0.07, 1, 1]]

which reaches a length-8 cycle after 706 steps. Going further one sees a sequence of progressively longer transients:

Text
&#10005
Text[Grid[
  Prepend[{Row[{#[[1, 1]], ":", #[[1, 2]]}], #[[2, 1]], #[[2, 
       2]]} & /@ {{2, 3} -> {288, 1}, {5, 12} -> {700, 8}, {6, 
       62} -> {4184, 1}, {8, 175} -> {20183, 8}, {9, 345} -> {26766, 
       1}, {9, 484} -> {51680, 8}, {10, 716} -> {100285, 1}, {10, 
       879} -> {13697828, 8}, {13, 7620} -> {7575189088, 1}, {17, 
       85721} -> {14361319032, 8}}, 
   Style[#, Italic] & /@ {"initial state", "steps", "cycle length"}], 
  Frame -> All, Alignment -> {{Left, Right, Right}}, 
  FrameStyle -> GrayLevel[.7], Background -> {None, {GrayLevel[.9]}}]]
xevollist
&#10005

But like with Post’s tag system, the system always eventually reaches a cycle (or terminates)—at least for all initial strings up to size 17. But what will happen for the longest initial strings is not clear, and the greater “liveliness” of this system relative to Post’s suggests that if exotic behavior occurs, it will potentially do so for smaller initial strings than in Post’s system.

Another way to generalize Post’s 00, 1101 tag system is to consider not just elements 0, 1, but, say, 0, 1, 2 (i.e. k = 3). And in this case there is already complex behavior even with rules that consider just the first element, and delete two elements at each step (r = 2).

As an example, consider the rule:

#1 -> Row
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						#1 -> Row[#2] & @@@ 
 Thread[Range[0, 2] -> TakeList[IntegerDigits[76, 3, 6], {1, 2, 3}]]

Starting, say, with 101 this gives

ArrayPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ArrayPlot[
 PadRight[TSGDirectEvolveList[{2, 
    TakeList[IntegerDigits[76, 3, 6], {1, 2, 3}]}, 
   IntegerDigits[10, 3, 3], 20], Automatic, .25], Mesh -> True, 
 MeshStyle -> GrayLevel[.85, .75], 
 ColorRules -> {0 -> White, 1 -> Hue[.03, .9, 1], 
   2 -> Hue[.7, .8, .5], -1 -> GrayLevel[.85]}]

which terminates after 74 steps:

ListStepPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ListStepPlot[
 Length /@ 
  TSGDirectEvolveList[{2, 
    TakeList[IntegerDigits[76, 3, 6], {1, 2, 3}]}, 
   IntegerDigits[10, 3, 3], 250], Center, AspectRatio -> 1/4, 
 Filling -> Axis, Frame -> True, PlotStyle -> Hue[0.07, 1, 1]]

Here are the lengths of transients+cycles for this rule up to length-6 initial (ternary) strings:

With
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						With[{r = 76, 
  list = Catenate[
    Table[IntegerDigits[i, 3, n],  {n, 1, 6}, {i, 0, 3^n - 1}]]}, 
 ListStepPlot[
  Transpose[
   Last /@ Monitor[
     Flatten[Table[
       ParallelTable[{n, i} -> 
         Length /@ 
          FindTransientRepeat[
           Length /@ 
            TSGDirectEvolveList[{2, 
              TakeList[IntegerDigits[r, 3, 6], {1, 2, 3}]}, 
             IntegerDigits[i, 3, n], 1000], 10], {i, 0, 3^n - 1}], {n,
         6}]], n]], Center, PlotRange -> {0, 125}, 
  PlotStyle -> {Hue[0.1, 1, 1], Hue[0.02, 0.92, 0.8200000000000001]}, 
  PlotLayout -> "Stacked", Joined -> True, Filling -> Automatic, 
  Frame -> True, AspectRatio -> 1/5, 
  FrameTicks -> {{Automatic, 
     None}, {Extract[
      MapThread[
       List[#1, 
         Rotate[Style[StringJoin[ToString /@ #2], 
           FontFamily -> "Roboto", Small], 90 Degree]] &, {Range[0, 
         1091], list}], 
      Position[list, 
       Alternatives @@ 
        Select[list, 
         IntegerExponent[FromDigits[#, 3], 3] > Length[#]/2 && 
           Length[#] =!= 3 && Length[#] > 1 &]]], None}}]]

The initial string 202020 (denoted 6:546, where now this indicates ternary rather than binary) terminates after 6627 steps

ListStepPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ListStepPlot[
 Length /@ 
  TSGDirectEvolveList[{2, 
    TakeList[IntegerDigits[76, 3, 6], {1, 2, 3}]}, 
   IntegerDigits[546, 3, 6], 10000], Center, AspectRatio -> 1/4, 
 Filling -> Axis, Frame -> True, PlotStyle -> Hue[0.07, 1, 1]]

with (phase-reduced) generational evolution:

ArrayPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ArrayPlot[
 Reverse@Transpose[
   PadRight[
    Take[#, 1 ;; -1 ;; 2] & /@ 
     TSGGenerationEvolveList[{2, 
       TakeList[IntegerDigits[76, 3, 6], {1, 2, 3}]}, 
      IntegerDigits[546, 3, 6], 180], {Automatic, 95}, .25]], 
 Frame -> False, 
 ColorRules -> {0 -> White, 1 -> Hue[.03, .9, 1], 
   2 -> Hue[.7, .8, .5], -1 -> GrayLevel[.85]}]

And once again, the overall features of the behavior are very similar to Post’s system, with the longest halting times seen up to strings of length 14 being:

Text
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						Text[Grid[
  Prepend[{DecimalStringForm[{#[[1, 1]], #[[1, 2]]}], #[[2, 1]], 
      If[# == 0, Style[#, Gray], #] &@#[[2, 2]]} & /@ {{3, {0, 
        10}} -> {74, 0}, {5, {0, 91}} -> {122, 
       0}, {6, {0, 546}} -> {6627, 0}, {9, {0, 499}} -> {9353, 
       0}, {9, {0, 610}} -> {12789, 0}, {9, {0, 713}} -> {20175, 
       0}, {9, {0, 1214}} -> {175192, 0}, {9, {0, 18787}} -> {336653, 
       0}, {10, {0, 17861}} -> {519447, 
       0}, {10, {0, 29524}} -> {21612756, 
       6}, {10, {0, 52294}} -> {85446023, 
       0}, {11, {0, 93756}} -> {377756468, 
       6}, {12, {0, 412474}} -> {30528772851, 0}}, 
   Style[#, Italic] & /@ {"initial state", "steps", "cycle length"}], 
  Frame -> All, Alignment -> {{Left, Right, Right}}, 
  FrameStyle -> GrayLevel[.7], Background -> {None, {GrayLevel[.9]}}]]

But what about other possible rules? As an example, we can look at all 90 possible k = 3, r = 2 rules of the form 0_, 1__, 2___ in which the right-hand sides are “balanced” in the sense that in total they contain two 0s, two 1s and two 2s. This shows, for each of these rules, the evolution (for 100 steps) from the initial string with fewer than 7 elements that gives the longest transient:
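
A quick way to confirm the count of 90 (under the reading that “balanced” means two of each symbol across the combined right-hand sides):

(* right-hand sides for 0, 1, 2 have lengths 1, 2, 3; count arrangements of the multiset {0,0,1,1,2,2} *)
Length[Select[Tuples[{0, 1, 2}, 6], Sort[#] == {0, 0, 1, 1, 2, 2} &]]
(* -> 90 *)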

GraphicsGrid
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						GraphicsGrid[
 Partition[
  ParallelMap[
   ListStepPlot[
     Length /@ 
      TSGDirectEvolveList[{2, 
        TakeList[IntegerDigits[#[[1]], 3, 6], {1, 2, 3}]}, 
       IntegerDigits[#[[2, 2]], 3, #[[2, 1]]], 100], Center, 
     PlotRange -> {{0, 100}, Automatic}, AspectRatio -> 1/3, 
     Filling -> Axis, Frame -> True, FrameTicks -> False, 
     PlotStyle -> Hue[0.07, 1, 1]] &, {44 -> {5, 182}, 50 -> {6, 492},
     52 -> {3, 20}, 68 -> {2, 6}, 70 -> {5, 19}, 76 -> {6, 546}, 
    98 -> {3, 20}, 104 -> {3, 2}, 106 -> {5, 182}, 116 -> {5, 182}, 
    128 -> {6, 492}, 132 -> {5, 182}, 140 -> {6, 540}, 
    142 -> {5, 181}, 146 -> {4, 60}, 150 -> {5, 163}, 154 -> {3, 10}, 
    156 -> {5, 100}, 176 -> {6, 270}, 178 -> {6, 540}, 
    184 -> {6, 270}, 194 -> {5, 173}, 196 -> {6, 57}, 200 -> {5, 182},
     204 -> {6, 543}, 208 -> {5, 173}, 210 -> {6, 486}, 
    220 -> {5, 91}, 226 -> {5, 100}, 228 -> {5, 91}, 260 -> {5, 182}, 
    266 -> {6, 492}, 268 -> {5, 182}, 278 -> {5, 182}, 
    290 -> {6, 492}, 294 -> {5, 164}, 302 -> {6, 519}, 304 -> {6, 30},
     308 -> {6, 492}, 312 -> {6, 489}, 316 -> {6, 546}, 
    318 -> {6, 546}, 332 -> {6, 540}, 344 -> {6, 492}, 
    348 -> {5, 182}, 380 -> {6, 519}, 384 -> {6, 270}, 
    396 -> {6, 276}, 410 -> {5, 101}, 412 -> {6, 543}, 
    416 -> {6, 543}, 420 -> {6, 57}, 424 -> {6, 489}, 426 -> {5, 164},
     434 -> {6, 273}, 438 -> {6, 513}, 450 -> {6, 543}, 
    460 -> {6, 516}, 462 -> {5, 99}, 468 -> {6, 30}, 500 -> {6, 546}, 
    502 -> {5, 181}, 508 -> {6, 6}, 518 -> {5, 99}, 520 -> {6, 516}, 
    524 -> {6, 543}, 528 -> {5, 99}, 532 -> {3, 9}, 534 -> {6, 546}, 
    544 -> {5, 181}, 550 -> {6, 519}, 552 -> {5, 181}, 
    572 -> {6, 540}, 574 -> {5, 181}, 578 -> {3, 10}, 582 -> {5, 172},
     586 -> {6, 546}, 588 -> {6, 513}, 596 -> {5, 180}, 
    600 -> {5, 18}, 612 -> {6, 546}, 622 -> {6, 519}, 624 -> {6, 513},
     630 -> {6, 519}, 652 -> {6, 270}, 658 -> {5, 19}, 
    660 -> {6, 540}, 676 -> {6, 57}, 678 -> {6, 297}, 
    684 -> {6, 30}}], 6]]

Many lead quickly to cycles or termination. Others after 100 steps seem to be growing irregularly, but all the specific evolutions shown here eventually halt. There are peculiar cases, like 00, 102, 2112 which precisely repeats the initial string 20 after 18,255 steps:

ListStepPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ListStepPlot[
 Length /@ 
  TSGDirectEvolveList[{2, 
    TakeList[IntegerDigits[68, 3, 6], {1, 2, 3}]}, 
   IntegerDigits[6, 3, 2], 40000], Center, AspectRatio -> 1/5, 
 Filling -> Axis, Frame -> True, PlotStyle -> Hue[0.07, 1, 1]]

And then there are cases like 00, 101, 2212, say starting from 200020, which either halt quickly, or generate strings of ever-increasing length and can easily be seen never to halt:

ListStepPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ListStepPlot[
 Length /@ 
  TSGDirectEvolveList[{2, 
    TakeList[IntegerDigits[50, 3, 6], {1, 2, 3}]}, 
   IntegerDigits[492, 3, 6], 100], Center, AspectRatio -> 1/3, 
 Filling -> Axis, Frame -> True, PlotStyle -> Hue[0.07, 1, 1]]

(By the way, the situation with “non-balanced” k = 3 rules is not fundamentally different from balanced ones; 00, 122, 2102, for example, shows very “Post-like” behavior.)

The tag systems we’ve been discussing are pretty simple. But an even simpler version considered in A New Kind of Science is what I called cyclic tag systems. In a cyclic tag system one removes the first element of the string at each step. On successive steps, one cycles through a collection of possible blocks to add, appending the current block if the deleted element was a 1 (and otherwise appending nothing).
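
Here’s a minimal sketch of such an evolution (ctEvolve is a hypothetical helper, distinct from the CTEvolveList function used for the pictures below; the phase convention for which block is “current” at the first step is an assumption):

(* blocks are cycled step by step; a block is appended only when the deleted element is 1 *)
ctEvolve[blocks_, init_, t_] := FoldList[
   Function[{s, i},
    If[s === {}, {},
     Join[Rest[s], If[First[s] == 1, blocks[[Mod[i, Length[blocks], 1]]], {}]]]],
   init, Range[t]];
Length /@ ctEvolve[{{1, 1, 1}, {0}}, {1}, 12]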

If the possible blocks to add are 111 and 0, then the behavior starting from the string 1 is as follows

ArrayPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ArrayPlot[
 PadRight[CTEvolveList[{{1, 1, 1}, {0}}, {1}, 25], {Automatic, 
   18}, .25], Mesh -> True, MeshStyle -> GrayLevel[0.75, 0.75]]

with the lengths “detrended by t/2” behaving once again like an approximate random walk:

ListStepPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ListStepPlot[
 MapIndexed[# - First[#2]/2 &, 
  Length /@ CTEvolveList[{{1, 1, 1}, {0}}, {1}, 20000]], Center, 
 AspectRatio -> 1/4, Filling -> Axis, Frame -> True, 
 PlotStyle -> Hue[0.07, 1, 1]]

With cycles of just 2 blocks, one typically sees either quick cycling or termination, or what seems like obvious infinite growth. But if one allows a cycle of 3 blocks, more complicated halting behavior becomes possible.

Consider for example 01, 0, 011. Starting from 0111 one gets

ArrayPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ArrayPlot[
 PadRight[CTEvolveList[{{0, 1}, {0}, {0, 1, 1}}, {0, 1, 1, 1}, 
   20], {Automatic, 8}, .25], Mesh -> True, 
 MeshStyle -> GrayLevel[0.75, 0.75]]

with the system halting after 169 steps:

ListStepPlot
&#10005
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ListStepPlot[
 Length /@ 
  CTEvolveList[{{0, 1}, {0}, {0, 1, 1}}, {0, 1, 1, 1}, 200], Center, 
 AspectRatio -> 1/4, Filling -> Axis, Frame -> True, 
 PlotStyle -> Hue[0.07, 1, 1]]

Here are the transient+cycle times for initial strings up to size 8 (the system usually just terminates, but for example 001111 goes into a cycle of length 18):

With
&#10005
With[{r = {{0, 1}, {0}, {0, 1, 1}}, 
  list = Catenate[
    Table[IntegerDigits[i, 2, n],  {n, 1, 8}, {i, 0, 2^n - 1}]]}, 
 ListStepPlot[
  Transpose[
   Last /@ Monitor[
     Flatten[Table[
       ParallelTable[{n, i} -> 
         Length /@ 
          FindTransientRepeat[
           CTLengthList[r, IntegerDigits[i, 2, n], 800], 3], {i, 0, 
         2^n - 1}], {n, 8}]], n]], Center, PlotRange -> {0, 500}, 
  PlotStyle -> {Hue[0.1, 1, 1], Hue[0.02, 0.92, 0.8200000000000001]}, 
  PlotLayout -> "Stacked", Joined -> True, Filling -> Automatic, 
  Frame -> True, AspectRatio -> 1/5, 
  FrameTicks -> {{Automatic, 
     None}, {Extract[
      MapThread[
       List[#1, 
         Rotate[Style[StringJoin[ToString /@ #2], 
           FontFamily -> "Roboto", Small], 90 Degree]] &, {Range[0, 
         509], list}], 
      Position[list, 
       Alternatives @@ 
        Select[list, 
         IntegerExponent[FromDigits[#, 2], 2] > Length[#]/1.5 && 
           Length[#] > 2 &]]], None}}]]

The behavior of the longest-to-halt-so-far “winners” is again similar to what we have seen before—except perhaps for the rather huge jump in halting time at size 13, which isn’t surpassed until size 16:

Text
Text[Grid[
  Prepend[MapIndexed[{Style[Row[{#[[1, 1]], ":", #[[1, 2]]}], 
       If[First[#2] > 6, Gray, Black]], 
      Style[#[[2]], If[First[#2] > 6, Gray, Black]]} &, {{1, 1} -> 
      59, {4, 7} -> 169, {5, 21} -> 1259, {7, 126} -> 
      6470, {10, 687} -> 134318, {13, 7655} -> 10805957330 (* ,{13,
     7901}\[Rule]180044,{13,7903}\[Rule]2431313,{14,
     12270}\[Rule]7490186,{16,14999}\[Rule]3367712,{16,
     15055}\[Rule]12280697,{16,43961}\[Rule]27536759 *)}], 
   Style[#, Italic] & /@ {"initial state", "steps"}], Frame -> All, 
  Alignment -> {{Left, Right, Right}}, FrameStyle -> GrayLevel[.7], 
  Background -> {None, {GrayLevel[.9]}}]]

What Can It Compute?

When Post originally invented tag systems in 1920 he intended them as a string-based idealization of the operations in mathematical proofs. But a decade and a half later, once Turing machines were known, it started to be clear that tag systems were better framed as being computational systems. And by the 1940s it was known that at least in principle string-rewriting systems of the kind Post used were capable of doing exactly the same types of computations as Turing machines—or, as we would say now, that they were computation universal.

At first what was proved was that a fairly general string-rewriting system was computation universal. But by the early 1960s it was known that a tag system that looks only at its first element is also universal. And in fact it’s not too difficult to write a “compiler” that takes any Turing machine rule and converts it to a tag system rule—and page 670 of A New Kind of Science is devoted to showing a pictorial example of how this works:

Emulating a Turing machine with a tag system

For example we can take the simplest universal Turing machine (which has 2 states and 3 colors) and compile it into a 2-element-deletion tag system with 32 possible elements (the ones above 9 represented by letters) and rules:


But what about a tag system like Post’s 00, 1101 one—with much simpler rules? Could it also be universal?

Our practical experience with computers might make us think that to get universality we would necessarily have to have a system with complicated rules. But the surprising conclusion suggested by the Principle of Computational Equivalence is that this is not correct—and that instead essentially any system whose behavior is not obviously simple will actually be capable of universal computation.

For any particular system it’s usually extremely difficult to prove this. But we now have several examples that seem to validate the Principle of Computational Equivalence—in particular the rule 110 cellular automaton and the 2,3 Turing machine. And this leads us to the conjecture that even tag systems with very simple rules (at least ones whose overall behavior is not obviously simple) should also be computation universal.

How can we get evidence for this? We might imagine that we could see a particular tag system “scanning over” a wide range of computations as we change its initial conditions. Of course, computation universality just says that it must be possible to construct an initial condition that performs any given computation. And it could be that to perform any decently sophisticated computation would require an immensely complex initial condition, that would never be “found naturally” by scanning over possible initial conditions.

But the Principle of Computational Equivalence actually goes further than just saying that all sorts of systems can in principle do sophisticated computations; it says that such computations should be quite ubiquitous among possible initial conditions. There may be some special initial conditions that lead to simple behavior. But other initial conditions should produce behavior that corresponds to a computation that is in a sense as sophisticated as any other computation.

And a consequence of this is that the behavior we see will typically be computationally irreducible: that in general there will be no way to compute its outcome much more efficiently than just by following each of its steps. Or, in other words, when we observe the system, we will have no way to computationally reduce it—and so its behavior will seem to us complex.

So when we find behavior in tag systems that seems to us complex—and that we do not appear able to analyze or predict—the expectation is that it must correspond to a sophisticated computation, and be a sign that the tag system follows the Principle of Computational Equivalence and is computation universal.

But what actual computations do particular tag systems do? Clearly they do the computations that are defined by their rules. But the question is whether we can somehow also interpret the overall computations they do in terms of familiar concepts, say in mathematics or computer science.

Consider for example the 2-element-deletion tag system with the rule 1 → 111. Starting it off with 11 we get


and we can see that the tag system in effect just “counts up in unary”. (The 1-element-deletion rule 1 → 11 does the same thing.)
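To make these examples reproducible without the cloud-hosted package, here is a minimal sketch of a general ν-element-deletion tag system (the names tagStep and tagEvolve are mine, not the TSEvolveList and relatives used in the code here):

(* read the first element, delete the first nu elements, then append the block that the rules give for that first element *)
tagStep[nu_Integer, rules_List][s_List] := Join[Drop[s, nu], First[s] /. rules]

(* evolve for up to t steps, stopping ("halting") once fewer than nu elements remain *)
tagEvolve[nu_Integer, rules_List, init_List, t_Integer] := NestWhileList[tagStep[nu, rules], init, Length[#] >= nu &, 1, t]

For the unary counter just mentioned, tagEvolve[2, {1 -> {1, 1, 1}}, {1, 1}, 5] gives strings of 1s that grow by one element at each step.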

Now consider the tag system with rules:

First
First[#] -> Row[Last[#]] & /@ {1 -> {2, 2}, 2 -> {1, 1, 1, 1}}

Starting it with 11 we get

Column
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						Column[Row /@ 
  TSEvolveList[{2, {1 -> {2, 2}, 2 -> {1, 1, 1, 1}}}, {1, 1}, 8]]

or more pictorially (red is 1, blue is 2):

ArrayPlot
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ArrayPlot[
 PadRight[TSEvolveList[{2, {1 -> {2, 2}, 2 -> {1, 1, 1, 1}}}, {1, 1}, 
   34], Automatic], Mesh -> True, MeshStyle -> GrayLevel[.75, .75], 
 ColorRules -> {3 -> White, 1 -> Hue[.03, .9, 1], 
   2 -> Hue[.7, .8, .5], 0 -> GrayLevel[.85]}]

But now look at steps where strings of only 1s appear. The number of 1s in these strings forms the sequence

Total /@ Cases
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						Total /@ Cases[
  TSEvolveList[{2, {1 -> {2, 2}, 2 -> {1, 1, 1, 1}}}, {1, 1}, 
   1000], {1 ...}]

of successive powers of 2. (The 1-element-deletion rule 1 → 2, 2 → 11 gives the same sequence.)
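Using the tagEvolve sketch above, one can check this variant directly; the following picks out the all-1s strings and should give successive powers of 2:

Total /@ Cases[tagEvolve[1, {1 -> {2}, 2 -> {1, 1}}, {1, 1}, 200], {1 ...}]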

The rule

First
First[#] -> Row[Last[#]] & /@ {1 -> {2, 2}, 2 -> {1, 1, 1}}

starting from 11 yields instead

ArrayPlot
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ArrayPlot[
 PadRight[TSEvolveList[{2, {1 -> {2, 2}, 2 -> {1, 1, 1}}}, {1, 1}, 
   80], Automatic], MeshStyle -> GrayLevel[.75, .75], Frame -> False, 
 ColorRules -> {3 -> White, 1 -> Hue[.03, .9, 1], 
   2 -> Hue[.7, .8, .5], 0 -> GrayLevel[.85]}]

and now the lengths of the sequences of 1s form the sequence:

Total /@ Cases
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						Total /@ Cases[
  TSEvolveList[{2, {1 -> {2, 2}, 2 -> {1, 1, 1}}}, {1, 1}, 
   10000], {1 ...}]

This sequence is not as familiar as powers of 2, but it still has a fairly traditional “mathematical interpretation”: it is the result of iterating

n |-> Ceiling
n |-> Ceiling[(3 n)/2]

or

n |-> If
n |-> If[EvenQ[n], (3 n)/2, (3 n + 1)/2 ]

(and this same iteration applies for any initial string of 1s of any length).
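As a quick look at the iteration itself (independent of the tag system), starting from n = 2:

NestList[If[EvenQ[#], 3 #/2, (3 # + 1)/2] &, 2, 10]

(* {2, 3, 5, 8, 12, 18, 27, 41, 62, 93, 140} *)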

But consider now the rule:

First
First[#] -> Row[Last[#]] & /@ {1 -> {1, 2}, 2 -> {1, 1, 1}}

Here is what it does starting with sequences of 1s of different lengths:

Row
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						Row[Table[
  ArrayPlot[
   PadRight[
    TSEvolveList[{2, {1 -> {1, 2}, 2 -> {1, 1, 1}}}, Table[1, k], 
     100]], ImageSize -> {Automatic, 150}, 
   ColorRules -> {3 -> White, 1 -> Hue[.03, .9, 1], 
     2 -> Hue[.7, .8, .5], 0 -> GrayLevel[.85]}], {k, 2, 20}], 
 Spacer[2]]

In effect it is taking the initial number of 1s n and computing the function:

ListStepPlot
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ListStepPlot[
 ParallelTable[
  Last[Total /@ 
    Cases[TSEvolveList[{2, {1 -> {1, 2}, 2 -> {1, 1, 1}}}, 
      Table[1, k], 100000], {1 ...}]], {k, 1, 100}], Center, 
 Filling -> Axis, Frame -> True, PlotRange -> All, 
 AspectRatio -> 1/3, PlotStyle -> Hue[0.07, 1, 1]]

But what “is” this function? In effect it depends on the binary digits of n, and turns out to be given (for n > 1) by:

With
With[{e = IntegerExponent[n + 1, 2]}, (3^e (n + 1))/2^e - 1]
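Written as an explicit function (the name fTag is mine), its first few values for n > 1 are:

fTag[n_Integer] := With[{e = IntegerExponent[n + 1, 2]}, (3^e (n + 1))/2^e - 1]
fTag /@ Range[2, 12]

(* {2, 8, 4, 8, 6, 26, 8, 14, 10, 26, 12} *)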

What other “identifiable functions” can simple tag systems produce? Consider the rules:

First
First[#] -> Row[Last[#]] & /@ {1 -> {2, 3}, 2 -> {1}, 3 -> {1, 1, 1}}

Starting with a string of five 1s this gives (3 is white)

ArrayPlot
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ArrayPlot[
 PadRight[TSEvolveList[{2, {1 -> {2, 3}, 2 -> {1}, 3 -> {1, 1, 1}}}, 
   Table[1, 5], 100], {22, 10}], Mesh -> True, 
 ColorRules -> {3 -> White, 1 -> Hue[.03, .9, 1], 
   2 -> Hue[.7, .8, .5], 0 -> GrayLevel[.85]}, 
 MeshStyle -> GrayLevel[0.85, 0.75]]

in effect running for 21 steps and then terminating. If one looks at the strings of 1s produced here, their sequence of lengths is 5, 8, 4, 2, 1, and in general the sequence is determined by the iteration

n |-> If
n |-> If[EvenQ[n], n/2, 3 n + 1 ]

except that if n reaches 1 the tag system terminates, while the iteration keeps going.
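As a quick illustration of the iteration on its own (not the tag system), here it is run from n = 7 until it reaches 1:

NestWhileList[If[EvenQ[#], #/2, 3 # + 1] &, 7, # != 1 &]

(* {7, 22, 11, 34, 17, 52, 26, 13, 40, 20, 10, 5, 16, 8, 4, 2, 1} *)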

So if we ask what this tag system is “doing”, we can say it’s computing 3n + 1 problem iterations, and we can explicitly “see it doing the computation”. Here it’s starting with n = 7

ArrayPlot
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ArrayPlot[
 PadRight[TSEvolveList[{2, {1 -> {2, 3}, 2 -> {1}, 3 -> {1, 1, 1}}}, 
   Table[1, 7], 200]], Frame -> False, 
 ColorRules -> {3 -> White, 1 -> Hue[.03, .9, 1], 
   2 -> Hue[.7, .8, .5], 0 -> GrayLevel[.85]}]

and here it’s starting with successive values of n:

Row
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						Row[Table[
  ArrayPlot[
   PadRight[
    TSEvolveList[{2, {1 -> {2, 3}, 2 -> {1}, 3 -> {1, 1, 1}}}, 
     Table[1, k], 150], {150, Automatic}], 
   ImageSize -> {Automatic, 160}, Frame -> False, 
   ColorRules -> {3 -> White, 1 -> Hue[.03, .9, 1], 
     2 -> Hue[.7, .8, .5], 0 -> GrayLevel[.85]}], {k, 2, 21}], 
 Spacer[2]]

Does the tag system always eventually halt? This is exactly the 3n + 1 problem—which has been unsolved for the better part of a century.

It might seem remarkable that even such a simple tag system rule can in effect give us such a difficult mathematical problem. But the Principle of Computational Equivalence makes this seem much less surprising—and in fact it tells us that we should expect tag systems to quickly “ascend out of” the range of computations to which we can readily assign traditional mathematical interpretations.

Changing the rule to

First
First[#] -> Row[Last[#]] & /@ {1 -> {2, 3}, 2 -> {1, 1, 1}, 3 -> {1}}

yields instead the iteration

Row
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						Row[Table[
  ArrayPlot[
   PadRight[
    TSEvolveList[{2, {1 -> {2, 3}, 2 -> {1, 1, 1}, 3 -> {1}}}, 
     Table[1, k], 150], {150, Automatic}], 
   ImageSize -> {Automatic, 160}, Frame -> False, 
   ColorRules -> {3 -> White, 1 -> Hue[.03, .9, 1], 
     2 -> Hue[.7, .8, .5], 0 -> GrayLevel[.85]}], {k, 2, 21}], 
 Spacer[2]]

which again is “interpretable” as corresponding to the iteration:

n |-> If
n |-> If[EvenQ[n], 3 n/2, (n - 1)/2]

But what if we consider all possible rules, say with the very simple form 1 → __, 2 → ___? Here is what each of the 32 of these does starting from 1111:


For some of these we’ve been able to identify “traditional mathematical interpretations”, but for many we have not. And if we go even further and look at the very simplest nontrivial rules—of the form 1 → _, 2 → ___—here is what happens starting from a string of 10 1s:

Row
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						Row[ArrayPlot[
    PadRight[
     TSNEvolveList[{2, #}, Table[1, 10], 40], {40, Automatic}], 
    ImageSize -> {Automatic, 120}, Frame -> False, 
    ColorRules -> {3 -> White, 1 -> Hue[.03, .9, 1], 
      2 -> Hue[.7, .8, .5], 
      0 -> GrayLevel[.85]}] & /@ (TakeList[#, {1, 3}] & /@ 
    Tuples[{1, 2}, 4]), Spacer[1]]

One of these rules we already discussed above:

First
First[#] -> Row[Last[#]] & /@ {1 -> {2}, 2 -> {2, 2, 1}}

and we found that it seems to lead to infinite irregular growth (here shown “detrended” by (√2 − 1) t):

ListStepPlot
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ListStepPlot[
 MapIndexed[# - (Sqrt[2] - 1) First[#2] &, 
  TagLengthFunction[{2, {{1}, {1, 1, 0}}}][Table[0, 10], 
   10000]], Center, AspectRatio -> 1/4, Filling -> Axis, 
 Frame -> True, PlotStyle -> Hue[0.07, 1, 1]]

But even in the case of

First
First[#] -> Row[Last[#]] & /@ {1 -> {2}, 2 -> {1, 1, 1}}

which appears always to halt

Row
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						Row[Table[
  ArrayPlot[
   PadRight[
    TSNEvolveList[{2, {{2}, {1, 1, 1}}}, Table[1, k], 40], {40, 
     Automatic}], ImageSize -> {Automatic, 120}, Frame -> False, 
   ColorRules -> {3 -> White, 1 -> Hue[.03, .9, 1], 
     2 -> Hue[.7, .8, .5], 0 -> GrayLevel[.85]}], {k, 17}], Spacer[1]]

the differences between halting times with successive sizes of initial strings form a surprisingly complex sequence

ListStepPlot
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ListStepPlot[
 Differences[
  First /@ Table[
    Length /@ 
     FindTransientRepeat[
      TSNEvolveList[{2, {{2}, {1, 1, 1}}}, Table[1, k], 600], 3], {k, 
     150}]], Center, PlotRange -> {0, 21}, AspectRatio -> 1/5, 
 Filling -> Axis, Frame -> True, PlotStyle -> Hue[0.07, 1, 1]]

that does not seem to have any simple traditional mathematical interpretation. (By the way, in a case like this it’s perfectly possible that there will be some kind of “mathematical interpretation”— though it might be like the page of weird definitions that I found for halting times of Turing machine 600720 in A New Kind of Science.)

So Does It Always Halt?

When Emil Post was studying his tag system back in 1921, one of his big questions was: “Does it always halt?” Frustratingly enough, I must report that even a century later I still haven’t been able to answer this question.

Running Post’s tag system on my computer I’m able to work out what it does billions of times faster than Post could. And I’ve been able to look at billions of possible initial strings. And I’ve found that it can take a very long time—like half a trillion steps—for the system to halt:

Show
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						Show[LengthsPlotDecimal[{2, 264107671}, 28, 643158954877, 100000000], 
 FrameTicks -> {{Automatic, 
    None}, {Thread[{Range[0, 643000][[1 ;; -1 ;; 100000]], 
      Append[Range[0, 500][[1 ;; -1 ;; 100]], "600 billion"]}], 
    None}}]

But so far—even with all the computation I’ve done—I haven’t found a single example where it doesn’t eventually halt.

If we were doing ordinary natural science, billions of examples that all ultimately work the same would normally be far more than enough to convince us of something. But from studying the computational universe we know that this kind of “scientific inference” won’t always be correct. Gödel’s theorem from 1931 introduced the idea of undecidability (and it was sharpened by Turing machines, etc.). And that’s what can bite us in the computational universe.

Because one of the consequences of undecidability as we now understand it is that there can be questions where there may be no bound on how much computation will be needed to answer them. So this means that even if we have failed to see something in billions of examples that doesn’t mean it’s impossible; it may just be that we haven’t done enough computation to see it.

In practice it’s tended to be assumed, though, that undecidability is something rare and exotic, that one will only run into if one asks some kind of awkward—or “meta”—question. But my explorations in the computational universe—and in particular my Principle of Computational Equivalence—imply that this is not correct, and that instead undecidability is quite ubiquitous, and occurs essentially whenever a system can behave in ways that are not obviously simple.

And this means that—despite the simplicity of its construction—it’s actually to be expected that something like the 00, 1101 tag system could show undecidability, and so that questions about it could require arbitrary amounts of computational effort to answer. But there’s something of a catch. Because the way one normally proves the presence of undecidability is by proving computation universality. But at least in the usual way of thinking about computation universality, a universal system cannot always halt—since otherwise it wouldn’t be able to emulate systems that themselves don’t halt.

So with this connection between halting and computation universality, we have the conclusion that if the 00, 1101 tag system always halts it cannot be computation universal. So from our failure to find a non-halting example the most obvious conclusion might be that our tag system does in fact always halt, and is not universal.

And this could then be taken as evidence against the Principle of Computational Equivalence, or at least its application to this case. But I believe strongly enough in the Principle of Computational Equivalence that I would tend to draw the opposite conclusion: that actually the 00, 1101 tag system is universal, and won’t always halt, and it’s just that we haven’t gone far enough in investigating it to see a non-halting example yet.

But how far should we have to go? Undecidability says we can’t be sure. But we can still potentially use experience from studying other systems to get some sense. And this in fact tends to suggest that we might have to go a long way to get our first non-halting example.

We saw above an example of a cellular automaton in which unbounded growth (a rough analog of non-halting) does occur, but where we have to look through nearly 100,000 initial conditions before we find it. A New Kind of Science contains many other examples. And in number theory, it is quite routine to have Diophantine equations where the smallest solutions are very large.

How should we think about these kinds of things? In essence, we are taking computation universal systems and trying to “program them” (by setting up appropriate initial conditions) to have a particular form of behavior, say non-halting. But there is nothing to say these programs have to be short. Yes, non-halting might seem to us like a simple objective. And, yes, the universal system should in the end be able to achieve it. But given the particular components of the universal system, it may be complicated to get.

Let me offer two analogies. The first has to do with mathematical proofs. Having found the very simplest possible axiom system for Boolean algebra (((p·q)·r)·(p·((p·r)·p)) == r), we know that in principle we can prove any theorem in Boolean algebra. But even something like p·q=q·p—that might seem simple to us—can take hundreds of elaborate steps to prove given our particular axiom system.
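As a small side check (my own sketch, not part of the original argument), exhaustive truth-table enumeration confirms that this single axiom does hold when the dot operation is interpreted as Nand; actually deriving p·q = q·p from it as an equational proof is what takes the hundreds of steps (the built-in FindEquationalProof function can be used to generate such proofs):

And @@ Flatten[Table[Nand[Nand[Nand[p, q], r], Nand[p, Nand[Nand[p, r], p]]] === r, {p, {True, False}}, {q, {True, False}}, {r, {True, False}}]]

(* True *)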

As a more whimsical example, consider the process of self-reproduction. It seems simple enough to describe this objective, yet to achieve it, say with the components of molecular biology, may be complex. And maybe on the early Earth it was only because there were so many molecules, and so much time, that self-reproduction could ever be “discovered”.

One might think that, yes, it could be difficult to find something (like a non-halting initial condition, or a configuration with particular behavior in a cellular automaton) by pure search, but that it would still be possible to systematically “engineer” one. And indeed there may be ways to “engineer” initial conditions for the 00, 1101 tag system. But in general it is another consequence of the Principle of Computational Equivalence (and computational irreducibility) that there is no guarantee that there will be any “simple engineering path” to reach any particular capability.

By the way, one impression from looking at tag systems and many other kinds of systems is that as one increases the sizes of initial conditions, one crosses a sequence of thresholds for different behaviors. Only at size 14, for example, might some long “highway” in our tag system’s state transition graph appear. And then nothing longer might appear until size 17. Or some particular period of final cycle might only appear at size-15 initial conditions. It’s as if there’s a “minimum program length” needed to achieve a particular objective, in a particular system. And perhaps similarly there’s a minimum initial string length necessary to achieve non-halting in our tag system—that we just don’t happen to have reached yet. (I’ve done random searches in longer initial conditions, though, so we at least know it’s not common there.)

OK, but let’s try a different tack. Let’s ask what would be involved in proving that the tag system doesn’t always halt. We’re trying to prove essentially the following statement: “There exists an initial condition i such that for all steps t the tag system has not halted”. In the language of mathematical logic this is a ∃∀ statement, which puts it at the Σ₂ level in the arithmetic hierarchy.

One way to prove it is just explicitly to find a string whose evolution doesn’t halt. But how would one show that the evolution doesn’t halt? It might be obvious: there might for example just be something like a fixed block that is getting added in a simple cycle of some kind, as in:

ListStepPlot
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ListStepPlot[
   Length /@ 
    TSGDirectEvolveList[{2, 
      TakeList[IntegerDigits[#[[1]], 3, 6], {1, 2, 3}]}, 
     IntegerDigits[#[[2, 2]], 3, #[[2, 1]]], 100], Center, 
   PlotRange -> {{0, 100}, Automatic}, AspectRatio -> 1/3, 
   Filling -> Axis, Frame -> True, FrameTicks -> False, 
   PlotStyle -> Hue[0.07, 1, 1]] &[52 -> {3, 20}]

But it also might not be obvious. It could be like some of our examples above where there seems to be systematic growth, but where there are small fluctuations:

ListStepPlot
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ListStepPlot[
 TagLengthFunction[{2, {{1}, {1, 1, 0}}}][{1, 0}, 200], Center, 
 AspectRatio -> 1/3, Filling -> Axis, Frame -> True, 
 FrameTicks -> False, PlotStyle -> Hue[0.07, 1, 1]]

Will these fluctuations suddenly become big and lead the system to halt? Or will they always stay somehow small enough that that cannot happen? There are plenty of questions like this that arise in number theory. And sometimes (as, for example, with the Skewes number associated with the distribution of primes) there can be surprises, with very long-term trends getting reversed only in exceptionally large cases.

By the way, even identifying “halting” can be difficult, especially if (as we do for our tag system) we define “halting” to include going into a cycle. For example, we saw above a tag system that does cycle, but takes more than 18,000 steps to do so:

ListStepPlot
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ListStepPlot[
 Length /@ 
  TSGDirectEvolveList[{2, 
    TakeList[IntegerDigits[68, 3, 6], {1, 2, 3}]}, 
   IntegerDigits[6, 3, 2], 40000], Center, AspectRatio -> 1/5, 
 Filling -> Axis, Frame -> True, FrameTicks -> False, 
 PlotStyle -> Hue[0.07, 1, 1]]

Conversely, just because something takes a long time to halt doesn’t mean that it will be difficult to show this. For example, it is quite common to see Turing machines that take a huge number of steps to halt, but behave in basically systematic and predictable ways (this one takes 47,176,870 steps):


But to “explain why something halts” we might want to have something like a mathematical proof: a sequence of steps consistent with a certain set of axioms that derives the fact that the system halts. In effect the proof is a higher-level (“symbolic”) way of representing aspects of what the system is doing. Instead of looking at all the individual values at each step in the evolution of the system we’re just calling things x and y (or whatever) and deriving relationships between them at some kind of symbolic level.

And given a particular axiom system it may or may not be possible to construct this kind of symbolic proof of any given fact. It could be that the axiom system just doesn’t have the “derivational power” to represent faithfully enough what the system we are studying is doing.

So what does this mean for tag systems? It means, for example, that it could perfectly well be that a given tag system evolution doesn’t halt—but that we couldn’t prove that using, say, the axiom system of Peano Arithmetic.

And in fact as soon as we have a system that is computation universal it turns out that any finite axiom system must eventually fail to be able to give a finite proof of some fact about the system. We can think of the axioms as defining certain relations about the system. But computational irreducibility implies that eventually the system will be able to do things which cannot be “reduced” by any finite set of relations.

Peano Arithmetic contains as an axiom the statement that mathematical induction works, in the sense that if a statement s[0] is true, and s[n] implies s[n + 1], then any statement s[n] must be true. But it’s possible to come up with statements that entail for example nested collections of recursions that effectively grow too quickly for this axiom alone to be able to describe symbolically “in one go” what they can do.

If one uses a stronger axiom system, however, then one will be able to do this. And, for example, Zermelo–Fraenkel set theory—which allows not only ordinary induction but also transfinite induction—may succeed in being able to give a proof even when Peano Arithmetic fails.

But in the end any finitely specified axiom system will fail to be able to prove everything about a computationally irreducible system. Intuitively this is because making proofs is a form of computational reduction, and it is inevitable that this can only go so far. But more formally, one can imagine using a computational system to encode the possible steps that can be made with a given axiom system. Then one would construct a program in the computational system that would systematically enumerate all theorems in the axiom system. (It may be easier to think of first creating a multiway system in which each possible application of the axiom rules is made, and then “unrolling” the multiway system to be “run sequentially”.)
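Here is a toy sketch of that multiway idea (with purely illustrative rewrite rules of my own choosing), done breadth-first, using StringReplaceList to get every single-replacement successor of each string:

(* all strings reachable in one rewrite from any string in the current set *)
multiwayStep[rules_][strs_] := Union[Flatten[StringReplaceList[#, rules] & /@ strs]]

NestList[multiwayStep[{"A" -> "AB", "BB" -> "A"}], {"AB"}, 4]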

And for example we could set things up so that the computational system halts if it ever finds an inconsistency in the theorems derived from the axiom system. But then we know that we won’t be able to prove that the computational system does not halt from within the axiom system because (by Gödel’s second incompleteness theorem) no nontrivial axiom system can prove its own consistency.

So if we chose to work, say, purely within Peano Arithmetic, then it might be that Post’s original question is simply unanswerable. We might have no way to prove or disprove that his tag system always halts. To know that might require a finer level of analysis—or, in effect, a higher degree of reduction—than Peano Arithmetic can provide. (Picking a particular model of Peano Arithmetic would resolve the question, but to home in on a particular model can in effect require infinite computational effort.)

If we have a tag system that we know is universal then it’s inevitable that certain things about it will not be provable within Peano Arithmetic, or any other finitely specified axiom system. But for any given property of the system it may be very difficult to determine whether that property is provable within Peano Arithmetic.

The problem is similar to proving computation universality: in effect one has to see how to encode some specified structure within a particular formal system—and that can be arbitrarily difficult to do. So just as it may be very hard to prove that the 00, 1101 tag system is computation universal, it may also be very difficult to prove that some particular property of it is not “accessible” through Peano Arithmetic.

Could it be undecidable whether the 00, 1101 tag system always halts? And if we could prove this, would this actually have proved that it in fact doesn’t halt? Recall that above we mentioned that at least the obvious statement of the problem is at the Σ₂ level in the arithmetic hierarchy. And it turns out that statements at this level don’t have “default truth values”, so proving undecidability wouldn’t immediately give us a conclusion. But there’s nothing to say that some clever reformulation might not reduce the problem to Σ₁ or Π₁, at which point proving undecidability would lead to a definite conclusion.

(Something like this in fact happened with the Riemann Hypothesis. At first this seemed like a statement at a higher level, but it was reformulated as a Π₁ statement—and eventually reduced to the specific statement several sections above that a particular computation should not terminate. But now if the termination of this is proved undecidable, it must in fact not terminate, and the Riemann Hypothesis must be true.)

Can one prove undecidability without proving computation universality? There are in principle systems that show “intermediate degrees”: they exhibit undecidability but cannot directly be used to do universal computation (and Post was in fact the person who suggested that this might be possible). But actual examples of systems with intermediate degree still seem to involve having computation universality “inside”, with the input-output capabilities then limited so that the universality cannot be accessed for anything beyond making certain properties undecidable.

The most satisfying (and ultimately satisfactory) way to prove universality for the 00, 1101 tag system would simply be to construct a compiler that takes a specification of some other system that is known to support universality (say a particular known-to-be-universal tag system, or the set of all possible tag systems) and then turns this into an initial string for the 00, 1101 tag system. The tag system would then “run” the string, and generate something that could readily be “decoded” as the result of the original computation.

But there are ways one might imagine establishing what amounts to universality, that could be enough to prove halting properties, even though they might not be as “practical” as actual ways to do computations. (Yes, one could conceivably imagine a molecular-scale computer that works just like a tag system.)

In the current proofs of universality for the simplest cellular automata and Turing machines, for example, one assumes that their initial configurations contain “background” periodic patterns, with the specific input for a particular computation being a finite-size perturbation to this background. For a cellular automaton or Turing machine it seems fairly unremarkable to imagine such a background: even though it extends infinitely across the cells of the system it somehow does not seem to be adding more than a small amount of “new information” to the system.

But for a tag system it’s more complicated to imagine an infinite periodic “background”, because at every step the string the system is dealing with is finite. One could consider modifying the rules of the tag system so that, for example, there is some fixed background that acts as a “mask” every time the block of elements is added at the end of the string. (For example, the mask could flip the value of every element, relative to a fixed “coordinate system”.)

But with the original tag system rules the only way to have an infinite background seems to be to have an infinite string. But how could this work? The rules of the tag system add elements at the end of the string, and if the string is infinitely long, it will take an infinite number of steps before the values of these elements ever matter to the actual behavior of the system.

There is one slightly exotic possibility, however, which is to think about transfinite versions of the tag system. Imagine that the string in the tag system has a length given by a transfinite number, say the ordinal ω. Then it is perfectly meaningful in the context of transfinite arithmetic to imagine additional elements being added at positions ω + 1 etc. And if the tag system then runs for ω steps, its behavior can start to depend on these added elements.

And even though the strings themselves would be infinite, there can still be a finite (“symbolic”) way to describe the system. For example, there could be a function f[i] which defines the value of the element at position i. Then we can formally write down the rules for the tag system in terms of this function. And even though it would take an infinite time to explicitly generate the strings that are specified, it can still be possible to “reason” about what happens, just by doing symbolic operations on the function f.

Needless to say, the various issues I’ve discussed above about provability in particular axiom systems may come into play. But there may still be cases where definite results about computation universality could be established “symbolically” about transfinite tag systems. And conceivably such results could then be “projected down” to imply undecidability or other results about tag systems with finite initial strings.

Clearly the question of proving (or disproving) halting for the 00, 1101 tag system is a complicated one. We might be lucky, and be able to find with our computers (or conceivably engineer) an initial string that we can see doesn’t halt. Or we might be able to construct a symbolic representation in which we can carry out a proof.

But ultimately we are in a sense at the mercy of the Principle of Computational Equivalence. There is presumably computational irreducibility in the 00, 1101 tag system that we can’t systematically outrun.

Yes, the trace of the tag system seems to be a good approximation to a random walk. And, yes, as a random walk it will halt with probability 1. But in reality it’s not a “truly random” random walk; it’s a walk determined by a specific computational process. We can turn our questions about halting to questions about the randomness of the walk (and to do so may provide interesting connections with the foundations of probability theory). But in the end we’re back to the same issues, and we’re still confronted by computational irreducibility.
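As a toy comparison (this is just an unbiased ±1 walk, not the tag system), one can watch such a walk started at 10 run until it first drops to 2, something that happens with probability 1, though possibly only after a very long time (capped here at 10^5 steps):

walk = NestWhileList[# + RandomChoice[{-1, 1}] &, 10, # > 2 &, 1, 10^5];
ListStepPlot[walk, Center, Filling -> Axis, Frame -> True, AspectRatio -> 1/4]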

More about the History

Emil Post

Tag systems are simple enough that it’s conceivable they might have arisen in something like games even millennia ago. But for us tag systems—and particularly the specific 00, 1101 tag system we’ve mostly been studying—were the invention of Emil Post, in 1921.

Emil Post lived most of his life in New York City, though he was born (into a Jewish family) in 1897 in Augustow, Poland (then part of the Russian Empire). (And, yes, it’s truly remarkable how many of the notable contributors to mathematical logic in the early part of the 20th century were born to Jewish families in a fairly small region of what’s now eastern Poland and western Ukraine.)

As a child, Post seems to have at first wanted to be an astronomer, but having lost his left arm in a freak car-related street accident at age 12 he was told this was impractical—and turned instead to mathematics. Post went to a public high school for gifted students and then attended City College of New York, graduating with a bachelor’s degree in math in 1917. Perhaps presaging a lifelong interest in generalization, he wrote his first paper while in college (though it wasn’t published until 15+ years later), on the subject of fractional differentiation.

He enrolled in the math PhD program at Columbia, where he got involved in a seminar studying Whitehead and Russell’s recently published Principia Mathematica, run by Cassius Keyser, who was one of the early American mathematicians interested in the foundations of math (and who wrote many books on history and philosophy around mathematics; a typical example being his 1922 Mathematical Philosophy, a Study of Fate and Freedom). Early in graduate school, Post wrote a paper about functional equations for the gamma function (related to fractional differentiation), but soon he turned to logic, and his thesis—written in 1920—included early versions of what became his signature ideas.

Post’s main objective in his thesis was to simplify, streamline and further formalize Principia Mathematica. He started by looking at propositional calculus, and tried to “drill down” to find out more of what logic was really about. He invented truth tables (as several other people also independently did) and used them to prove completeness and consistency results. He investigated how different logic functions could be built up from one another through composition, classifying different elements of what’s now called the Post lattice. (He commented on Nand and an early simple axiom system for it—and might well have gone further with it if he’d known the minimal axiom system for Nand that I finally discovered in 2000. In another small-intellectual-world story, I realize now his lattice is also similar to my “cellular automaton emulation network”.) Going in the direction of “what’s logic really about” Post also considered multivalued logic, and algebraic structures around it.

Post published the core of his thesis in 1921 as “Introduction to a General Theory of Elementary Propositions”, but—in an unfortunate and recurring theme—didn’t publish the whole thing for another 20 years. But even in 1920 Post had what he called “generalization by postulation” and this quickly turned into the idea that all operations in Principia Mathematica (or mathematics in general) could ultimately be represented as transformations (“production rules”) on strings of characters.

When he finally ended up publishing this in 1943 he called the resulting formal structures “canonical systems”. And already by 1920 he’d discovered that not all possible production rules were needed; it was sufficient to have only ones in “normal form” g$ → $h, where $ is a “pattern variable”. (The idea of $ representing a pattern became common in early computer string-manipulation systems, and in fact I used it for expression patterns in my SMP system in 1979—probably without at the time knowing it came from Post.)

Post was close to the concept of universal computation, and the notion that anything (in his case, any string transformation) could be built up from a fixed set of primitives. And in 1920, in the effort to “reduce his primitives”, he came up with tag systems. At the time—11 years before Gödel’s theorem—Post and others still thought that it might somehow be possible to “solve mathematics” in some finite way. Post felt he had good evidence that Principia Mathematica could be reduced to string rewriting, so now he just had to solve that.

One basic question was how to tell when two strings should be considered equivalent under the string rewriting rules. And in formulating a simple case of this Post came up with tag systems. In particular, he wanted to determine whether the “iterative process [of tag] was terminating, periodic, or divergent”. And Post made “the problem of ‘tag’… the major project of [his] tenure of a Procter fellowship in mathematics at Princeton during the academic year 1920–21.”

Post later reported that a “major success of the project was the complete solution of the problem for all bases in which μ and ν were both 2”, though he stated that “even this special case… involved considerable labor”. But then, as he later wrote, “while considerable effort was expanded [sic] on the case μ = 2, ν > 2… little progress resulted… [with] such a simple basis as 000, 11101, ν = 3, proving intractable”. Post adds in a footnote: “Numerous initial sequences… tried [always] led… to termination or periodicity, usually the latter.” Then he added, reflecting our random walk observations, “It might be noted that an easily derived probability ‘prognostication’ suggested… that periodicity was to be expected.” (I’m curious how he could tell it should be periodicity rather than termination.)

But by the end of the summer of 1921, Post had concluded that “the solution of the general problem of ‘tag’ appeared hopeless, and with it [his] entire program of the solution of finiteness problems”. In other words, the seemingly simple problem of tag had derailed Post’s whole program of “solving mathematics”.

In 1920 Princeton had a top American mathematics department, and Post went there on a prestigious fellowship (recently endowed by the Procter of Procter & Gamble). But—like the problem of tag—things did not work out so well there for Post, and in 1921 he had the first of what would become a sequence of “runaway mind” manic episodes, in what appears to have been a cycle of what was then called manic depression.

It’s strange to think that the problem of tag might have “driven Post crazy”, and probably the timing of the onset of manic depression had more to do with his age—though Post later seems to have believed that the excitement of research could trigger manic episodes (which often involved talking intensely about streams of poorly connected ideas, like the “psychic ether” from which new ideas come, discovering a new star named “Post”, etc.) But in any case, in late 1921 Post—who had by then returned to Columbia—was institutionalized.

By 1924 he had recovered enough to take up an instructorship at Cornell, but then relapsed. Over the years that followed he supported himself by teaching high school in New York, but continued to have mental health issues. He married in 1929, had a daughter in 1932, and in 1935 finally became a professor at City College, where he remained for the rest of his life.

Post published nothing from the early 1920s until 1936. But in 1936—with Gödel’s theorem known, and Alonzo Church’s “An Unsolvable Problem of Elementary Number Theory” recently published—Post published a 3-page paper entitled “Finite Combinatory Processes—Formulation 1”. Post comes incredibly close to defining Turing machines (he talks about “workers” interacting with a potentially infinite sequence of “marked” and “unmarked boxes”). And he says that he “expects [his] formulation to be logically equivalent to recursiveness in the sense of the Gödel–Church development”, adding “Its purpose, however, is not only to present a system of a certain logical potency but also, in its restricted field, of psychological fidelity”. Post doesn’t get too specific, but he does make the comment (rather resonating with my own work, and particularly our Physics Project) that the hypothesis of global success of these formalisms would be “not so much… a definition or an axiom but… a natural law”.

In 1936 Post also published his longest-ever paper: 142 pages on what he called “polyadic groups”. It’s basically about abstract algebra, but in typical Post style, it’s a generalization, involving looking not at binary “multiplication” operations but for example ternary ones. It’s not been a popular topic, though, curiously, I also independently got interested in it in the 1990s, eventually discovering Post’s work on it.

By 1941 Post was publishing more, including several now-classic papers in mathematical logic, covering things like degrees of unsolvability, the unsolvability of the word problem for semigroups, and what’s now called the Post Correspondence Problem. He managed his time in a very precise way, following a grueling teaching schedule (with intense and precise lectures planned to the minute) and—apparently to maintain his psychological wellbeing—restricting his research activities to three specific hours each day (interspersed with walks). But by then he was a respected professor, and logic had become a more popular field, giving him more of an audience.

In 1943, largely summarizing his earlier work, Post published “Formal Reductions of the General Combinatorial Decision Problem”, and in it, the “problem of tag” makes its first published appearance:

Post’s problem of tag

Post notes that “the little progress made in [its] solution” makes it a “candidate for unsolvability”. (Notice the correction in Post’s handwriting from “intensely” to “intensively” in the copy of his paper reproduced in his collected works.)

Through all this, however, Post continued to struggle with mental illness. But by the time he reached the age of 50 in 1947 he began to improve, and even loosened up on his rigid schedule. But in 1954 depression was back, and after receiving electroshock therapy (which he thought had helped him in the past), he died of a heart attack at the age of 57.

His former undergraduate student, Martin Davis, eventually published Post’s “Absolutely Undecidable Problems”, subtitled “Account of an Anticipation”, which describes the arc of Post’s work—including more detail on the story of tag systems. And in hindsight we can see how close Post came to discovering Gödel’s theorem and inventing the idea of universal computation. If instead of turning away from the complexity he found in tag systems he had embraced and explored it, I suspect he would have discovered not only foundational ideas of the 1930s, but also some of what I found half a century later in my by-then-computer-assisted explorations of the computational universe.

When Post died, he left many unpublished notes. A considerable volume of them concern a major project he launched in 1938 that he planned to call “Creative Logic”. He seemed to feel that “extreme abstraction” as a way of exploring mathematics would give way to something in which it’s recognized that “processes of deduction are themselves essentially physical and hence subject to formulations in a physical science”. And, yes, there’s a strange resonance here with my own current efforts—informed by our Physics Project—to “physicalize” metamathematics. And perhaps I’ll discover that here too Post anticipated what was to come.

So what happened to tag systems? By the mid-1950s Post’s idea of string rewriting (“production systems”) was making its way into many things, notably both the development of generative grammars in linguistics, and formal specifications of early computer languages. But tag systems—which Post had mentioned only once in his published works, and then as a kind of aside—were still basically unknown.

Post had come to his string rewriting systems—much as Turing had come to his Turing machines—as a way to idealize the processes of mathematics. But by the 1950s there was increasing interest in using such abstract systems as a way to represent “general computations”, as well as brains. And one person drawn in this direction was Marvin Minsky. After a math PhD in 1954 at Princeton on what amounted to analog artificial neural networks, he started exploring more discrete systems, initially finite automata, essentially searching for the simplest elements that would support universal computation (and, he hoped, thinking-like behavior).

Near the end of the 1950s he looked at Turing machines—and in trying to find the simplest form of them that would be universal started looking at their correspondence with Post’s string rewriting systems. Marvin Minsky knew Martin Davis from their time together as graduate students at Princeton, and by 1958 Davis was fully launched in mathematical logic, with a recently published book entitled Computability and Unsolvability.

As Davis tells it now, Minsky phoned him about some unsolvability results he had about Post’s systems, asking if they were of interest. Davis told him about tag systems, and that Post had thought they might be universal. Minsky found that indeed they were, publishing the result in 1960 in “Recursive Unsolvability of Post’s Problem of ‘Tag’ and Other Topics in Theory of Turing Machines”.

Minsky had recently joined the faculty at MIT, but also had a position at MIT’s Lincoln Laboratory, where in working on computing for the Air Force there was a collaboration with IBM. And it was probably through this that Minsky met John Cocke, a lifelong computer designer (and general inventor) at IBM (who in later years was instrumental in the development of RISC architecture). The result was that in 1963 Minsky and Cocke published a paper entitled “Universality of Tag Systems with P=2” that dramatically simplified Minsky’s construction and showed (essentially by compiling to a Turing machine) that universality could be achieved with tag systems that delete only 2 elements at each step. (One might think of it as an ultimate RISC architecture.)

For several years, Minsky had been trying to find out what the simplest universal Turing machine might be, and in 1962 he used the results Cocke and he had about tag systems to construct a 7-state, 4-color universal machine. That machine remained the record holder for the simplest known universal Turing machine for more than 40 years, though finally now we know the very simplest possible universal machine: a 2,3 machine that I discovered and conjectured would be universal—and that was proved so by Alex Smith in 2007 (thereby winning a prize I offered).

But back in 1967, the visibility of tag systems got a big boost. Minsky wrote an influential book entitled Computation: Finite and Infinite Machines, and the last part of the book was devoted to “Symbol-Manipulation Systems and Computability”, with Post’s string rewriting systems a centerpiece.

But my favorite part of Minsky’s book was always the very last chapter: “Very Simple Bases for Computability”. And there on page 267 is Post’s tag system:

From Marvin Minsky’s “Very Simple Bases for Computability”

Minsky reports that “Post found this (00, 1101) problem ‘intractable’, and so did I, even with the help of a computer”. But then he adds, in a style very characteristic of the Marvin Minsky I knew for nearly 40 years: “Of course, unless one has a theory, one cannot expect much help from a computer (unless it has a theory)…” He goes on to say that “if the reader tries to study the behavior of 100100100100100100 without [the aid of a computer] he will be sorry”.

Well, I guess computers have gotten a lot faster since the early 1960s; for me now it’s trivial to determine that this case evolves to a 10-cycle after 47 steps:

ListStepPlot
CloudGet["https://www.wolframcloud.com/obj/sw-blog/PostTagSystem/Programs-01.wl"];
						ListStepPlot[
 Length /@ TSDirectEvolveList[Flatten[Table[{1, 0, 0}, 6]], 90], 
 Filling -> Axis, Frame -> True, AspectRatio -> 1/3, 
 PlotStyle -> Hue[0.07, 1, 1]]
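The same check can be made without the cloud-hosted package, using the tagEvolve sketch from earlier (Post’s rule: delete 3 elements, with 0 → 00 and 1 → 1101); the two lengths returned are the transient and the cycle:

evolution = tagEvolve[3, {0 -> {0, 0}, 1 -> {1, 1, 0, 1}}, Flatten[Table[{1, 0, 0}, 6]], 500];
Length /@ FindTransientRepeat[evolution, 3]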

(By the way, I recently asked Martin Davis if Post had ever run a tag system on a computer. He responded: “Goodness! When Post died von Neumann still thought that a dozen computers should suffice for America’s needs.  I guess I could have programmed [the tag system] for the [Institute for Advanced Study] computer, but it never occurred to me to do so.” Notably, in 1954 Davis did start programming logic theorem-proving algorithms on that computer.)

After their appearance in Minsky’s book, tag systems became “known”, but they hardly became famous, and only a very few papers appeared about them. In 1972, at least their name got some visibility, when Alan Cobham, a longtime IBMer then working on coding theory, published a paper entitled “Uniform Tag Sequences”. Yes, this was about tag systems, but now with just one element being deleted at each step, which meant there couldn’t really be any interaction between elements. The mathematics was much more tractable (this was one of several inventions of neighbor-independent substitution systems generating purely nested behavior), but it didn’t really say anything about Post’s “problem of tag”.

Actually, I’ve Been Here Before…

When I started working on A New Kind of Science in 1991 I wanted to explore the computational universe of simple programs as widely as I could—to find out just how general (or not) the surprising phenomena I’d seen in cellular automata in the 1980s actually were. And almost from the beginning in the table of contents for my chapter on “The World of Simple Programs”, nestled between substitution systems and register machines, were tag systems (I had actually first mentioned tag systems in a paper in 1985):

“The World of Simple Programs”

In the main text, I only spent two pages on them:

Tag systems in A New Kind of Science

And I did what I have done so many times for so many kinds of systems: I searched and found remarkably simple rules that generate complex behavior. And then on these pages I showed my favorite examples. (I generalized Post’s specific tag systems by allowing dependence on more than just the first element.)

Did I look at Post’s specific 00, 1101 system? A New Kind of Science includes the note:

Notes on tag systems

And, yes, it mentions Post’s 00, 1101 tag system, then comments that “at least for all the initial conditions up to length 28, the rule eventually just leads to behavior that repeats”. An innocuous-looking statement, in very small print, tucked at the back of my very big book. But like so many such statements in the book, there was quite a lot behind it. (By the way, “length 28” then is what I would consider [compressed] length 9 now.)

A quick search of my filesystem reveals (.ma is an earlier format for notebooks that, yes, we can still read over a third of a century later):

A quick search of my filesystem

I open one of the notebook files (and, yes, windows—and screens—were tiny in those days):

TagSystems2.nb

And there it is! Post’s 00, 1101 tag system, along with many others I was studying. And it seems I couldn’t let go of this; in 1994 I was running a standalone program to try to find infinitely growing cases. Here’s the output:

Output

So that’s where I got my statement about “up to size 28” (now size 9) from. I don’t know how long this took to run; “pyrethrum” was at the time the fastest computer at our company—with a newfangled 64-bit CPU (a DEC Alpha) running at the now-snail-sounding clock speed of 150 MHz.

My archives from the early 1990s record a fair amount of additional “traffic” about tag systems. Interactions with Marvin Minsky. Interactions with my then-research-assistant about what I ended up calling “cyclic tag systems” (I originally called them “cyclic substitution systems”).

For nearly 15 years there’s not much. That is, until June 25, 2007. It’s been my tradition since we started our Wolfram Summer School back in 2003 that on the first day I do a “live experiment”, and try to discover something. Well, that day I decided to look at tag systems. Here’s how I began:

LiveExperiment1-01.nb

Right there, it’s Post’s 00, 1101 system. And I think I took it further than I’d ever done before. Pretty soon I was finding “long survivors” (I even got one that lasted more than 200,000 steps):

LiveExperiment1-03.nb

I was drawing state transition graphs:

LiveExperiment1-02.nb

But I obviously decided that I couldn’t get further with the 00, 1101 system that day. So I turned to “variants” and quickly found the 2-element-deletion 1, 110 rule that I’ve described above.

I happened to write a piece about this particular live experiment (“Science: Live and in Public”), and right then I made a mental note: let me look at Post’s tag system again before its centenary, in 2021. So here we are….

The Path Forward

Emil Post didn’t manage to crack his 00, 1101 tag system back in 1921 with hand calculations. But we might imagine that a century later, with the equivalent of tens of billions of times more computational power, we’d be able to do it. But so far I haven’t managed it.

For Post, the failure to crack his system derailed his whole intellectual worldview. For me now, the failure to crack Post’s system in a sense just bolsters my worldview—providing yet more indication of the strength and ubiquity of computational irreducibility and the Principle of Computational Equivalence.

After spending several weeks throwing hundreds of modern computers and all sorts of computational methods at Post’s 00, 1101 tag system, what do we know? Here’s a summary:

  •   All initial strings up to (uncompressed) length 84 lead either to cycles or termination
  •   The time to termination or cycling can be as long as 643 billion steps
  •   The sequence of lengths of strings generated seems to always behave much like a random walk
  •   The sequences of 0s and 1s generated seem effectively random, apart from about 31% statistical redundancy
  •   Most cycles are in definite families, but there are also some sporadic ones

What’s missing here? Post wanted to know whether the system would halt, and so do we. But now the Principle of Computational Equivalence makes a definite prediction. It predicts that the system should be capable of universal computation. And this basically has the implication that the system can’t always halt: there has to be some initial string that will make it grow forever.

In natural science it’s standard for theories to make predictions that can be investigated by doing experiments in the physical world. But the kind of predictions that the Principle of Computational Equivalence makes are more general; they’re not just about particular systems in the natural world, but about all possible abstract systems, and in a sense all conceivable universes. But it’s still possible to do experiments about them, though the experiments are now not physical ones, but abstract ones, carried out in the computational universe of possible programs.

And with Post’s tag system we have an example of one particular such experiment: can we find non-halting behavior that will validate the prediction that the system can support universal computation? To do so would be another piece of evidence for the breadth of applicability of the Principle of Computational Equivalence.

But what’s going to be involved in doing it? Computational irreducibility tells us that we can’t know.

Traditional mathematical science has tended to make the assumption that once you know an abstract theory for something, then you can work out anything you want about it. But computational irreducibility shows that isn’t true. And in fact it shows how there are fundamental limitations to science that intrinsically arise from within science itself. And our difficulty in analyzing Post’s tag system is in a sense just an “in your face” example of how strong these limitations can be.

But the Principle of Computational Equivalence predicts that somewhere we’ll see non-halting behavior. It doesn’t tell us exactly what that behavior will be like, or how difficult it’ll be for us to interpret what we see. But it says that the “simple conclusion” of “always halting” shouldn’t continue forever.

I’ve so far done nearly a quintillion iterations of Post’s tag system in all. But that hasn’t been enough. I’ve been able to optimize the computations a bit. But fundamentally I’ve been left with what seems to be raw computational irreducibility. And to make progress I seem to need more time and more computers.

Will a million of today’s computers be enough? Will it take a billion? I don’t know. Maybe it requires a new level of computational speed. Maybe resolving the question requires more steps of computation than the physical universe has ever done. I don’t know for sure. But I’m optimistic that it’s within the current computational capabilities of the world to find that little string of bits for the tag system that will allow us to see more about the general Principle of Computational Equivalence and what it predicts.

In the future there will be ever more that we will want and need to explore in the computational universe. And in a sense the problem of tag is a dry run for the kinds of things that we will see more and more often. But with the distinction of a century of history it’s a good place to rally our efforts and learn more about what’s involved.

So far it’s only been my computers that have been working on this. But we’ll be setting things up so that anyone can join the project. I don’t know if it’ll get solved in a month, a year or a century. But with the Principle of Computational Equivalence as my guide I’m confident there’s something interesting to discover. And a century after Emil Post defined the problem I, for one, want to see it resolved.

Notes

The main tag-system-related functions used are in the Wolfram Function Repository, as TagSystemEvolve, TagSystemEvolveList, TagSystemConvert, CyclicTagSystemEvolveList.
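Since these live in the Wolfram Function Repository, they can presumably be called via ResourceFunction; the exact argument conventions of the repository versions may differ from the minimal definitions sketched below, so treat this as an illustrative, hypothetical call:

ResourceFunction["TagSystemEvolveList"][{1, 0, 0, 1, 0}, 4]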

A list of t steps in the evolution of the tag system from an (uncompressed) initial list init can be obtained with

TagSystemEvolveList[init_List, t_Integer] :=
 With[{ru = Dispatch[{{0, _, _, s___} -> {s, 0, 0},
      {1, _, _, s___} -> {s, 1, 1, 0, 1}}]},
  NestList[Replace[ru], init, t]]

or

TagSystemEvolveList[init_List, t_Integer] :=
 NestWhileList[
  Join[Drop[#, 3], {{0, 0}, {1, 1, 0, 1}}[[1 + First[#]]]] &,
  init, Length[#] >= 3 &, 1, t]

giving for example:

TagSystemEvolveList[{1, 0, 0, 1, 0}, 4]
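With either of the definitions above, this evaluates to the initial string followed by its four successive steps:

{{1, 0, 0, 1, 0}, {1, 0, 1, 1, 0, 1}, {1, 0, 1, 1, 1, 0, 1},
 {1, 1, 0, 1, 1, 1, 0, 1}, {1, 1, 1, 0, 1, 1, 1, 0, 1}}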

The list of lengths can be obtained from

TagSystemLengthList[init_List, t_Integer] :=
 Reap[NestWhile[(Sow[Length[#]]; #) &[
      Join[Drop[#, 3], {{0, 0}, {1, 1, 0, 1}}[[1 + First[#]]]]] &,
    init, Length[#] >= 3 &, 1, t]][[2, 1]]

giving for example:

TagSystemLengthList[{1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0}, 25]
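A convenient way to look at the resulting length sequence, here or for much longer runs, is simply to plot it (a minimal sketch using the TagSystemLengthList definition above):

(* plot the random-walk-like sequence of string lengths *)
ListLinePlot[TagSystemLengthList[{1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0}, 25]]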

The output from t steps of evolution can be obtained from:

TagSystemEvolve[init_List, t_Integer] :=
 NestWhile[Join[Drop[#, 3], {{0, 0}, {1, 1, 0, 1}}[[1 + First[#]]]] &,
  init, Length[#] >= 3 &, 1, t]

A version of this using a low-level queue data structure is:

TagSystemEvolve[init_List, t_Integer] :=
 Module[{q = CreateDataStructure["Queue"]},
  Scan[q["Push", #] &, init];
  Do[If[q["Length"] >= 3,
    (* append 00 or 1101 according to the popped first element... *)
    Scan[q["Push", #] &, If[q["Pop"] == 0, {0, 0}, {1, 1, 0, 1}]];
    (* ...then drop the two remaining elements of the leading block of three *)
    Do[q["Pop"], 2]], t];
  Normal[q]]
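Both versions of TagSystemEvolve should agree; for example, with the definitions above:

TagSystemEvolve[{1, 0, 0, 1, 0}, 4]  (* {1, 1, 1, 0, 1, 1, 1, 0, 1} *)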

The compressed {p, values} form of a tag system state can be obtained with

TagSystemCompress[list_] := {Mod[Length[list], 3], Take[list, 1 ;; -1 ;; 3]}

while an uncompressed form can be recovered with

TagSystemUncompress[{p_, list_}, pad_ : 0] :=
 Join[Riffle[list, Splice[{pad, pad}]],
  Table[pad, <|0 -> 2, 1 -> 0, 2 -> 1|>[p]]]
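For example, with these definitions the length-5 string used earlier should round-trip exactly (it happens that all the skipped positions contain the pad value 0):

TagSystemCompress[{1, 0, 0, 1, 0}]    (* {2, {1, 1}} *)
TagSystemUncompress[{2, {1, 1}}]      (* {1, 0, 0, 1, 0} *)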

Each step of evolution in compressed form is obtained from

TagSystemCompressedStep[{p_, {s_, r___}}] :=
 Apply[{#1, Join[{r}, #2]} &,
  <|{0, 0} -> {2, {0}}, {1, 0} -> {0, {}}, {2, 0} -> {1, {0}},
    {0, 1} -> {1, {1, 1}}, {1, 1} -> {2, {0}}, {2, 1} -> {0, {1}}|>[{p, s}]]

or:

TagSystemCompressedStep[list : {_Integer, _List}] :=
 Replace[list,
  {{0, {0, s___}} -> {2, {s, 0}}, {1, {0, s___}} -> {0, {s}},
   {2, {0, s___}} -> {1, {s, 0}}, {0, {1, s___}} -> {1, {s, 1, 1}},
   {1, {1, s___}} -> {2, {s, 0}}, {2, {1, s___}} -> {0, {s, 1}}}]
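As a quick consistency check (a sketch assuming the definitions above), a single compressed step should agree with compressing the result of a single uncompressed step, for initial strings of length at least 3:

(* should give True with the definitions above *)
With[{init = {1, 0, 0, 1, 0, 0, 1, 0, 0}},
 TagSystemCompressedStep[TagSystemCompress[init]] ===
  TagSystemCompress[TagSystemEvolve[init, 1]]]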

The largest-scale computations done here made use of further-optimized code (available in the Wolfram Function Repository), in which the state of the tag system is stored in a bit-packed array, with 8 updates being done at a time by having a table of results for all 256 cases and using the first byte of the bit-packed array to index into this. This approach routinely achieves a quarter billion updates per second on current hardware. (Larger update tables no longer fit in L1 cache and so typically do not help.)

As I’ve mentioned, there isn’t a particularly large literature on the specific behavior of tag systems. In 1963 Shigeru Watanabe described the basic families of cycles for Post’s 00, 1101 tag system (though he did not discover the “sporadic cases”). After A New Kind of Science in 2002, I’m aware of one extensive series of papers (partly using computer experiment methods) written by Liesbeth De Mol, following her 2007 PhD thesis. Carlos Martin (a student at the Wolfram Summer School) also wrote about probabilistic methods for predicting tag system evolution.

Thanks, etc.

Thanks to Max Piskunov and Mano Namuduri for help with tag system implementations, Ed Pegg for tag system analysis (and for joining me in some tag system “hunting expeditions”), Matthew Szudzik and Jonathan Gorard for clarifying metamathematical issues, and Catherine Wolfram for help on the theory of random walks. Thanks also to Martin Davis and Margaret Minsky for clarifying some historical issues (and Dana Scott for having also done so long ago).

You can help!

We’re in the process of setting up a distributed computing project to try to answer Emil Post’s 100-year-old tag system question. Let us know if you’d like to get involved….

What Is Consciousness? Some New Perspectives from Our Physics Project


What Is Consciousness?—Visual Summary

“What about Consciousness?”

For years I’ve batted it away. I’ll be talking about my discoveries in the computational universe, and computational irreducibility, and my Principle of Computational Equivalence, and people will ask “So what does this mean about consciousness?” And I’ll say “that’s a slippery topic”. And I’ll start talking about the sequence: life, intelligence, consciousness.

I’ll ask “What is the abstract definition of life?” We know about the case of life on Earth, with all its RNA and proteins and other implementation details. But how do we generalize? What is life generally? And I’ll argue that it’s really just computational sophistication, which the Principle of Computational Equivalence says happens all over the place. Then I’ll talk about intelligence. And I’ll argue it’s the same kind of thing. We know the case of human intelligence. But if we generalize, it’s just computational sophistication—and it’s ubiquitous. And so it’s perfectly reasonable to say that “the weather has a mind of its own”; it just happens to be a mind whose details and “purposes” aren’t aligned with our existing human experience.

I’ve always implicitly assumed that consciousness is just a continuation of the same story: something that, if thought about in enough generality, is just a feature of computational sophistication, and therefore quite ubiquitous. But from our Physics Project—and particularly from thinking about its implications for the foundations of quantum mechanics—I’ve begun to realize that at its core consciousness is actually something rather different. Yes, its implementation involves computational sophistication. But its essence is not so much about what can happen as about having ways to integrate what’s happening to make it somehow coherent and to allow what we might see as “definite thoughts” to be formed about it.

And rather than consciousness being somehow beyond “generalized intelligence” or general computational sophistication, I now instead see it as a kind of “step down”—as something associated with simplified descriptions of the universe based on using only bounded amounts of computation. At the outset, it’s not obvious that a notion of consciousness defined in this way could consistently exist in our universe. And indeed the possibility of it seems to be related to deep features of the formal system that underlies physics.

In the end, there’s a lot going on in the universe that’s in a sense “beyond consciousness”. But the core notion of consciousness is crucial to our whole way of seeing and describing the universe—and at a very fundamental level it’s what makes the universe seem to us to have the kinds of laws and behavior it does.

Consciousness is a topic that’s been discussed and debated for centuries. But the surprise to me is that with what we’ve learned from exploring the computational universe and especially from our recent Physics Project it seems there may be new perspectives to be had, which most significantly seem to have the potential to connect questions about consciousness to concrete, formal scientific ideas.

Inevitably the discussion of consciousness—and especially its connection to our new foundations of physics—is quite conceptually complex, and all I’ll try to do here is sketch some preliminary ideas. No doubt quite a bit of what I say can be connected to existing philosophical and other thinking, but so far I’ve only had a chance to explore the ideas themselves, and haven’t yet tried to study their historical context.

Observers and Their Physics

The universe in our models is full of sophisticated computation, all the way down. At the lowest level it’s just a giant collection of “atoms of space”, whose relationships are continually being updated according to a computational rule. And inevitably much of that process is computationally irreducible, in the sense that there’s no general way to “figure out what’s going to happen” except, in effect, by just running each step.

But given that, how come the universe doesn’t just seem to us arbitrarily complex and unpredictable? How come there’s order and regularity that we can perceive in it? There’s still plenty of computational irreducibility. But somehow there are also pockets of reducibility that we manage to leverage to form a simpler description of the world, that we can successfully and coherently make use of. And a fundamental discovery of our Physics Project is that the two great pillars of twentieth-century physics—general relativity and quantum mechanics—correspond precisely to two such pockets of reducibility.

There’s an immediate analog—that actually ends up being an example of the same fundamental computational phenomenon. Consider a gas, like air. Ultimately the gas consists of lots of molecules bouncing around in a complicated way that’s full of computational irreducibility. But it’s a central fact of statistical mechanics that if we look at the gas on a large scale, we can get a useful description of what it does just in terms of properties like temperature and pressure. And in effect this reflects a pocket of computational reducibility, that allows us to operate without engaging with all the computational irreducibility underneath.

How should we think about this? An idea that will generalize is that as “observers” of the gas, we’re conflating lots of different microscopic configurations of molecules, and just paying attention to overall aggregate properties. In the language of statistical mechanics, it’s effectively a story of “coarse graining”. But within our computational approach, there’s now a clear, computational way to characterize this. At the level of individual molecules there’s an irreducible computation happening. And to “understand what’s going on” the observer is doing a computation. But the crucial point is that if there’s a certain boundedness to that computation then this has immediate consequences for the effective behavior the observer will perceive. And in the case of something like a gas, it turns out to directly imply the Second Law of Thermodynamics.

In the past there’s been a certain amount of mystery around the origin and validity of the Second Law. But now we can see it as a consequence of the interplay between underlying computational irreducibility and the computational boundedness of observers. If the observer kept track of all the computationally irreducible motions of individual molecules, they wouldn’t see Second Law behavior. The Second Law depends on a pocket of computational reducibility that in effect emerges only when there’s a constraint on the observer that amounts to the requirement that the observer has a “coherent view” of what’s going on.

So what about physical space? The traditional view had been that space was something that could to a large extent just be described as a coherent mathematical object. But in our models of physics, space is actually made of an immense number of discrete elements whose pattern of interconnections evolves in a complex and computationally irreducible way. But it’s much like with the gas molecules. If an observer is going to form a coherent view of what’s going on, and if they have bounded computational capabilities, then this puts definite constraints on what behavior they will perceive. And it turns out that those constraints yield exactly relativity.

In other words, for the “atoms of space”, relativity is the result of the interplay between underlying computational irreducibility and the requirement that the observer has a coherent view of what’s going on.

It may be helpful to fill in a little more of the technical details. Our underlying theory basically says that each elementary element of space follows computational rules that will yield computationally irreducible behavior. But if that was all there was to it, the universe would seem like a completely incoherent place, with every part of it doing irreducibly unpredictable things.

But imagine there’s an observer who perceives coherence in the universe. And who, for example, views there as being a definite coherent notion of “space”. What can we say about such an observer? The first thing is that since our model is supposed to describe everything in the universe, it must in particular include our observer. The observer must be an embedded part of the system—made up of the same atoms of space, and following the same rules, as everything else.

And there’s an immediate consequence to this. From “inside” the system there are only certain things about the system that the observer can perceive. Let’s say, for example, that in the whole universe there’s only one point at which anything is updated at any given time, but that that “update point” zips around the universe (in “Turing machine style”), sometimes updating a piece of the observer, and sometimes updating something they were observing. If one traces through scenarios like this, one realizes that from “inside the system” the only thing the observer can ever perceive is causal relationships between events.

They can’t tell “specifically when” any given event happens; all they can tell is what event has to happen before what other one, or in other words, what the causal relationships between events are. And this is the beginning of what makes relativity inevitable in our models.

But there are two other pieces. If the observer is going to have a coherent description of “space” they can’t in effect be tracking each atom separately; they’ll have to fit them into some overall framework, say by assigning each of them particular “coordinates”, or, in the language of relativity, defining a “reference frame” that conflates many different points in space. But if the observer is computationally bounded, then this puts constraints on the structure of the reference frame: it can’t for example be so wild that it separately traces the computationally irreducible behavior of individual atoms of space.

But let’s say an observer has successfully picked some reference frame. What’s to say that as the universe evolves it’s still possible to consistently maintain that reference frame? Well, this relies on a fundamental property that we believe either directly or effectively defines the operation of our universe: what we call “causal invariance”. The underlying rules just describe possible ways that the connections between atoms of space can be updated. But causal invariance implies that whatever actual sequence of updatings is used, there must always be the same graph of causal relationships.

And it’s this that gives observers the ability to pick different reference frames, and still have the same consistent and coherent perception of the behavior of the universe. And in the end, we have a definite result: that if there’s underlying computational irreducibility—plus causal invariance—then any observer who forms their perception of the universe in a computationally bounded way must inevitably perceive the universe to follow the laws of general relativity.

But—much like with the Second Law—this conclusion relies on having an observer who forms a coherent perception of the universe. If the observer could separately track every atom of space they wouldn’t “see general relativity”; that only emerges for an observer who forms a coherent perception of the universe.

The Quantum Observer

OK, so what about quantum mechanics? How does that relate to observers? The story is actually surprisingly similar to both the Second Law and general relativity: quantum mechanics is again something that emerges as a result of trying to form a coherent perception of the universe.

In ordinary classical physics one considers everything that happens in the universe to happen in a definite way, in effect defining a single thread of history. But the essence of quantum mechanics is that actually there are many threads of history that are followed. And an important feature of our models is that this is inevitable.

The underlying rules define how local patterns of connections between atoms of space should be updated. But in the hypergraph of connections that represents the universe there will in general be many different places where the rules can be applied. And if we trace all the possibilities we get a multiway graph that includes many possible threads of history, sometimes branching and sometimes merging.

So how will an observer perceive all this? The crucial point is that the observer is themselves part of this multiway system. So in other words, if the universe is branching, so is the observer. And in essence the question becomes how a “branching brain” will perceive a branching universe.

It’s fairly easy to imagine how an observer who is “spatially large” compared to individual molecules in a gas—or atoms of space—could conflate their view of these elements so as to perceive only some aggregate property. Well, it seems like very much the same kind of thing is going on with observers in quantum mechanics. It’s just that instead of being extended in physical space, they’re extended in what we call branchial space.

Consider a multiway graph representing possible histories for a system. Now imagine slicing through this graph at a particular level that in effect corresponds to a particular time. In that slice there will be a certain set of nodes of the multiway graph, representing possible states of the system. And the structure of the multiway graph then defines relationships between these states (say through common ancestry). And in a large-scale limit we can say that the states are laid out in branchial space.
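As a toy illustration of this kind of slicing (not the actual Physics Project rules), one can look at a simple multiway string-substitution system, assuming the MultiwaySystem function from the Wolfram Function Repository and its "StatesGraph" and "BranchialGraph" properties:

(* the graph of all states reached in 4 steps, with edges showing which state leads to which *)
ResourceFunction["MultiwaySystem"][{"A" -> "AB", "B" -> "A"}, {"A"}, 4, "StatesGraph"]

(* the branchial graph: states in the final slice connected when they share recent common ancestors *)
ResourceFunction["MultiwaySystem"][{"A" -> "AB", "B" -> "A"}, {"A"}, 4, "BranchialGraph"]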

In the language of quantum mechanics, the geometry of branchial space in effect defines a map of entanglements between quantum states, and coordinates in branchial space are like phases of quantum amplitudes. In the evolution of a quantum system, one might start from a certain bundle of quantum states, then follow their threads of history, looking at where in branchial space they go.

But what would a quantum observer perceive about this? Even if they didn’t start that way, over time a quantum observer will inevitably become spread out in branchial space. And so they’ll always end up sampling a whole region in branchial space, or a whole bundle of “threads of history” in the multiway graph.

What will they make of them? If they considered each of them separately no coherent picture would emerge, not least since the underlying evolution of individual threads of history can be expected to be computationally irreducible. But what if the observer just defines their way of viewing things to be one that systematically organizes different threads of history, say by conflating “computationally nearby” ones? It’s similar to setting up a reference frame in relativity, except that now the coherent representation that this “quantum frame” defines is of branchial space rather than physical space.

But what will this coherent representation be like? Well, it seems to be exactly quantum mechanics as it was developed over the past century. In other words, just like general relativity emerges as an aggregate description of physical space formed by a computationally bounded observer, so quantum mechanics emerges as an aggregate description of branchial space.

Does the observer “create” the quantum mechanics? In some sense, yes. Just as in the spacetime case, the multiway graph has all sorts of computationally irreducible things going on. But if there’s an observer with a coherent description of what’s going on, then their description must follow the laws of quantum mechanics. Of course, there are lots of other things going on too—but they don’t fit into this coherent description.

OK, but let’s say that we have an observer who’s set up a quantum frame that conflates different threads of history to get a coherent description of what’s going on. How will their description correlate with what another observer—with a different quantum frame—would perceive? In the traditional formalism of quantum mechanics it’s always been difficult to explain why different observers—making different measurements—still fundamentally perceive the universe to be working the same.

In our model, there’s a clear answer: just like in the spacetime case, if the underlying rules show causal invariance, then regardless of the frame one uses, the basic perceived behavior will always be the same. Or, in other words, causal invariance guarantees the consistency of the behavior deduced by different observers.

There are many technical details to this. The traditional formalism of quantum mechanics has two separate parts. First, the time evolution of quantum amplitudes, and second, the process of measurement. In our models, there’s a very beautiful correspondence between the phenomenon of motion in space and the evolution of quantum amplitudes. In essence, both are associated with the deflection of (geodesic) paths by the presence of energy-momentum. But in the case of motion this deflection (that we identify as the effect of gravity) happens in physical space, while in the quantum case the deflection (that we identify as the phase change specified by the path integral) happens in branchial space. (In other words, the Feynman path integral is basically just the direct analog in branchial space of the Einstein equations in physical space.)

OK, so what about quantum measurement? Doing a quantum measurement involves somehow taking many threads of history (corresponding to a superposition of many quantum states) and effectively reducing them to a single thread that coherently represents the “outcome”. A quantum frame defines a way to do this—in effect specifying the pattern of threads of history that should be conflated. In and of itself, a quantum frame—like a relativistic reference frame—isn’t a physical thing; it just defines a way of describing what’s going on.

But as a way of probing possible coherent representations that an observer can form, one can consider what happens if one formally conflates things according to a particular quantum frame. In an analogy where the multiway graph defines inferences between propositions in a formal system, conflating things is like “performing certain completions”. And each completion is then like an elementary step in the act of measurement. And by looking at the effect of all necessary completions one gets the “Completion Interpretation of Quantum Mechanics” suggested by Jonathan Gorard.

Assuming that the underlying rule for the universe ultimately shows causal invariance, doing these completions is never fundamentally necessary, because different threads of history will always eventually give the same results for what can be perceived within the system. But if we want to get a “possible snapshot” of what the system is doing, we can pick a quantum frame and formally do the completions it defines.

Doing this doesn’t actually “change the system” in a way that we would “see from outside”. It’s only that we’re in effect “doing a formal projection” to see how things would be perceived by an observer who’s picked a particular quantum frame. And if the observer is going to have a coherent perception of what’s going on, they in effect have to have picked some specific quantum frame. But then from the “point of view of the observer” the completions associated with that frame in some sense “seem real” because they’re the way the observer is accessing what’s going on.

Or, in other words, the way a computationally bounded “branching brain” can have a coherent perception of a “branching universe” is by looking at things in terms of quantum frames and completions, and effectively picking off a computationally reducible slice of the whole computationally irreducible evolution of the universe—where it then turns out that the slice must necessarily follow the laws of quantum mechanics.

So, once again, for a computationally bounded observer to get a coherent perception of the universe—with all its underlying computational irreducibility—there’s a strong constraint on what that perception can be. And what we’ve discovered is that it turns out to basically have to follow the two great core theories of twentieth-century physics: general relativity and quantum mechanics.

It’s not immediately obvious that there has to be any way to get a coherent perception of the universe. But what we now know is that if there is, it essentially forces specific major results about physics. And, of course, if there wasn’t any way to get a coherent perception of the universe there wouldn’t really be systematic overall laws, or, for that matter, anything like physics, or science as we know it.

So, What Is Consciousness?

What’s special about the way we humans experience the world? At some level, the very fact that we even have a notion of “experiencing” it at all is special. The world is doing what it does, with all sorts of computational irreducibility. But somehow even with the computationally bounded resources of our brains (or minds) we’re able to form some kind of coherent model of what’s going on, so that, in a sense, we’re able to meaningfully “form coherent thoughts” about the universe. And just as we can form coherent thoughts about the universe, so also we can form coherent thoughts about that small part of the universe that corresponds to our brains—or to the computations that represent the operation of our minds.

But what does it mean to say that we “form coherent thoughts”? There’s a general notion of computation, which the Principle of Computational Equivalence tells us is quite ubiquitous. But it seems that what it means to “form coherent thoughts” is that computations are being “concentrated down” to the point where a coherent stream of “definite thoughts” can be identified in them.

At the outset it’s certainly not obvious that our brains—with their billions of neurons operating in parallel—should achieve anything like this. But in fact it seems that our brains have a quite specific neural architecture—presumably produced by biological evolution—that in effect attempts to “integrate and sequentialize” everything. In our cortex we bring together sensory data we collect, then process it with a definite thread of attention. And indeed in medical settings, observed deficits in this integration and sequentialization are what are normally used to identify reduced levels of consciousness. There may still be neurons firing, but without integration and sequentialization there doesn’t really seem to be what we normally consider consciousness.

These are biological details. But they seem to point to a fundamental feature of consciousness. Consciousness is not about the general computation that brains—or, for that matter, many other things—can do. It’s about the particular feature of our brains that causes us to have a coherent thread of experience.

But what we have now realized is that the notion of having a coherent thread of experience has deep consequences that far transcend the details of brains or biology. Because in particular what we’ve seen is that it defines the laws of physics, or at least what we consider the laws of physics to be.

Consciousness—like intelligence—is something of which we only have a clear sense in the single case of humans. But just as we’ve seen that the notion of intelligence can be generalized to the notion of arbitrary sophisticated computation, so now it seems that the notion of consciousness can be generalized to the notion of forming a coherent thread of representation for computations.

Operationally, there’s potentially a rather straightforward way to think about this, though it depends on our recent understanding of the concept of time. In the past, time in fundamental physics was usually viewed as being another dimension, much like space. But in our models of fundamental physics, time is something quite different from space. Space corresponds to the hypergraph of connections between the elements that we can consider as “atoms of space”. But time is instead associated with the inexorable and irreducible computational process of repeatedly updating these connections in all possible ways.

There are definite causal relationships between these updating events (ultimately defined by the multiway causal graph), but one can think of many of the events as happening “in parallel” in different parts of space or on different threads of history. But this kind of parallelism is in a sense antithetical to the concept of a coherent thread of experience.

And as we’ve discussed above, the formalism of physics—whether reference frames in relativity or quantum mechanics—is specifically set up to conflate things to the point where there is a single thread of evolution in time.

So one way to think about this is that we’re setting things up so we only have to do sequential computation, like a Turing machine. We don’t have multiple elements getting updated in parallel like in a cellular automaton, and we don’t have multiple threads of history like in a multiway (or nondeterministic) Turing machine.

The operation of the universe may be fundamentally parallel, but our “parsing” and “experience” of it is somehow sequential. As we’ve discussed above, it’s not obvious that such a “sequentialization” would be consistent. But if it’s done with frames and so on, the interplay between causal invariance and underlying computational irreducibility ensures that it will be—and that the behavior of the universe that we’ll perceive will follow the core features of twentieth-century physics, namely general relativity and quantum mechanics.

But do we really “sequentialize” everything? Experience with artificial neural networks seems to give us a fairly good sense of the basic operation of brains. And, yes, something like initial processing of visual scenes is definitely handled in parallel. But the closer we get to things we might realistically describe as “thoughts” the more sequential things seem to get. And a notable feature is that what seems to be our richest way to communicate thoughts, namely language, is decidedly sequential.

When people talk about consciousness, something often mentioned is “self-awareness” or the ability to “think about one’s own processes of thinking”. Without the conceptual framework of computation, this might seem quite mysterious. But the idea of universal computation instead makes it seem almost inevitable. The whole point of a universal computer is that it can be made to emulate any computational system—even itself. And that is why, for example, we can write the evaluator for Wolfram Language in Wolfram Language itself.

The Principle of Computational Equivalence implies that universal computation is ubiquitous, and that both brains and minds, as well as the universe at large, have it. Yes, the emulated version of something will usually take more time to execute than the original. But the point is that the emulation is possible.

But consider a mind in effect thinking about itself. When a mind thinks about the world at large, its process of perception involves essentially making a model of what’s out there (and, as we’ve discussed, typically a sequentialized one). So when the mind thinks about itself, it will again make a model. Our experiences may start by making models of the “outside world”. But then we’ll recursively make models of the models we make, perhaps barely distinguishing between “raw material” that comes from “inside” and “outside”.

The connection between sequentialization and consciousness gives one a way to understand why there can be different consciousnesses, say associated with different people, that have different “experiences”. Essentially it’s just that one can pick different frames and so on that lead to different “sequentialized” accounts of what’s going on.

Why should they end up eventually being consistent, and eventually agreeing on an objective reality? Essentially for the same reason that relativity works, namely that causal invariance implies that whatever frame one picks, the causal graph that’s eventually traced out is always the same.

If it wasn’t for all the interactions continually going on in the universe, there’d be no reason for the experience of different consciousnesses to get aligned. But the interactions—with their underlying computational irreducibility and overall causal invariance—lead to the consistency that’s needed, and, as we’ve discussed, something else too: particular effective laws of physics, that turn out to be just the relativity and quantum mechanics we know.

Other Consciousnesses

The view of consciousness that we’ve discussed is in a sense focused on the primacy of time: it’s about reducing the “parallelism” associated with space—and branchial space—to allow the formation of a coherent thread of experience that in effect occurs sequentially in time.

And it’s undoubtedly no coincidence that we humans are in effect well placed in the universe to be able to do this. In large part this has to do with the physical sizes of things—and with the (undoubtedly not coincidental) fact that human scales are intermediate between those at which the effects of either relativity or quantum mechanics become extreme.

Why can we “ignore space” to the point where we can just discuss things happening “wherever” at a sequence of moments in time? Basically it’s because the speed of light is large compared to human scales. In our everyday lives the important parts of our visual environment tend to be at most tens of meters away—so it takes light only tens of nanoseconds to reach us. Yet our brains process information on timescales measured in milliseconds. And this means that as far as our experience is concerned, we can just “combine together” things at different places in space, and consider a sequence of instantaneous states in time.

If we were the size of planets, though, this would no longer work. Because—assuming our brains still ran at the same speed—we’d inevitably end up with a fragmented visual experience that we wouldn’t be able to think about as a single thread about which we can say “this happened, then that happened”.

Even at standard human scale, we’d have somewhat the same experience if we used for example smell as our source of information about the world (as, say, dogs to a large extent do). Because in effect the “speed of smell” is quite slow compared to brain processing. And this would make it much less useful to identify our usual notion of “space” as a coherent concept. So instead we might invent some “other physics”, perhaps labeling things in terms of the paths of air currents that deliver smells to us, then inventing some elaborate gauge-field-like construct to talk about the relations between different paths.

In thinking about our “place in the universe” there’s also another important effect: our brains are small and slow enough that they’re not limited by the speed of light, which is why it’s possible for them to “form coherent thoughts” in the first place. If our brains were the size of planets, it would necessarily take far longer than milliseconds to “come to equilibrium”, so if we insisted on operating on those timescales there’d be no way—at least “from the outside”—to ensure a consistent thread of experience.

From “inside”, though, a planet-size brain might simply assume that it has a consistent thread of experience. And in doing this it would in a sense try to force a different physics on the universe. Would it work? Based on what we currently know, not without at least significantly changing the notions of space and time that we use.

By the way, the situation would be even more extreme if different parts of a brain were separated by permanent event horizons. And it seems as if the only way to maintain a consistent thread of experience in this case would be in effect to “freeze experience” before the event horizons formed.

What if we and our brains were much smaller than they actually are? As it is, our brains may contain perhaps 10^300 atoms of space. But what if they contained, say, only a few hundred? Probably it would be hard to avoid computational irreducibility—and we’d never even be able to imagine that there were overall laws, or generally predictable features of the universe, and we’d never be able to build up the kind of coherent experience needed for our view of consciousness.

What about our extent in branchial space? In effect, our perception that “definite things happen even despite quantum mechanics” implies a conflation of the different threads of history that exist in the region of branchial space that we occupy. But how much effect does this have on the rest of the universe? It’s much like the story with the speed of light, except now what’s relevant is a new quantity that appears in our models: the maximum entanglement speed. And somehow this is large enough that over “everyday scales” in branchial space it’s adequate for us just to pick a quantum frame and treat it as something that can be considered to have a definite state at any given instant in time—so that we can indeed consistently maintain a “single thread of experience”.

OK, so now we have a sense of why with our particular human scale and characteristics our view of consciousness might be possible. But where else might consciousness be possible?

It’s a tricky and challenging thing to ask. To achieve our view of consciousness we need to be able to build up something that “viewed from the inside” represents a coherent thread of experience. But the issue is that we’re in effect “on the outside”. We know about our human thread of experience. And we know about the physics that effectively follows from it. And we can ask how we might experience that if, for example, our sensory systems were different. But to truly “get inside” we have to be able to imagine something very alien. Not only different sensory data and different “patterns of thinking”, but also different implied physics.

An obvious place to start in thinking about “other consciousnesses” is with animals and other organisms. But immediately we have the issue of communication. And it’s a fundamental one. Perhaps one day there’ll be ways for various animals to fluidly express themselves through something like human-relatable videogames. But as of now we have surprisingly little idea how animals “think about things”, and, for example, what their experience of the world is.

We can guess that there will be many differences from ours. At the simplest level, there are organisms that use different sensory modalities to probe the world, whether those be smell, sound, electrical, thermal, pressure, or other. There are “hive mind” organisms, where whatever integrated experience of the world there may be is built up through slow communication between different individuals. There are organisms like plants, which are (quite literally) rooted to one place in space. There are also things like viruses where anything akin to an “integrated thread of experience” can presumably only emerge at the level of something like the progress of an epidemic.

Meanwhile, even in us, there are things like the immune system, which in effect have some kind of “thread of experience” though with rather different input and output than our brains. Even if it seems bizarre to attribute something like consciousness to the immune system, it is interesting to try to imagine what its “implied physics” would be.

One can go even further afield, and think about things like the complete tree of life on Earth, or, for that matter, the geological history of the Earth, or the weather. But how can these have anything like consciousness? The Principle of Computational Equivalence implies that all of them have just the same fundamental computational sophistication as our brains. But, as we have discussed, consciousness seems to require something else as well: a kind of coherent integration and sequentialization.

Take the weather as an example. Yes, there is lots of computational sophistication in the patterns of fluid flow in the atmosphere. But—like fundamental processes in physics—it seems to be happening all over the place, with nothing, it seems, to define anything like a coherent thread of experience.

Coming a little closer to home, we can consider software and AI systems. One might expect that to “achieve consciousness” one would have to go further than ever before and inject some special “human-like spark”. But I suspect that the true story is rather different. If one wants the systems to make the richest use of what the computational universe has to offer, then they should behave a bit like fundamental physics (or nature in general), with all sorts of components and all sorts of computationally irreducible behavior.

But to have something like our view of consciousness requires taking a step down, and effectively forcing simpler behavior in which things are integrated to produce a “sequentialized” experience. And in the end, it may not be that different from picking out of the computational universe of possibilities just what can be expressed in a definite computational language of the kind the Wolfram Language provides.

Again we can ask about the “implied physics” of such a setup. But since the Wolfram Language is modeled on picking out the computational essence of human thinking it’s basically inevitable that its implied physics will be largely the same as the ordinary physics that is derived from ordinary human thinking.

One feature of having a fundamental model for physics is that it “reduces physics to mathematics”, in the sense that it provides a purely formal system that describes the universe. So this raises the question of whether one can think about consciousness in a formal system, like mathematics.

For example, imagine a formal analog of the universe constructed by applying axioms of mathematics. One would build up an elaborate network of theorems that in effect populate “metamathematical space”. This setup leads to some fascinating analogies between physics and metamathematics. The notion of time effectively remains as always, but here represents the progressive proving of new mathematical theorems.

The analog of our spatial hypergraph is a structure that represents all theorems proved up to a given time. (And there’s also an analog of the multiway graph that yields quantum mechanics, but in which different paths now in effect represent different possible proofs of a theorem.) So what about things like reference frames?

Well, just as in physics, a reference frame is something associated with an observer. But here the observer is observing not physical space, but metamathematical space. And in a sense any given observer is “discovering mathematics in a particular order”. It could be that all the different “points in metamathematical space” (i.e. theorems) are behaving in completely incoherent—and computationally irreducible—ways. But just as in physics, it seems that there’s a certain computational reducibility: causal invariance implies that different reference frames will in a sense ultimately always “see the same mathematics”.

There’s an analog of the speed of light: the speed at which a new theorem can affect theorems that are progressively further away in metamathematical space. And relativistic invariance then becomes the statement that “there’s only one mathematics”—but it can just be explored in different ways.

How does this relate to “mathematical consciousness”? The whole idea of setting up reference frames in effect relies on the notion that one can “sequentialize metamathematical space”. And this in turn relies on a notion of “mathematical perception”. The situation is a bit like in physics. But now one has a formalized mathematician whose mind stretches over a certain region of metamathematical space.

In current formalized approaches to mathematics, a typical “human-scale mathematical theorem” might correspond to perhaps 10^5 lowest-level mathematical propositions. Meanwhile, the “mathematician” might “integrate into their experience” some small fraction of the metamathematical universe (which, for human mathematics, is currently perhaps 3 × 10^6 theorems). And it’s this setup—which amounts to defining a “sequentialized mathematical consciousness”—that means it makes sense to do analysis using reference frames, etc.

So, just as in physics it’s ultimately the characteristics of our consciousness that lead to the physics we attribute to the universe, something similar seems to happen in mathematics.

Clearly we’ve now reached a quite high level of abstraction, so perhaps it’s worth mentioning one more wrinkle that involves an even higher level of abstraction.

We’ve talked about applying a rule to update the abstract structure that represents the universe. And we’ve discussed the fact that the rule can be applied at different places, and on different threads of history. But there’s another freedom: we don’t have to consider a specific rule; we can consider all possible rules.

The result is a rulial multiway graph of possible states of the universe. On different paths, different specific rules are followed. And if you slice across the graph you can get a map of states laid out in rulial space, with different positions corresponding to the outcomes of applying different rules to the universe.

An important fact is then that at the level of the rulial multiway graph there is always causal invariance. So this means that different “rulial reference frames” must always ultimately give equivalent results. Or, in other words, even if one attributes the evolution of the universe to different rules, there is always fundamental equivalence in the results.

In a sense, this can be viewed as a reflection of the Principle of Computational Equivalence and the fundamental idea that the universe is computational. In essence it is saying that since whatever rules one uses to “construct the universe” are almost inevitably computation universal, one can always use them to emulate any other rules.

How does this relate to consciousness? Well, one feature of different rulial reference frames is that they can lead to utterly and incoherently different basic descriptions of the universe.

One of them could be our hypergraph-rewriting-based setup, with a representation of space that corresponds well with what emerged in twentieth-century physics. But another could be a Turing machine, in which one views the updating of the universe as being done by a single head zipping around to different places.

We’ve talked about some possible systems in which consciousness could occur. But one we haven’t yet mentioned—but which has often been considered—is “extraterrestrial intelligences”. Before our Physics Project one might reasonably have assumed that even if there was little else in common with such “alien intelligences”, at least they would be “experiencing the same physics”.

But it’s now clear that this absolutely does not need to be the case. An alien intelligence could perfectly well be experiencing the universe in a different rulial reference frame, utterly incoherent with the one we use.

Is there anything “sequentializable” in a different rulial reference frame? Presumably it’s possible to find at least something sequentializable in any rulial reference frame. But the question of whether the alien intelligence can be thought of as sampling it is a quite different one.

Does there need to be a “sequentializable consciousness” to imply “meaningful laws of physics”? Presumably meaningful laws have to somehow be associated with computational reducibility; certainly that would be true if they were going to be useful to a “computationally bounded” alien intelligence.

But it’s undoubtedly the case that “sequentializability” is not the only way to access computational reducibility. In a mathematical analogy, using sequentializability is a bit like using ordinary mathematical induction. But there are other axiomatic setups (like transfinite induction) that define other ways to do things like prove theorems.

Yes, human-like consciousness might involve sequentializability. But if the general idea of consciousness is to have a way of “experiencing the universe” that accesses computational reducibility then there are no doubt other ways. It’s a kind of “second-order alienness”: in addition to using a different rulial reference frame, it’s using a different scheme for accessing reducibility. And the implied physics of such a setup is likely to be very different from anything we currently think of as physics.

Could we ever expect to identify what some of these “alien possibilities” are? The Principle of Computational Equivalence at least implies that we can in principle expect to be able to set up any possible computational rule. But if we start doing experiments we can’t have an expectation that scientific induction will work, and it is potentially arbitrarily difficult to identify computational reducibility. Yes, we might recognize some form of prediction or regularity that we are familiar with. But to recognize an arbitrary form of computational reducibility in effect relies on some analog of a definition of consciousness, which is what we were looking for in the first place.

What Now?

Consciousness is a difficult topic that has vexed philosophers and others for centuries. But with what we know now from our Physics Project it at least seems possible to cast it in a new light, much more closely connected to the traditions of formal science. And although I haven’t done it here, I fully anticipate that it’ll be possible to take the ideas I’ve discussed and use them to create formal models that can answer questions about consciousness and capture its connections, particularly to physics.

It’s not clear how much realistic physics there will need to be in models to make them useful. Perhaps one will already be able to get worthwhile information about how branching brains perceive a branching universe by looking at some simple case of a multiway Turing machine. Perhaps some combinator system will already reveal something about how different versions of physics could be set up.

In a sense what’s important is that it seems we may have a realistic way to formalize issues about consciousness, and to turn questions about consciousness into what amount to concrete questions about mathematics, computation, logic or whatever that can be formally and rigorously explored.

But ultimately the way to tether the discussion—and to have it not for example devolve into debates about the meaning of words—is to connect it to actionable issues and applications.

As a first example, let’s discuss distributed computing. How should we think about computations that—like those in our model of physics—take place in parallel across many different elements? Well—except in very simple or structured cases—it’s hard, at least for us humans. And from what we’ve discussed about consciousness, perhaps we can now understand why.

The basic issue is that consciousness seems to be all about forming a definite “sequentialized” thread of experience of the world, which is directly at odds with the idea of parallelism.

But so how can we proceed if we need to do distributed computing? Following what we believe about consciousness, I suspect a good approach will be to essentially mirror what we do in parsing the physical universe—and for example to pick reference frames in which to view and integrate the computation.

Distributed computing is difficult enough for us humans to “wrap our brains around”. Multiway or nondeterministic computing tends to be even harder. Once again I suspect this is because of the “limitations imposed by consciousness”, and that the way to handle it will be to use ideas that come from physics, and from the interaction of consciousness with quantum mechanics.

A few years ago at an AI ethics conference I raised the question of what would make us think AIs should have rights and responsibilities. “When they have consciousness!” said an enthusiastic philosopher. Of course, that raises the question of what it would mean for AIs to have consciousness. But the point is that attributing consciousness to something has potential consequences, say for ethics.

And it’s interesting to see how the connection might work. Consider a system that’s doing all sorts of sophisticated and irreducible computation. Already we might reasonably say that the system is showing a generalization of intelligence. But to achieve what we’re viewing as consciousness the system also has to integrate this computation into some kind of single thread of experience.

And somehow it seems much more appropriate to attribute “responsibility” to that single thread that we can somehow “point to” than to a whole incoherent distributed computation. In addition, it seems much “more wrong” to imagine “killing” a single thread, probably because it feels much more unique and special. In a generic computational system there are many ways to “move forward”. But if there’s a single thread of experience it’s more like there’s only one.

And perhaps it’s like the death of a human consciousness. Inevitably the history around that consciousness has affected all sorts of things in the physical universe that will survive its disappearance. But it’s the thread of consciousness that ties it all together that seems significant to us, particularly as we try to make a “summary” of the universe to create our own coherent thread of experience.

And, by the way, when we talk about “explaining AI”, what it tends to come down to is not just being able to say “that’s the computation that ran”, but being able to “tell a story” about what happened, which typically begins with making it “sequential enough” that we can relate to it like “another consciousness”.

I’ve often noted that the Principle of Computational Equivalence has important implications for understanding our “place in the universe”. We might have thought that with our life and intelligence there must be something fundamentally special about us. But what we’ve realized is that the essence of these is just computational sophistication—and the Principle of Computational Equivalence implies that that’s actually quite ubiquitous and generic. So in a sense this promotes the importance of our human details—because that’s ultimately all that’s special about us.

So what about consciousness? In full generality it too has a certain genericity. Because it can potentially “plug into” any pocket of reducibility, of which there are inevitably infinitely many, even though we humans would not yet recognize most of them. But for our particular version of consciousness the idea of sequentialization seems to be central.

And, yes, we might have hoped that our consciousness would be something that even at an abstract level would put us “above” other parts of the physical universe. So the idea that this vaunted feature of ours is ultimately associated with what amounts to a restriction on computation might seem disappointing. But I view this as just part of the story that what’s special about us are not big, abstract things, but specific things that reflect all that specific irreducible computation that has gone into creating our biology, our civilization and our lives.

In a sense the story of science is a story of struggle between computational irreducibility and computational reducibility. The richness of what we see is a reflection of computational irreducibility, but if we are to understand it we must find computational reducibility in it. And from what we have discussed here we now see how consciousness—which seems so core to our existence—might fundamentally relate to the computational reducibility we need for science, and might ultimately drive our actual scientific laws.

Notes

How does this all relate to what philosophers (and others) have said before? It will take significant work to figure that out, and I haven’t done it. But it’ll surely be valuable. Of course it’ll be fun to know if Leibniz or Kant or Plato already figured out—or guessed—this or that, even centuries or millennia before we discovered some feature of computation or physics. But what’s more important is that if there’s overlap with some existing body of work then this provides the potential to make a connection with other aspects of that work, and to show, for example, how what I discuss might relate to other areas of, and other questions in, philosophy.

My mother, Sybil Wolfram, was a longtime philosophy professor at Oxford University, and I was introduced to philosophical discourse at a very young age. I always said, though, that if there was one thing I’d never do when I was grown up, it’s philosophy; it just seemed too crazy to still be arguing about the same issues after two thousand years. But after more than half a century of “detour” in science, here I am, arguably, doing philosophy after all….

Some of the early development of the ideas here was captured in the livestream: A Discussion about Physics Built by Alien Intelligences (June 25, 2020). Thanks particularly to Jeff Arle, Jonathan Gorard and Alexander Wolfram for discussions.

A Little Closer to Finding What Became of Moses Schönfinkel, Inventor of Combinators


For most big ideas in recorded intellectual history one can answer the question: “What became of the person who originated it?” But late last year I tried to answer that for Moses Schönfinkel, who sowed a seed for what’s probably the single biggest idea of the past century: abstract computation and its universality.

I managed to find out quite a lot about Moses Schönfinkel. But I couldn’t figure out what became of him. Still, I kept on digging. And it turns out I was able to find out more. So here’s an update….

To recap a bit: Moses Schönfinkel was born in 1888 in Ekaterinoslav (now Dnipro) in what’s now Ukraine. He went to college in Odessa, and then in 1914 went to Göttingen to work with David Hilbert. He didn’t publish anything, but on December 7, 1920—at the age of 32—he gave a lecture entitled “Elemente der Logik” (“Elements of Logic”) that introduced what are now called combinators, the first complete formalism for what we’d now call abstract computation. Then on March 18, 1924, with a paper based on his lecture just submitted for publication, he left for Moscow. And basically vanished.

It’s said that he had mental health issues, and that he died in poverty in Moscow in 1940 or 1942. But we have no concrete evidence for either of these claims.

When I was researching this last year, I found out that Moses Schönfinkel had a younger brother Nathan Scheinfinkel (yes, he used a different transliteration of the Russian Шейнфинкель) who became a physiology professor at Bern in Switzerland, and later in Turkey. Late in the process, I also found out that Moses Schönfinkel had a younger sister Debora, who we could tell graduated from high school in 1907.

Moses Schönfinkel came from a Jewish merchant family, and his mother came from a quite prominent family. I suspected that there might be other siblings (Moses’s mother came from a family of 8). And the first “new find” was that, yes, there were indeed two additional younger brothers. Here are the records of their births, now to be found in the State Archives of the Dnipropetrovsk (i.e. Ekaterinoslav) Region:

Birth records

So the complete complement of Шейнфинкель/Schönfinkel/Scheinfinkel children was (including birth dates in both their original Julian calendar form and their modern Gregorian form, as well as graduation dates in modern form):

 

And having failed to find out more about Moses Schönfinkel directly, plan B was to investigate his siblings.

I had already found out a fair amount about Nathan. He was married, and lived at least well into the 1960s, eventually returning to Switzerland. And most likely he had no children.

Debora we could find no trace of after her high-school graduation (we looked for marriage records, but they’re not readily available for what we assume is the relevant time period).

By the way, rather surprisingly, we found nice (alphabetically ordered) printed class lists from the high-school graduations (apparently these were distributed to higher-education institutions across the Russian Empire so anyone could verify “graduation status”, and were deposited in the archives of the education district, where they’ve now remained for more than a century):

Graduation records

(We can’t find any particular trace of the 36 other students in the same group as Moses.)

OK, so what about the “newly found siblings”, Israel and Gregory? Well, here we had a bit more luck.

For Israel we found these somewhat strange traces:

Admission records

They are World War I hospital admission records from January and December 1916. Apparently Israel was a private in the 2nd Finnish Regiment (which—despite its name—by then didn’t have any Finns in it, and in 1916 was part of the Russian 7th Army pushing west in southern Ukraine in the effort to retake Galicia). And the documents we have show that twice he ended up in a hospital in Pavlohrad (only about 40 miles from Ekaterinoslav, though in the opposite direction from where the 7th Army was) with some kind of (presumably not life-threatening) hernia-like problem.

But unfortunately, that’s it. No more trace of Israel.

OK, what about the “baby brother”, Gregory, 11 years younger than Moses? Well, he shows up in World War II records. We found four documents:

WWII documents

Document #4 contains something interesting: an address for Gregory in 1944—in Moscow. Remember that Moses went to Moscow in 1924. And one of my speculations was that this was the result of some family connection there. Well, at least 20 years later (and probably also much earlier, as we’ll see), his brother Gregory was in Moscow. So perhaps that’s why Moses went there in 1924.

OK, but what story do these World War II documents tell about Gregory? Document #1 tells us that on July 27, 1943, Gregory arrived at the military unit designated 15 зсп 44 зсбр (15 ZSP 44 ZSBR) at transit point (i.e. basically “military address”) 215 азсп 61А (215 AZSP 61A). It also tells us that he had the rank of private in the Red Army.

Sometime soon thereafter he was transferred to unit 206 ZSP. But unfortunately he didn’t last long in the field. Around October 1, 1943, he was wounded (later, we learn he has “one wound”), and—as document #2 tells us—he was one of 5 people picked up by hospital train #762 (at transit point 206 зсп ЗапФ). On November 26, 1943, document #3 records that he was discharged from the hospital train (specifically, the document explains that he’s not getting paid for the time he was on the hospital train). And, finally, document #4 records that on February 18, 1944—presumably after a period of assessment of his condition—he’s discharged from the military altogether, returning to an address in Moscow.

OK, so first some military points. When Gregory arrived in the army in July 1943 he was assigned to the 44th Rifle Brigade (44 зсбр) in the 15th Rifle Division (15 зсп) in the 61st Army (61A)—presumably as part of reinforcements brought in after some heavy Soviet losses. Later he was transferred to the 206th Rifle Division in the 47th Army, which is where he was when he was wounded around October 1, 1943.

What was the general military situation then? In the summer of 1943 the major story was that the Soviets were trying to push the Germans back west, with the front pretty much along the Dnieper River in Ukraine—which, curiously enough, flows right through the middle of Ekaterinoslav. On October 4, 1943, here’s how the New York Times presented things:

October 4, 1943

But military history being what it is, there’s much more detailed information available. Here’s a modern map showing troop movements involving the 47th Army in late September 1943:

1943 map

The Soviets managed to get more than 100,000 men across the Dnieper River, but there was intense fighting, and at the end of September the 206th Rifle Division (as part of the 47th Army) was probably involved in the later stages of the fight for the Bukrin Bridgehead. And this is probably where Gregory Schönfinkel was wounded.

After being wounded, he seems to have been taken to some kind of service area for the 206th Rifle Division (206 зсп ЗапФ), from which he was picked up by a hospital train (and, yes, it was actually a moving hospital, with lots of cars with red crosses painted on top).

But more significant in our quest for the story of Gregory Schönfinkel is other information in the military documents we have. They record that he is Jewish (as opposed to “Russian”, which is how basically all the other soldiers in these lists are described). Then they say that he has “higher education”. One says he is an “engineer”. Another is more specific, and says he’s an “engineer economist” (Инж. Эконом.). They also say that he is not a member of the Communist Party.

They say he is a widower, and that his wife’s name was Evdokiya Ivanovna (Евдокия Иван.). They also list his “mother”, giving her name as Мария Григ. (“Maria Grig.”, perhaps short for “Grigorievna”). And then they list an address: Москва С. Набер. д. 26 кв. 1ч6, which is presumably 26 Sofiyskaya Embankment, Apartment 1-6, Moscow.

Where is that address? Well, it turns out it’s in the very center of Moscow (“inside the Garden Ring”), with the front looking over the Moscow River directly at the Kremlin:

Here’s a current picture of the building

26 Sofiyskaya Embankment, Apartment 1-6, Moscow

as well as one from perhaps 100 years earlier:

26 Sofiyskaya Embankment, Apartment 1-6, Moscow

The building was built by a family of merchants named the Bakhrushins in 1900–1903 to provide free apartments for widows and orphans (apparently there were about 450 one-room 150-to-300-square-foot apartments). In the Russian Revolution, the building was taken over by the government, and set up to house the Ministry of Oil and Gas. But some “communal apartments” were left, and it’s presumably in one of those that Gregory Schönfinkel lived. (Today the building is the headquarters of the Russian state oil company Rosneft.)

OK, but let’s unpack this a bit further. “Communal apartments” basically means dormitory-style housing. A swank building, but apparently not so swank accommodation. Well, actually, in Soviet times dormitory-style housing was pretty typical in Moscow, so by the standards of the time this really was a swank setup.

But then there are a couple of mysteries. First, how come a highly educated engineering economist with a swank address was just a private in the army? (When the hospital train picked up Gregory along with four other privates, one of the others was listed as a carpenter; the rest were all listed as “с/хоз” or “сельское хозяйство”, basically meaning “farm laborer”, or what before Soviet times would have been called “peasant”.)

Maybe the Russian army was so desperate for recruits after all their losses that—despite being 44 years old—Gregory was drafted. Maybe he volunteered (though then we have to explain why he didn’t do that earlier). But regardless of how he wound up in the army, maybe his status as a private had to do with the fact that he wasn’t a member of the Communist Party. At that time, a large fraction of the city-dwelling “elite” were members of the Communist Party (and it wouldn’t have been a major problem that he was Jewish, though coming from a merchant family might have been a negative). But if he wasn’t in the “elite”, how come the swank address?

A first observation is that his wife’s first name Evdokiya was a popular Russian Orthodox name, at least before 1917 (and is apparently popular again now). So presumably Gregory had—not uncommonly in the Soviet era—married someone who wasn’t Jewish. But now let’s look at the “mother’s” name: “Мария Григ.” (“Maria Grig.”).

We know Gregory’s (and Moses’s) mother’s name was Maria/“Masha” Gertsovna Schönfinkel (née Lurie)—or Мария (“Маша”) Герцовна Шейнфинкель. And according to other information, she died in 1936. So—unless someone miswrote Gregory’s “mother’s” name—the patronymics (second names) don’t match. So what’s going on?

My guess is that the “mother” is actually a mother-in-law, and that it was her apartment. Perhaps her husband (most likely at that point not her) had worked at the Ministry of Oil and Gas, and that’s how she ended up with the apartment. Maybe Gregory worked there too.

OK, so what was an “engineer economist” (Инженер Экономист)? In the planning-oriented Soviet system, it was something quite important: basically a person who planned and organized production and labor in some particular industry.

How did one become an “engineer economist”? At least a bit later, it was a 5-year “master’s level” course of study, including courses in engineering, mathematics, bookkeeping, finance, economics of a particular sector, and “political economy” (à la Marx). And it was a very Soviet kind of thing. So the fact that that was what Gregory did presumably means that he was educated in the Soviet Union.

He must have finished high school right when the Tsar was being overthrown. Probably too late to be involved in World War I. But perhaps he got swept up in the Russian Civil War. Or maybe he was in college then, getting an early Soviet education. But, in any case, as an engineer economist it’s pretty surprising that in World War II he didn’t get assigned to something technical in the army, and was just a simple private in the infantry.

From the data we have, it’s not clear what was going on. But maybe it had something to do with Moses.

It’s claimed that Moses died in 1940 or 1942 and was “living in a communal apartment”. Well, maybe that communal apartment was actually Gregory’s (or at least his mother-in-law’s) apartment. And here’s a perhaps fanciful theory: Gregory joined the army out of some kind of despondency. His wife died. His older brother died. And in February 1942 (though it might have taken him a while to find out) any of his family still in Ekaterinoslav probably died in the massacre of the Jewish population there. He hadn’t joined the army earlier in the war, notably during the Battle of Moscow. And by 1943 he was 44 years old. So perhaps in some despondency—or anger—he volunteered for the army.

We don’t know. And at this point the trail seems to go cold. It doesn’t appear that Gregory had any children, and we haven’t been able to find out anything more about him.

But I consider it progress that we’ve managed to identify that Moses’s younger brother lived in Moscow, potentially providing a plausible reason that Moses might have gone to Moscow.

Actually, there may have been other “family reasons”. There seems to have been quite a lot of back-and-forth in the Jewish population between Moscow and Ekaterinoslav. And Moses’s mother came from the Lurie family, which was prominent not only in Ekaterinoslav, but also in Moscow. And it turns out that the Lurie family has done a fair amount of genealogy research. So we were able, for example, to reach a first cousin once removed of Moses’s (i.e. someone whose parent shared a grandparent with Moses, or about 1/16 of the genetics). But so far nobody has known anything about what happened to Moses, and nobody has said “Oh, and by the way, we have a suitcase full of strange papers” or anything.

I haven’t given up. And I’m hoping that we’ll still be able to find out more. But this is where we’ve got so far.

One More Thing

In addition to pursuing the question of the fate of Moses Schönfinkel, I’ve made one other potential connection. Partly in compiling a bibliography of combinators, I discovered a whole collection of literature about “combinatory categorial grammars” and “combinatory linguistics”.

What are these? These days, the most common way to parse an English sentence like “I am trying to track down a piece of history” is a hierarchical tree structure—analogous to the way a context-free computer language would be parsed:

 

But there is an alternative—and, as it turns out, significantly older—approach: to use a so-called dependency grammar in which verbs act like functions, “depending” on a collection of arguments:

 

In something like Wolfram Language, the arguments in a function would appear in some definite order and structure, say as f[x, y, z]. But in a natural language like English, everything is just given in sequence, and a function somehow has to have a way to figure out what to grab. And the idea is that this process might work like how combinators written out in sequence “grab” certain elements to act on.
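
As a small illustration of these two pictures in Wolfram Language terms (a hedged sketch: TextStructure is the built-in function for grammatical structure, its output details may vary by version, and f, x, y, z are just illustrative symbols):

TextStructure["I am trying to track down a piece of history"]  (* a constituency-style tree for the sentence above *)

f[x, y, z]                     (* structured form: arguments in definite positions *)
Fold[#1[#2] &, f, {x, y, z}]   (* "curried" application along a flat sequence: gives f[x][y][z] *)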

This idea seems to have a fairly tortuous history, mixed up with attempts and confusions about connecting the syntax (i.e. grammatical structure) of human languages to their semantics (i.e. meaning). The core issue has been that it’s perfectly possible to have a syntactically correct sentence (“The flying chair ate a happy semicolon”) that just doesn’t seem to have any “real-world” meaning. How should one think about this?

I think the concept of computational language that I’ve spent so many years developing actually makes it fairly clear. If one can express something in computational language there’s a way to compute from it. Maybe the resulting computation will align with what happens in the real world; maybe it won’t. But there’s some “meaningful place to go” with what one has. And the point is that a computational language has a well-defined “inner computational representation” for things. The particular syntax (e.g. sequence of characters) that one might use for input or output in the computational language is just something superficial.

But without the idea of computational language people have struggled to formalize semantics, tending to try to hang what they’re doing on the detailed structure and syntax of human languages. But then what should one do about syntactically correct structures that don’t “mean anything”? An example of what I consider to be a rather bizarre solution—embodied in so-called Montague grammars from the 1970s—is essentially to turn all sentences into functions, in which there’s nothing “concrete” there, just “slots” where things could go (“x_ ate y_”)—and where one can “hold off meaninglessness” by studying things without explicitly filling in the slots.
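
To make the “slots” idea concrete, here is a minimal Wolfram Language sketch of such a sentence frame, with purely illustrative symbols (ateFrame, ate, flyingChair and happySemicolon are hypothetical placeholders, not anyone’s actual formalism):

ateFrame = Function[{x, y}, ate[x, y]];   (* "x_ ate y_" as a function with unfilled slots *)

ateFrame[flyingChair, happySemicolon]
(* gives ate[flyingChair, happySemicolon]: syntactically well formed, whatever its "real-world" meaning *)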

In the original formulation, the “functions” were thought about in terms of lambdas. But combinatory categorial grammars view them instead in terms of combinators, in which in the course of a sentence words in a sense “apply to each other”. And even without the notion of slots one can do “combinatory linguistics” and imagine finding the structure of sentences by taking words to “apply themselves” “across the sentence” like combinators.

If well designed (as I hope the Wolfram Language is!), a computational language has a certain clean, formal structure. But human natural language is full of messiness, which has to be untangled by natural language understanding—as we’ve done for so many years for Wolfram|Alpha, always ultimately translating to our computational language, the Wolfram Language.

But without the notion of an underlying computational language, people tend to feel the need to search endlessly for formal structure in human natural language. And, yes, some exists. But—as we see all the time in actually doing practical natural language understanding for Wolfram|Alpha—there’s a giant tail that seems to utterly explode any all-encompassing formal theory.

Are there at least fragments that have formal structure? There are things like logic (“and”, “or”, etc.) that get used in human language, and which are fairly straightforwardly formalizable. But maybe there are more “functional” structures too, perhaps having to do with the operation of verbs. And in combinatory linguistics, there’ve been attempts to find these—even for example directly using things like Schönfinkel’s S combinator. (Given S f g x → f[x][g[x]], one can start imagining—with a slight stretch—that “eat peel orange” operates like the S combinator in meaning “eat[orange][peel[orange]]”.)
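
Here is that example made explicit as a minimal Wolfram Language sketch, with the S combinator written as a definition (eat, peel and orange are just the undefined symbols from the sentence):

s[f_][g_][x_] := f[x][g[x]]   (* Schönfinkel's S combinator: S f g x -> f[x][g[x]] *)

s[eat][peel][orange]
(* gives eat[orange][peel[orange]] *)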

Much of the work on this has been done in the last few decades. But it turns out that its history stretches back much further, and might conceivably actually intersect with Moses Schönfinkel himself.

The key potential link is Kazimierz Ajdukiewicz (1890–1963). Ajdukiewicz was a Polish logician/philosopher who, among other things, long tried to develop a “mathematicized theory” of how meaning emerges from natural language, and who basically laid the early groundwork for what’s now combinatory linguistics.

Kazimierz Ajdukiewicz was born two years after Moses Schönfinkel, and studied philosophy, mathematics and physics at the University of Lviv (now in Ukraine), finishing his PhD in 1912 with a thesis on Kant’s philosophy of space. But what’s most interesting for our purposes is that in 1913 Ajdukiewicz went to Göttingen to study with David Hilbert and Edmund Husserl.

In 1914 Ajdukiewicz published one paper on “Hilbert’s New Axiom System for Arithmetic”, and another on contradiction in the light of Bertrand Russell’s work. And then in 1915 Ajdukiewicz was drafted into the Austrian army, where he remained until 1920, after which he went to work at the University of Warsaw.

But in 1914 there’s an interesting potential intersection. Because June of that year is when Moses Schönfinkel arrived in Göttingen to work with Hilbert. At the time, Hilbert was mostly lecturing about physics (though he also did some lectures about “principles of mathematics”). And it seems inconceivable that—given their similar interests in the structural foundations of mathematics—they wouldn’t have interacted.

Of course, we don’t know how close to combinators Schönfinkel was in 1914; after all, his lecture introducing them was six years later. But it’s interesting to at least imagine some interaction with Ajdukiewicz. Ajdukiewicz’s own work was at first most concerned with things like the relationship of mathematical formalism and meaning. (Do mathematical constructs “actually exist”, given that their axioms can be changed, etc.?) But by the beginning of the 1930s he was solidly concerned with natural language, and was soon writing papers with titles like “Syntactic Connexion” that gave formal symbolic descriptions of language (complete with “functors”, etc.) quite reminiscent of Schönfinkel’s work.

So far as I can tell Ajdukiewicz never explicitly mentioned Schönfinkel in his publications. But it seems like too much of a coincidence for the idea of something like combinators to have arisen completely independently in two people who presumably knew each other—and never to have independently arisen anywhere else.

Thanks

Thanks to Vitaliy Kaurov for finding additional documents (and to the State Archives of the Dnipropetrovsk Region and Elena Zavoiskaia for providing various documents), Oleg and Anna Marichev for interpreting documents, and Jason Cawley for information about military history.

The Wolfram Physics Project: A Gallery of the First Year


Gallery sections: Bulletins & Papers, Functions, People, Student Projects, Livestreams & Video Work Logs, Working Notebooks
