Tholonia - 550-F-An_Unexpected_Pattern
The Existential Mechanics of Awareness
Duncan Stroud
Published: January 15, 2020
Updated: Jan 1, 2026
Welkin Wall Publishing
ISBN-10:
ISBN-13: 978-1-6780-2532-8
Copyright ©2020 Duncan Stroud CC BY-NC-SA 4.0

This book is an open sourced book. This means that anyone can contribute changes or updates. Instructions and more information at https://tholonia.github.io/the-book (or contact the author at duncan.stroud@gmail.com). This book and its on-line version are distributed under the terms of the Creative Commons Attribution-Noncommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0) license, with the additional proviso that the right to publish it on paper for sale or other for-profit use is reserved to Duncan Stroud and authorized agents thereof. A reference copy of this license may be found at https://creativecommons.org/licenses/by-nc-sa/4.0/. The above terms include the following: Attribution - you must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use. Noncommercial - You may not use the material for commercial purposes. Share Alike - If you remix, transform, or build upon the material, you must distribute your contributions under the same license as the original. No additional restrictions - you may not apply legal terms or technological measures that legally restrict others from doing anything the license permits. Notices - You do not have to comply with the license for elements of the material in the public domain or where your use is permitted by an applicable exception or limitation. No warranties are given. The license may not give you all of the permissions necessary for your intended use. For example, other rights such as publicity, privacy, or moral rights may limit how you use the material.

Appendix F: An Unexpected Pattern

What is this pattern, and why does it exist?

In the course of writing this book, there was a need to generate sets of chaotic numbers from other sets of patterned numbers. In one case, the results presented a very unexpected pattern.

Numbers were generated by periodically calculating the longitudinal positions of the Sun and the Moon (from Earth coordinates 34.61262°S, 58.41048°W, Buenos Aires, Argentina, on the 24th of each month at 5:31:07 PM UT) down to 10⁻¹⁵ radians. The individual digits of these two values were then concatenated or cross-summed. For example, the Sun’s longitude in radians at one moment was 4.2430033683776855, and the Moon’s position at that moment was 2.399298667907715. These were then cross-summed until they were reduced to a single-digit value.
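
To make the procedure concrete, here is a minimal sketch of the digit reduction described above, using the example Sun and Moon longitudes quoted in the text. The helper names are mine, and the exact concatenation and reduction used for the book may have differed; this only illustrates the cross-summing step.

```python
# A sketch of reducing two longitude values to one single-digit sample.
# The helper names are illustrative; the book's exact pipeline may differ.

def cross_sum(n: int) -> int:
    """Repeatedly sum the digits of n until a single digit remains (the digital root)."""
    while n > 9:
        n = sum(int(d) for d in str(n))
    return n

def digits_of(value: float) -> str:
    """Drop the decimal point and sign, keeping only the digits of the longitude."""
    return str(value).replace(".", "").replace("-", "")

sun_lon = 4.2430033683776855   # radians, example value from the text
moon_lon = 2.399298667907715   # radians, example value from the text

combined = int(digits_of(sun_lon) + digits_of(moon_lon))  # concatenate the digit strings
print(cross_sum(combined))                                # one single-digit sample
```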

Millions of samples were recorded and reduced to one single-digit number per sample. The input data of the Sun and Moon positions was perfectly deterministic and predictable, but the series of cross-summed single digits was “mostly random” according to the chi-squared results of a randomness test1. Below are the test results of the single-digit data and of data from a random number generator (RNG) for comparison2.
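
As a rough illustration (not the NIST/ent battery cited in the footnotes), a simple chi-squared check of the digit frequencies looks something like the sketch below; placeholder data stands in for the actual cross-summed series.

```python
# A minimal chi-squared uniformity check on a single-digit series.
# Placeholder random digits stand in for the real cross-summed data.
import random
from collections import Counter
from scipy.stats import chisquare

samples = [random.randint(1, 9) for _ in range(100_000)]   # stand-in data

counts = Counter(samples)
observed = [counts[d] for d in range(1, 10)]
stat, p = chisquare(observed)             # null hypothesis: digits 1-9 are uniformly distributed
print(f"chi2 = {stat:.1f}, p = {p:.3f}")  # a very small p would indicate a non-uniform series
```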

Next, each single-digit number (1-9) was assigned a color from the spectrum and then plotted such that each dot occupied one consecutive pixel on a line; when that line was full, the plot advanced to the next line, just like a dot-matrix printer. This was an extremely simple plot. The below-left plot shows only one number being plotted at a time, and the below-right plot shows all numbers cumulatively. The distribution of the numbers within the data set of 33,000,000 is shown in the bar chart.
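
A minimal sketch of this kind of dot-matrix plot, with placeholder data in place of the real cross-summed series, might look like the following (the colormap and dimensions are arbitrary choices of mine):

```python
# Each single-digit sample becomes one pixel, filling rows left to right.
import numpy as np
import matplotlib.pyplot as plt

samples = np.random.randint(1, 10, size=600 * 400)   # stand-in for the cross-summed digits
width = 600
rows = samples.size // width

grid = samples[: rows * width].reshape(rows, width)  # one digit per pixel, row by row
plt.imshow(grid, cmap="nipy_spectral", interpolation="nearest")
plt.axis("off")
plt.show()
```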

The data was also tested against Benford’s law, which observes that naturally occurring numerical data tends to show non-uniform, patterned digit distributions. We looked at three sets of data: 1) the cross-sums of the Sun, 2) the cross-sums of the Moon, and 3) the cross-sums of the Sun and Moon combined. We counted the occurrences of each digit 1 through 9 in a data set of 33,575,635 samples. While the counts varied, they did not show the uniform distribution characteristic of truly random numbers, confirming the presence of underlying patterns. To visualize these patterns, we plotted the difference between the lowest and highest counts (charts below) for each number 1-9 across the last 60,000 samples. As you can see, there is very patterned behavior, especially in the first chart and even more so in the last chart, which also shows a unique feature: the prime numbers (2, 3, 5, 7) consistently appear as the lowest-occurring digits.
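
For reference, the kind of digit-count comparison described above can be sketched as follows; note that Benford’s law strictly concerns leading digits, whose expected frequency is log10(1 + 1/d), and the placeholder data here merely stands in for the real cross-summed series.

```python
# Tabulate how often each digit 1-9 appears and compare against the uniform
# and Benford expectations. Placeholder data stands in for the real series.
import math
import random
from collections import Counter

samples = [random.randint(1, 9) for _ in range(1_000_000)]   # stand-in data

counts = Counter(samples)
total = sum(counts.values())
for d in range(1, 10):
    observed = counts[d] / total
    benford = math.log10(1 + 1 / d)      # Benford's expected leading-digit frequency
    print(f"{d}: observed {observed:.4f}   Benford {benford:.4f}   uniform {1/9:.4f}")
```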

Much to our surprise, this resulting pattern matches that of Fresnel diffraction.

At the rate we were sampling, each position of each “planet” was only about 7×10⁻⁸ radians further than the previous position sampled. This equates to about 23 kilometers of movement for the Sun and only 64 millimeters of movement for the Moon.

When cross-summing numbers, the place of each digit is insignificant, so 123 and 321 are the same, as they both cross-sum to 6. Because all the digits in a cross-summed value have equal significance, the slightest measurable movement of the Sun carries the same weight as the movement over the two thousand years it takes to pass through a constellation. This is because 0.000000000000001 and 100000000000000.0 both cross-sum to 1.

Earlier, we plotted the values simply as colors, but when we use the actual values of the cross-summed numbers as the Y-axis, with each number advancing one step along the X-axis (time), we get a different perspective, as seen in the top image below. The bottom image is the same plot but using only the differences between consecutive numbers.

This plot uses 300,000 cross-summed data points covering 25 minutes of observation.

Surprisingly, the top image appears to be a cross-sectional diagram of a Fresnel lens!
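
A minimal sketch of how such a value-versus-time plot and its difference plot can be produced (with placeholder data in place of the real series) is shown below.

```python
# Plot the single-digit values over time (top) and the step-to-step differences (bottom).
import numpy as np
import matplotlib.pyplot as plt

values = np.random.randint(1, 10, size=3000)   # stand-in for the cross-summed series
diffs = np.diff(values)                        # difference between consecutive samples

fig, (top, bottom) = plt.subplots(2, 1, sharex=True)
top.plot(values, ".", markersize=1)
top.set_ylabel("cross-sum value")
bottom.plot(diffs, ".", markersize=1)
bottom.set_ylabel("difference")
bottom.set_xlabel("sample (time)")
plt.show()
```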

This raises the questions:

Additional details for the curious

Below are some of the visual results of the plot.

These plots are of data captured every 0.005 and 0.01 seconds, and we look only at the cross-summed values that are prime numbers (2, 3, 5, 7), though the results were the same when using all numbers 0-9. These twelve plots represent data sampled in each month of the year at the same time of day.

It is interesting to see how the 0.005s plots look slightly “zoomed out” compared to the 0.01s plots. Also interesting is how similar the plot pattern is to the pattern of wood grain, suggesting that the sources of these patterns share a common process, at least mathematically.

We see another pattern if we change the time between samples from microseconds to days. Below are 10 sets representing 10 months with the same starting point as the sets above, but each pixel represents an entire day (86,400 seconds), i.e., the positions of the Sun and the Moon were recorded only once a day. This means that the entire set represents 1,095 years. The plots looked almost identical and appeared completely random, but upon closer examination they were not random at all, only a very “zoomed out” view. The pattern is easier to see if we show only the blue dots of a small section of the image.

My first assumption was that the farther “out” we zoom in time, the clearer the patterns would be, but it seems to be the exact opposite: the smaller the time delta, the larger the pattern. This might make sense if we were looking at the linear difference between, say, 1.2345670 and 1.2345678, as the difference would be 0.0000008, and such a small change would need a lot of samples to make itself apparent, but this does not apply here because we are cross-summing the digits together, making the place of the digits meaningless.

Below is a comparison of the patterns at each scale. The values above the images are the sampling intervals, and the values below are each sample set’s entire duration. No obvious pattern can be seen at 5s, but a slight pattern begins to emerge at 0.5s, becomes more apparent at 0.05s, and is most dominant at 0.005s. A pattern is equally visible at 0.0005s but appears only as a simple oscillation or wave. 0.00005s is identical to 0.0005s because (it seems that) once we go below roughly 0.002s, we hit the limits of resolution. This is the point where we begin getting duplicate data because the Sun or Moon has not moved enough in the small amount of time between samples, and the smaller we go, the more duplicates we get. For example, at 0.00001-second sampling intervals, to get the 155,520 unique data points needed to create the graph, we have to sample 50,000,000 times. This simple wave is the first pattern we can detect (given the limitations of my laptop and/or the ephemeris used to calculate planetary positions), and it is what then evolves into the more complex patterns. Presumably, the difference between sample points in the wave patterns is only the rightmost digit increasing by 1, given that we keep sampling until we get the next unique number.
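
The resolution issue can be sketched as follows. The ephemeris call is replaced by a hypothetical stand-in with deliberately limited precision, so the numbers are illustrative only; the point is that at tiny intervals most raw samples are duplicates and must be discarded until the next unique value appears.

```python
# Keep sampling at a fixed interval, discarding duplicates, until enough unique values exist.

def get_position(t: float) -> float:
    """Hypothetical stand-in for an ephemeris lookup, truncated to a fixed resolution."""
    return round(4.243 + 2e-7 * t, 10)

def unique_samples(n_unique: int, dt: float):
    seen, last, total, t = [], None, 0, 0.0
    while len(seen) < n_unique:
        value = get_position(t)
        total += 1
        if value != last:        # only keep a sample when the position has actually changed
            seen.append(value)
            last = value
        t += dt
    return seen, total

samples, raw_count = unique_samples(1000, 0.00001)
print(f"{raw_count} raw samples were needed to collect {len(samples)} unique positions")
```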

How do we have patterns when we sample at 0.01s, but nothing (significant) when we sample at 0.05s? I think this has to do with where we are sampling from within the entire pattern. The two images below show that the Fresnel pattern is slightly altered with time but eventually returns. At its most ordered point it looks like Fresnel diffraction, but at its least ordered point it looks identical to pure random data (though we assume it is just as ordered, only in a way that is difficult to see).

When the data is sampled, we have no idea where we are within the cycle of repeating patterns, so it is only by chance that we will end up in an ordered section.

For those not familiar with Fresnel diffraction, it describes the interference pattern created when light waves bend (diffract) around the edge of an object and then interact with each other, creating alternating zones of constructive and destructive interference.

Digging Deeper

Having no idea where to look for more information, I posted this to various mathematics forums, and even though the question was read by many, no one had an answer. I then sent this question to my old friend, who is an accomplished physicist but who also considers these speculative ideas to be a waste of time, so I knew any answer he gave would include an unabashed commentary on the silliness of it all.

He did not disappoint on either front. I’ll forego the commentary and just focus on the relevant part of his reply, which was very short. Essentially he told me that no matter how hard you try to create random numbers from non-random data, you will always end up with some sort of a pattern. “Why?” was my naive response, to which he replied only, “Look into Zipf’s law”.

Zipf’s law is (from Wikipedia):

an empirical law… that refers to the fact that many types of data … can be approximated with [certain types of] power law probability distributions. Zipf distribution is related to the zeta distribution but is not identical.

There are two very significant points in this definition. The first is that it is an empirical law: something is called an empirical law when it can be observed consistently but has not (yet) been derived or proven from deeper principles. The second important detail is that the Zipf distribution is related to the zeta distribution. We’ll get to that in a moment.

Zipf’s law was not discovered by physicists or even within the hard sciences. It was a French stenographer, Jean-Baptiste Estoup (1868-1950), who first noticed the word-frequency patterns that became Zipf’s law after being popularized by the American linguist George Kingsley Zipf (1902-1950).

It seems that all languages, from every corner of the world, both modern and ancient, follow Zipf’s law in the way words are distributed in both writing and speaking (and by inference, some implicit pattern in the way we think). We don’t need to drill into the details, but what is significant is that this same distribution pattern is also seen in city populations, solar flare intensities, protein sequences, immune receptors, the amount of traffic websites get, earthquake magnitudes, the number of times academic papers are cited, last names, the firing patterns of neural networks, ingredients used in cookbooks, the number of phone calls people received, the diameter of Moon craters, the number of people that die in wars, the popularity of opening chess moves, even the rate at which we forget… and countless more.3

In short, Zipf’s law is an example of a power law, and power laws touch everything in existence in one way or another. But what does this have to do with anything? This is where the second point applies. The charts above are called Zipf distributions, and they all have different scales of values depending on the kind of data being sampled. If we normalize all the scales to 1, we have what is called a normalized Zipf distribution, also known as a zeta distribution. Mathematicians use the Riemann zeta function (RZF) to study and test these distributions. This function is written as ζ(s), where s is any complex number other than 1.
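
To make the connection concrete, normalizing a Zipf-style power law 1/k^s over all ranks requires dividing by ζ(s), which is exactly the zeta distribution. A minimal sketch (using the mpmath library for ζ) follows.

```python
# The zeta distribution: P(k) = k^(-s) / zeta(s) for ranks k = 1, 2, 3, ...
from mpmath import zeta

s = 2.0
norm = zeta(s)                                   # zeta(2) = pi^2 / 6, about 1.6449
probabilities = [float(k ** -s / norm) for k in range(1, 11)]
print(probabilities)                             # first ten probabilities of a zeta(2) distribution
print(sum(float(k ** -s / norm) for k in range(1, 100_000)))   # partial total approaches 1
```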

As a related side note, using just prime numbers you can also determine π, though this is unrelated to the RZF.

One of the useful features of the RZF is that it can assign a finite value to a divergent series (one whose partial sums grow without bound), as though it were a convergent series (one whose sum approaches a specific finite value). It can do this through analytic continuation and the use of complex numbers. You can play with this function using an online Riemann Zeta Function Calculator.4
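
As a small illustration (not tied to any particular online calculator), the mpmath library evaluates ζ at points where the defining series diverges, which is analytic continuation in action:

```python
# Evaluate the Riemann zeta function at a few instructive points.
from mpmath import zeta

print(zeta(2))                   # 1.6449... = pi^2 / 6, where the series converges normally
print(zeta(-1))                  # -0.08333... = -1/12, a value assigned by analytic continuation
print(zeta(0.5 + 14.134725j))    # very close to zero: near the first nontrivial zero on the critical line
```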

The RZF is the subject of the Riemann hypothesis, which states that ζ(s)=0 only when s is a negative even integer or a complex number with a real part of 1/2. This hypothesis is quite profound in how it relates to the distribution of prime numbers. That won’t be explained here, but it is understandable when properly explained. There are a couple of excellent videos on the subject at https://www.youtube.com/watch?v=sD0NjbwqlYw and https://www.youtube.com/watch?v=d6c6uIyieoo.

However, this 160-year-old hypothesis has yet to be proven, and because it is considered by many mathematicians to be the most important unsolved problem in pure mathematics, the Clay Mathematics Institute offers a reward of US $1,000,000 to anyone who solves it. (Update: On September 24, 2018, Michael Atiyah, the 89-year-old mathematician emeritus at the University of Edinburgh, claimed to have proved it, but his proof was not accepted by the mathematical community. Atiyah passed away in January 2019, and the hypothesis remains unsolved as of 2026.)

Understanding how the RZF works would (probably) help explain how and why patterns of numbers appear organically in any data set of naturally occurring samples, such as the charts above.

What value would we get if we applied the RZF to the series of square roots of natural numbers, the same numbers that define a Fresnel diffraction? Some really smart person did exactly this 5, and the answer is… voilà!

This is the inverse of , and it is especially interesting because now we have a formula that fits into the pattern-position of Ohm’s law and Newton’s laws of motion. This is consistent with, and supports, the ideas put forth earlier that archetypal functions or formulas instantiate differently depending on their context, similar to how the energy of nature interacts with itself across various contexts. For clarity, we’ll refer to contextual functions that are algorithmically identical as having the same pattern-positions (PP). An example of this, as previously shown, is the single (fractal) function that, with various contextual parameters, describes energy interacting with itself across different contexts.

In this case, as we are only looking at Ohm’s law and its 12 formulas, we see that has the same pattern as or at position 11:00 (as on a 12-hour clock) on the 12-part wheel we are using, and we can test that by plugging in the appropriate values. The table below shows the pi-based values that seem to describe what we are seeing with regard to Ohm’s law. For the sake of clarity, I assigned some of the formulas to variables; otherwise, it’s too bothersome to read (or compose). In the table below, I show Ohm’s law with some previously shown examples plus the simple math examples, and then the application of and . Those two values alone were enough to fill in all the missing values. The middle column of the table shows the actual numbers, and the formulas are to the right.

All of this may have nothing to do with unraveling the mystery of the Fresnel pattern hidden in the data, but it is interesting nonetheless, specifically:

The Cornu Spiral

While the manner in which the cross-summed numbers create a diffraction pattern may be a mystery (to me), how diffraction patterns are created is not. A proper description would require a lot of math that we won’t get into here, but a quick search on “What is the Cornu spiral method for diffraction patterns?” will reveal all to the more curious. In short, because the intensity of light at any point is determined by two values known as the Fresnel integrals, C(u) and S(u), we can plot these two values for each point along the diffraction pattern on a two-dimensional plot. When we do this, we end up with an amazing figure called the Cornu Spiral, shown below as “2D Cornu Spiral (Fresnel)”.
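
A minimal sketch of drawing the Cornu spiral from the Fresnel integrals, using scipy’s implementation (which returns the pair S(u), C(u)), is shown below.

```python
# Plot the Cornu (Euler) spiral: C(u) on the x-axis against S(u) on the y-axis.
import numpy as np
import matplotlib.pyplot as plt
from scipy.special import fresnel

u = np.linspace(-10, 10, 4000)
S, C = fresnel(u)                # Fresnel sine and cosine integrals

plt.plot(C, S, linewidth=0.8)
plt.axis("equal")
plt.title("2D Cornu Spiral (Fresnel)")
plt.show()
```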

Another interesting thing about the Cornu Spiral is that it also describes the path of the sun over the course of the year (from Earth’s perspective)6, which is pretty poetic. It appears as though the Cornu spiral describes not only how light and matter interact on the nanometer scale but also on the planetary scale, at least in one sense.

Not surprisingly, the Riemann zeta function also creates not just one Cornu Spiral but an entire series of spirals, as shown in “Cornu Spiral (RZF)”, that emerge from nothing and continue to expand from within themselves.
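
One simple way to see spiral structure connected to the zeta function (my own sketch, not the figure referenced above) is to plot the partial sums of the series Σ n^(-s) at a point on the critical line; the running total winds through spiral-like arcs in the complex plane.

```python
# Trace the partial sums of the zeta series at s = 1/2 + it in the complex plane.
import numpy as np
import matplotlib.pyplot as plt

s = 0.5 + 40j                               # a point on the critical line (t = 40 chosen arbitrarily)
n = np.arange(1, 2000, dtype=float)
partial_sums = np.cumsum(n ** (-s))         # running total of the Dirichlet series terms

plt.plot(partial_sums.real, partial_sums.imag, linewidth=0.7)
plt.axis("equal")
plt.title("Partial sums of the zeta series at s = 1/2 + 40i")
plt.show()
```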

The x and the y values are the result of a function based on a single value, as in x(t) and y(t) of the Fresnel Integrals, but there can be a third dimension based on t. When this new z(t) dimension is proportional to the x and y dimensions, a cone is created on the z-axis, which might be related to the diminishing angles of the diffraction pattern in the “Slightly zoomed out” plot of the cross-summed data patterns shown above.
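
One reading of that construction, sketched below with an arbitrary proportionality constant, simply lifts the Cornu spiral along a z-coordinate proportional to t, which stretches the spiral into a cone-like figure.

```python
# The Cornu spiral in x and y, with z(t) proportional to t.
import numpy as np
import matplotlib.pyplot as plt
from scipy.special import fresnel

t = np.linspace(0, 10, 4000)
S, C = fresnel(t)

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.plot(C, S, 0.1 * t, linewidth=0.8)    # the factor 0.1 is an arbitrary choice
ax.set_title("Cornu spiral extended along z(t) proportional to t")
plt.show()
```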

So, by saying the following…

… we can relate our mystery pattern to several other patterns, including Newton’s 2nd, but the most dominant ones have to do with the nature of light.

In one way, the cross-summed numbers are similar to photons: individual photons build up the diffraction pattern, just as the individual numbers of our data build up theirs. Both of these “particles” (photons and numbers) are presumably following some of the same rules in some fashion if we are to judge them by their resulting patterns. Given my lack of training in this area, I can only wonder if the wave/particle properties of photons also apply (contextually) to cross-summed numbers or numbers in general. Perhaps they are somehow related to Feynman Sunshine Numbers, which relate prime numbers and photons, among other things. These numbers include RZF values that appear in the description of sunshine, ancient relics of the Big Bang, and the theory of photons, electrons, and positrons.

Cross Sum Patterns

It seems useful to say something more about cross-summed numbers.

Cross-summing multi-place numbers like a carnival numerologist generates seemingly meaningless data, at least with respect to the quantitative significance of that data, but it retains plenty of information about the nature or quality of the original number. For example, a single-digit cross-summed value preserves many of the same properties as the original, uncross-summed number, no matter how large that number is. A cross-sum divisible by 3 means the entire number is divisible by 3, and if the cross-sum is divisible by 9, then the entire number is divisible by 9. We all know that if the last digit of a number is even, then that number is divisible by 2, and similarly, if the two-digit number formed by the last two digits is divisible by 4, then the whole number is divisible by 4. Testing for divisibility by 8 can also be done from the digits but is a bit more complicated. Take the last three digits of the number, multiply the first two of these (read as a two-digit number) by 2, then add the last digit. For example, if a number ends in 464, such as 7464, then 2×(46)+4=96=12×8. Now we know that 7464 is also a multiple of 8, which is the case; 8×933=7464.
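
These digit-based tests are easy to verify in code; the following sketch implements the rules for 3, 9, 4, and 8 as described above (function names are mine).

```python
# Divisibility checks built from digits rather than division.

def digit_sum(n: int) -> int:
    return sum(int(d) for d in str(abs(n)))

def divisible_by_3(n: int) -> bool:
    return digit_sum(n) % 3 == 0            # cross-sum divisible by 3

def divisible_by_9(n: int) -> bool:
    return digit_sum(n) % 9 == 0            # cross-sum divisible by 9

def divisible_by_4(n: int) -> bool:
    return abs(n) % 100 % 4 == 0            # the number formed by the last two digits

def divisible_by_8(n: int) -> bool:
    first_two, last_digit = divmod(abs(n) % 1000, 10)   # e.g. 464 -> 46 and 4
    return (2 * first_two + last_digit) % 8 == 0

print(divisible_by_3(123456), divisible_by_4(7464), divisible_by_8(7464))
# 123456 digit-sums to 21 (divisible by 3); 7464 ends in 64 (divisible by 4); 2*46 + 4 = 96 = 12*8
```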

10 is obvious, but 11 is very interesting. If we have the number 5025669, to test for divisibility by 11, we reverse the number, giving us 9665205. We then do an alternating cross-sum like so: 9-6+6-5+2-0+5=11, so 5025669 is divisible by 11 (5025669÷11=456879). This is interesting because 11 cross-sums to 2, and we can use the number of places used to test for divisibility as exponents of 2 (2⁰=1, 2¹=2, 2²=4, 2³=8). With these numbers, which form a binary expansion, we can apply the alternating operations - and + to get a final number, and if that number is a multiple of 11 (here, 11 itself), then the original number is divisible by 11!
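
The alternating test for 11 can be sketched directly from that description (following the reversal step given in the text):

```python
# Test divisibility by 11 via the alternating cross-sum of the reversed digits.

def divisible_by_11(n: int) -> bool:
    digits = [int(d) for d in str(abs(n))[::-1]]     # reverse the digits, as in the text
    alternating = sum(d if i % 2 == 0 else -d for i, d in enumerate(digits))
    return alternating % 11 == 0

print(divisible_by_11(5025669))   # True: 9 - 6 + 6 - 5 + 2 - 0 + 5 = 11
```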

Divisibility by 7 is also interesting because it uses recursive cross-sums. To test the number 371, we take the rightmost digit and multiply it by 2, then subtract that value from the remaining digits:

37 - (2×1) = 35

Then we do that again on the answer:

3 - (2 × 5) = -7. Since -7 is divisible by 7, the original number 371 is also divisible by 7 (7 × 53 = 371).
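
The recursive rule for 7 can be sketched the same way; the loop keeps doubling the last digit and subtracting it from the rest until the remainder is small enough to check directly.

```python
# Recursive divisibility-by-7 test: drop the last digit, subtract twice its value.

def divisible_by_7(n: int) -> bool:
    n = abs(n)
    while n >= 10:                        # reduce until a single digit remains
        n = abs(n // 10 - 2 * (n % 10))   # remaining digits minus twice the last digit
    return n % 7 == 0

print(divisible_by_7(371))   # True: 37 - 2*1 = 35, then 3 - 2*5 = -7, a multiple of 7
```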

Cross-summing is like a form of qualitative math. In all these examples, the cross-summing told us something about the quality of the number, such as how it can be divided, regardless of the quantity of the number.

If we have patterns of non-random numbers, we can use cross-summing to learn of some quality of that pattern.

For example, if we create a list of numbers that increases by 8, as in 8, 16, 24, 32, 40, 48, 56, 64, 72, etc., and then cross-sum them as follows:

…we get a resulting sequence of 8, 7, 6, 5, 4, 3, 2, 1, 9 which continually repeats itself.
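
This cycle is easy to reproduce; the sketch below prints the cross-sums of the first eighteen multiples of 8.

```python
# Reproduce the repeating 8, 7, 6, 5, 4, 3, 2, 1, 9 cycle of cross-summed multiples of 8.

def cross_sum(n: int) -> int:
    while n > 9:
        n = sum(int(d) for d in str(n))
    return n

print([cross_sum(8 * k) for k in range(1, 19)])
# [8, 7, 6, 5, 4, 3, 2, 1, 9, 8, 7, 6, 5, 4, 3, 2, 1, 9]
```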

Our original data was sequential because it was tracking the linear movement of planets, so the progression of values will always represent progressive points on a linear path, with each value being slightly larger (in this case) than the previous one. When we cross-sum these values, however, all spatial or temporal information is lost, leaving only the pattern of the relationships between the numbers. For example, 10 is larger than 9 but cross-sums to 1. Here, 1 represents the lower octave (for lack of a better term) of 10. Cross-summing essentially lowers all the numbers to the same octave. We see this in the number matrix below (left), as any consecutive sequence of digits will always repeat one of the nine patterns shown in the matrix. While this removes a lot of quantitative data, it does not remove the relationships between the data. In fact, it makes those relationships clearer, much like listening to a 10-octave piano piece in only 1 octave.

This number matrix shows the repeating pattern of every sequence of numbers that increases by the number in the left column. The chart on the right shows that the changing progression of each row in the matrix is only six stages of transformation between the straight line at the bottom (1) and the perfectly ordered line at the top (8). The line above that (9) shows the totals of each line, which happen to follow exactly the same pattern as line 8. The graph at the bottom shows these same numbers, but non-stacked and smoothed, and here you can see the symmetry of the progressive changes. This example applies to any sequence of integers.
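
A sketch that generates such a matrix (row k holding the cross-sums of the first nine multiples of k) is shown below; the stacked chart and smoothed graph described above are different views of these same rows.

```python
# Build the 9x9 matrix of cross-summed multiples: row k = digital roots of k, 2k, ..., 9k.

def cross_sum(n: int) -> int:
    while n > 9:
        n = sum(int(d) for d in str(n))
    return n

matrix = [[cross_sum(k * i) for i in range(1, 10)] for k in range(1, 10)]
for k, row in enumerate(matrix, start=1):
    print(k, row)
# row 1 is 1..9, row 8 is 8, 7, 6, ..., 1, 9, and row 9 is all 9s
```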

In our example of the series of numbers based on multiples of 8, we have two types of numbers: the quantitative values and the cross-summed values that represent their “position” or relation to other numbers within the context of a base-10 number system. In the charts below, the top-left chart shows both of these values as two dimensions and shows how the quantitative value can decrease as the cross-sum value increases. The top-right chart shows the two dimensions plotted as x and y, which shows that the difference between the diverging values is linear.

Further below, the left chart is a plot of what appears to be un-patterned data, but when we reduce it to one octave by cross-summing, we see a very clear pattern.

The point of this is to demonstrate how there can be hidden patterns in seemingly un-patterned data that has nothing to do with how it exists in time and space.

Attempts were made to replicate the patterns that appeared in the planetary data with various other inputs, such as incremental counting, the number of seconds elapsed from some moment, and a number paired with some function of that number (such as its square root), but nothing came even close to the Sun/Moon patterns. This suggests that the Fresnel pattern is not solely an inherent cross-sum pattern but a result of the source data. If that is the case, could the Fresnel pattern be a property of the movement of the Sun and the Moon, or at least of two rotating bodies in the same system?

The image above is more fun than informative because it fuels the imagination. The top left and right charts show 64 samples of the position of the Sun to 14 places. The values of each place are then plotted as one set of data. For example, if there are 3 samples of data of 1.23, 4.56, and 7.89, we plot the 1st line as 1,4,7, the 2nd line as 2,5,8 and the 3rd line as 3,6,9. Were this a live, real-time plot, we would see the outermost circle (or top-most line) changing thousands of times per second and the innermost (or bottom) changing every couple of months. If we plotted time instead of space to 14 places, the outer (or top) would change a million times a second, and the inner (or bottom) would change once every 11 days. Looking at the upper left image, it is very interesting (although probably statistically meaningless) that there is an obvious pattern in the first 6 values, and with the 7th, that pattern quickly begins to degrade.

The bottom images (of the figure above) are simply graphs where each value is a dot, plotted over the course of 3,000 samples.

The images above were created by plotting each digit of the Sun’s position (x-axis), and they show some interesting patterns worth highlighting.
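
A rough sketch of this digit-place plotting (with synthetic stand-in positions and an arbitrary number of decimal places) might look like the following; each decimal place of the position becomes its own line across the samples.

```python
# One line per digit place: series[p][k] is the p-th digit of the k-th sample.
import matplotlib.pyplot as plt

positions = [4.2430033683776855 + k * 7e-8 for k in range(64)]   # stand-in samples

digit_strings = [f"{pos:.14f}".replace(".", "") for pos in positions]
series = [[int(s[p]) for s in digit_strings] for p in range(len(digit_strings[0]))]

for p, row in enumerate(series):
    plt.plot([v + 10 * p for v in row], linewidth=0.6)   # offset each place vertically
plt.title("Digits of the Sun's position, one line per decimal place")
plt.show()
```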


What Makes These Findings Unexpected?

When my physicist friend saw these patterns, he dismissed them: “Of course you’ll get patterns. Zipf’s law guarantees it.” While partially correct, this doesn’t explain what we actually found.

The Unexpected Observations

1. Fresnel Patterns Specifically

Zipf’s law predicts some pattern will emerge, but not which pattern. Cross-summed Sun and Moon positions produce patterns matching Fresnel diffraction, the interference patterns light creates when bending around obstacles. This is not a vague similarity but a precise structural match.

When we tested alternative inputs (incremental counting, elapsed seconds, square roots, digits of π), none replicated the Sun/Moon patterns. The Fresnel structure is not an artifact of cross-summing but specific to the Earth-Sun-Moon orbital relationship.

2. Scale-Dependent Visibility

Patterns appear and disappear depending on sampling rate:

This “sweet spot” at 0.005 seconds suggests resonance with underlying orbital frequencies rather than generic mathematical artifacts.

3. Prime Number Minimization

In the combined Sun and Moon data, prime numbers (2, 3, 5, 7) consistently appeared with the lowest frequency. This is unexplained by Benford’s law, random distribution, or standard number theory.

4. Replication Failure

Simple mathematical sequences produced no Fresnel patterns. Only the actual Sun and Moon orbital data generated them. This specificity suggests we’re observing something intrinsic to celestial mechanics, not mathematical coincidence.

A Tholonic Hypothesis

The Fresnel patterns appearing in cross-summed Sun-Moon data represent the material instantiation of a fundamental tholonic negotiation between two opposing archetypal forces.

Here’s why this makes sense from the tholonic perspective:

1. Dual Opposition as Tholonic Structure

In the tholonic model, every stable pattern emerges from the negotiation between two opposing forces (Definition and Contribution) mediated by a third force (Negotiation). The Sun and Moon represent such a duality:

This is structurally identical to a tholon, and the tholonic model predicts that such structures will exhibit self-similar patterns across scales and contexts.

2. Interference Patterns as Tholonic Signatures

Fresnel diffraction is fundamentally an interference pattern where waves simultaneously cooperate (constructive interference) and compete (destructive interference). This is the essence of tholonic negotiation. The tholonic model states that:

When light creates Fresnel patterns, it’s not because light “knows” about tholons but because the tholonic structure is the archetypal pattern through which energy interacts with itself. Similarly, when the Sun and Moon positions create the same pattern, it’s because their gravitational-orbital relationship follows the same archetypal structure.

Notably, the similarity between these patterns and wood grain (as observed earlier in the microsecond-scale plots) provides compelling evidence for this interpretation. Wood grain forms through the negotiation between growth forces and environmental constraints in trees, which is another example of dual opposition creating interference patterns. The fact that celestial mechanics, optical diffraction, and biological growth all produce structurally similar patterns suggests we’re observing the same archetypal tholonic structure instantiating across radically different contexts: astronomical scale (planets), nanometer scale (light waves), and biological scale (tree growth). This cross-context consistency is exactly what the tholonic model predicts.

3. Cross-Summing as Archetypal Extraction

Cross-summing performs a remarkable operation: it eliminates quantitative magnitude while preserving qualitative relationships. In tholonic terms:

By reducing all numbers to single digits, cross-summing creates a “lower octave” representation. This is analogous to how fractal patterns maintain their structure at different scales. The fact that the Fresnel pattern survives this reduction but only for Sun-Moon data suggests we’re observing an archetypal pattern that exists independent of scale.

4. Scale Resonance and Tholonic Frequencies

The appearance and disappearance of patterns at different sampling intervals suggests we’re observing a multi-scale tholonic structure:

This is consistent with the tholonic principle that patterns exist at specific scales and that observation must match the scale of the pattern to perceive it clearly. It’s like tuning a radio: too far off frequency and you get static; at the right frequency, the signal is clear.

5. Prime Number Minimization as Tholonic Integration

The consistent minimization of prime numbers (2, 3, 5, 7) in the combined Sun-Moon data is particularly intriguing from a tholonic perspective. In the 236 pattern:

If primes represent the most fundamental, un-integrated building blocks, their minimization in the interference pattern suggests they’re being actively combined into composite numbers. This would be consistent with the tholonic view that stable systems emerge from the integration of opposing forces. The Sun-Moon negotiation doesn’t produce isolated primes; it produces integrated, composite patterns, which is evidence of successful tholonic negotiation.

6. Replication Failure as Proof of Specificity

The fact that no other input data (counting sequences, random numbers, mathematical constants) produces these patterns is perhaps the strongest evidence for a tholonic interpretation. The tholonic model predicts that specific archetypal patterns require specific structural relationships to instantiate. You can’t create a Fresnel pattern with any old data because a Fresnel pattern is the signature of a specific type of interaction: a true duality engaged in continuous negotiation.

The Sun-Moon-Earth system is such a duality:

  - Two massive bodies
  - Gravitationally bound
  - In perpetual orbital motion
  - Creating interference through their combined influence on a third body (Earth)
  - Maintaining stable patterns over vast time scales

This is not a mathematical curiosity; it’s the material instantiation of a tholonic archetype.

Implications

If this interpretation is correct, it suggests several profound implications:

  1. Archetypal patterns transcend context: The same interference pattern that describes light bending around obstacles also describes the negotiation between celestial bodies. This is exactly what the tholonic model predicts: archetypal patterns instantiate across all domains.

  2. Measurement can reveal archetypal structure: Cross-summing, by stripping away quantitative information, inadvertently reveals the qualitative archetypal pattern underneath. This suggests that other “reduction” operations might similarly reveal hidden tholonic structures.

  3. Celestial mechanics as tholonic intelligence: The Sun-Moon-Earth system isn’t just following physical laws; it’s instantiating a tholonic pattern that has its own inherent intelligence and stability. This is consistent with the tholonic view that all existence exhibits awareness and intention.

  4. Scale-specific observation: The “sweet spot” sampling rate demonstrates that patterns exist at specific scales and require scale-appropriate observation. This has implications for how we measure and understand all natural phenomena.

  5. Prime numbers as tholonic building blocks: Their consistent minimization in stable patterns suggests they represent fundamental forces that, when properly integrated through tholonic negotiation, create composite stability. This connects number theory directly to tholonic structure.

Testable Predictions

This hypothesis makes several testable predictions:

  1. Other gravitationally-bound pairs (Earth-Jupiter, binary stars, etc.) should produce similar interference patterns when their positions are cross-summed, with pattern characteristics dependent on orbital parameters.

  2. The specific sampling rate where patterns are most visible should correlate with orbital resonances in the Sun-Moon-Earth system.

  3. Single-body systems (a planet with no moon, or measuring only the Sun) should not produce coherent Fresnel patterns, as they lack the duality required for tholonic negotiation.

  4. The prime number minimization should be most pronounced when both Sun and Moon data are combined, less so when measured separately.

  5. Other methods of “archetypal extraction” (modulo operations, digit manipulation, etc.) should reveal similar underlying patterns if the tholonic hypothesis is correct.

While my physicist friend was correct that Zipf’s law guarantees some pattern will emerge, he missed the deeper question: Why this specific pattern? The tholonic answer is: because the Sun-Moon-Earth system is a material instantiation of the same archetypal structure that creates Fresnel diffraction. The pattern isn’t emerging from the mathematics; the mathematics is revealing the pattern that was always there.


  1. These tests are: monobit frequency test, block frequency test, runs test, longest run of ones test, binary matrix rank test, spectral test, non-overlapping template matching test, overlapping template matching test, Maurer’s universal statistic test, linear complexity test, serial test, approximate entropy test, cumulative sums test, random excursions test, random excursions variant test, and reverse cumulative sums test. These tests are provided by the National Institute of Standards and Technology as implemented by the program testrandom.py by Ilja Gerhardt, https://gerhardt.ch/random.php. The numbers published here are the result of the program “ent”, a pseudo-random number sequence test program by John Walker of Fourmilab Switzerland, https://en.wikipedia.org/wiki/John_Walker_(programmer)↩︎

  2. Data generated with the command “ent” with the following settings: Intel CPU; AESNI supported in instruction set; Format=Hex; model=pure; size=1000 kilobytes; XOR mode off; XOR range mode off; Output to file x; Hardware Random Seeding on; Non-deterministic mode; Reseed c_max=511; Using RdRand as nondeterministic source; Total Entropy = 0.000000; Per bit Entropy = 0.000000 %↩︎

  3. Stevens, M. (2015, September 15). The Zipf Mystery. Retrieved June 30, 2020, from https://www.youtube.com/watch?v=fCn8zs912OE. The page of this video has dozens of excellent related links↩︎

  4. https://solvemymath.com/online_math_calculator/number_theory/riemann_function/index.php or https://keisan.casio.com/exec/system/1180573439, for example.↩︎

  5. Snehal Shekatkar, The sum of the rth roots of first n natural numbers and new formula for factorial, 2012, arXiv:1204.0877v2 [math.NT], https://arxiv.org/abs/1204.0877v2↩︎

  6. Saad-Cook, J., Ross, C., Holt, N., & Turrell, J. (1988). “Touching the Sky: Artworks using Natural Phenomena, Earth, Sky and Connections to Astronomy”. Leonardo 21(2), 123-134. https://www.muse.jhu.edu/article/600628. Charles Ross, the creator, describes: “1972. I had placed this simple lens set-up on the roof so the sun would burn a path across a wooden plank as the day progressed. The idea was to collect a portrait of the weather each day. As the work progressed, I noticed that the burn’s curvature was changing with the seasons. We took photos of the burns and placed them end to end following their curvature to see what a year’s worth looked like. The sum of days generated a double spiral figure. At first, it did not make any sense-this primitive lens set-up was producing a complex spiral shape. A few of the astronomers I showed it to said, “Well, it must be coming from somewhere, but we have no idea what it is”. Most of the scientists insisted that there had to be some anomaly in the set-up and that the shape had nothing to do with astronomy -just some weirdness in the lens. In reality, it made no difference at all if the lens faced one way or the other as long as it faced generally toward south. The elements of the spiral are in sunlight itself; it was an archetypal image falling from the sky. I finally contacted Kenneth Franklin at the Hayden Planetarium. He directed me to the Naval Observatory, where LeRoy Doggett, an astronomer with the Nautical Almanac office…”. Authors note: This Sun Spiral was pieced together manually from many separate samples and is an approximation of the Sun’s path.↩︎

  7. Nishiyama, Yutaka. “A Study of Odd- and Even-Number Cultures.” Bulletin of Science, Technology & Society 26, no. 6 (2006): 479-84. doi:10.1177/0270467606295408.↩︎