A graphic describing the decomposition of a module

intuition

Conceptually, each module within a branch represents a type.

Modules inherit the narrative design of their branches, and as a consequence many such modules depend on previous ones. Given this, the design of a module within this library should facilitate composability of its code with the code of other modules. This is achieved by requiring the modules to adhere to a template design.

I have based this design on philosophical ponderings regarding the nature of mathematical spaces and why they are so successful in their utility across the natural sciences. For example, ℝ³ as a vector space is invaluable in modelling many 3-dimensional real world objects: spheres, cubes, tori, dodecahedrons, and so on. What is it about this space—what are the properties of such spaces in general—that makes it and others like it so reusable?

theory

If you view a type as a mathematical set, then a type's success in its ability to model many applications is largely based on how easily you can create subtypes (subsets), and how easily you can navigate those subtypes. I make the claim that the following decomposition of a type is sufficient for such success, and as a consequence offers the specification we seek for the template design of our modules:

module template

  1. interface
  2. In the world of computing, types are constructive. This is to say they have a grammar which expresses their construction. This expressed construction is then the starting point for navigating and subtyping. It forms a baseline user interface to our type.

    1. structure
    2. In particular, the grammar used to construct a type generates an underlying structural pattern implied by the shape of the grammar itself. ℝ³ is a triple product (ℝ×ℝ×ℝ) of ℝ. Its constructive grammar is that of a product, so to access its x-axis, y-axis, or z-axis is to access the respective operand of the product.

      Implication: All such constructive code is maintained in the structure division of a given module.
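
      For instance, here is a minimal sketch of what such a structure division might contain, assuming a hypothetical R3 type (the names are illustrative, not this library's actual code):

        // hypothetical structure division: R3 constructed as a product of three real components.

        struct R3
        {
                double x; // first operand of the product
                double y; // second operand
                double z; // third operand
        };

        // accessing an axis is accessing the respective operand of the product:

        double x_axis(const R3 & p) { return p.x; }
        double y_axis(const R3 & p) { return p.y; }
        double z_axis(const R3 & p) { return p.z; }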

    3. navigator
    4. Not only does the constructive grammar imply an underlying structure, but the way in which one navigates the grammatical components also offers a natural and universal coordinate system as a default means of navigating any subtypes within our module. With our ℝ³ example, our universal coordinate system is the standard cartesian coordinate system.

      This particular example points out another consideration: ℝ³ also admits spherical as well as cylindrical universal coordinate systems. This is to say no assumption is made that there must be a unique universal system, only that each is universal. By this I mean no instance of the whole type is inaccessible by the given system of navigation.

      Implication: All such navigational code is maintained in the navigator division of a given module.
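
      As a rough illustration, a navigator division for the hypothetical R3 type above might expose both cartesian and spherical access to the same instances (again, illustrative names only):

        #include <cmath>

        struct R3 { double x, y, z; };

        // cartesian navigation: index the operands of the product directly.

        double axis(const R3 & p, int n) { return n == 0 ? p.x : n == 1 ? p.y : p.z; }

        // spherical navigation: the same point reached through (radius, polar, azimuth).

        R3 from_spherical(double r, double theta, double phi)
        {
                return R3
                {
                        r * std::sin(theta) * std::cos(phi),
                        r * std::sin(theta) * std::sin(phi),
                        r * std::cos(theta)
                };
        }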

  3. perspective
  4. Assuming we now have a universal interface for navigation, we focus on being able to subtype. Through the lens of set theory this means we need a language to create subsets. In the broadest sense possible, the starting point for being able to group and regroup objects is the ability to compare them.

    Intuitively, if we can compare two instances of a type, we can say whether they're similar enough to group together or not, but only from the point of view of this particular comparison: Different perspectives of comparison will group instances together differently. Looking at it this way, there are three levels of complexity (of decreasing precision) when it comes to means of comparison:

    1. identity
    2. To compare for identity is to test for equality. If two objects are of the same type, you should be able to tell if they're equal or not.

      Implication: All such identity comparison code is maintained in the identity division of a given module.
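
      A minimal sketch of such an identity division for the hypothetical R3 type (illustrative only):

        struct R3 { double x, y, z; };

        // identity comparison: exact equality of corresponding components.

        bool equal(const R3 & p, const R3 & q)
        {
                return p.x == q.x && p.y == q.y && p.z == q.z;
        }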

    3. proximity
    4. Two objects are either equal or they're not. If they're not equal, then what? The next level of comparison is when you're no longer interested in perfect equality and are focused on nearness. In mathematics, there's a whole branch devoted to concepts of nearness called topology. Metric spaces derive from this, and so does the computational approximate nearest neighbour problem.

      Implication: All such proximity comparison code is maintained in the proximity division of a given module.
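
      For example, a proximity division might supply a metric together with a nearness predicate. A sketch, again assuming the hypothetical R3 type:

        #include <cmath>

        struct R3 { double x, y, z; };

        // euclidean metric on R3.

        double distance(const R3 & p, const R3 & q)
        {
                double dx = p.x - q.x, dy = p.y - q.y, dz = p.z - q.z;

                return std::sqrt(dx*dx + dy*dy + dz*dz);
        }

        // nearness: within a given tolerance of each other.

        bool is_near(const R3 & p, const R3 & q, double epsilon)
        {
                return distance(p, q) < epsilon;
        }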

    5. functor
    6. If two objects are not equal and furthermore there is no preferred measure of comparison between them, we weaken our assumptions. It's possible the two objects aren't even of the same type, in which case we compare them by means of the mappings that can exist between them. In category theory (a foundation to type theory) such a mapping between types is called a functor.

      Implication: All such functor comparison code is maintained in the functor division of a given module.
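
      As an illustrative sketch, a functor division might hold componentwise maps from the hypothetical R3 type into other types altogether (names and signatures are assumptions for the example):

        #include <array>

        struct R3 { double x, y, z; };

        // functor comparison: relate R3 to another type by a structure preserving map.
        // here, a componentwise map from R3 into a std::array of a different scalar type.

        template<typename T, typename F>
        std::array<T, 3> map(F f, const R3 & p)
        {
                return { f(p.x), f(p.y), f(p.z) };
        }

        // usage: map an R3 instance into an array of truncated integers.
        // auto truncated = map<int>([](double v){ return static_cast<int>(v); }, R3{1.4, 2.6, 3.1});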

  5. model
  6. We assume a universal system to navigate the instances of our type. We also assume we are able to compare the instances of a type—not only to other instances of the same type but to instances of different types altogether. We are finally able to group and regroup our type instances to form subtypes. With that said, in practice the true signature of a type is the collection of functions it's equipped with.

    Again, with our ℝ³ example we might have general purpose functions for the whole space, but if we subtype to a sphere we often specialize a different inventory of functions just for that sphere. If such functions are specialized just for that substructure, they implicitly form a correspondence with it and so in many cases can act as an alternative to its original definition.
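
    To make this concrete, here is a rough sketch of what such a specialized inventory might look like for a sphere subtype of the hypothetical R3 type (illustrative names only):

      struct R3 { double x, y, z; };

      // a sphere as a subtype of R3: the points at a fixed radius from a centre.

      struct sphere
      {
              R3     centre;
              double radius;
      };

      // functions specialized to the sphere substructure rather than the whole space:

      constexpr double pi = 3.14159265358979323846;

      double surface_area(const sphere & s) { return 4.0 * pi * s.radius * s.radius; }

      bool contains(const sphere & s, const R3 & p)
      {
              double dx = p.x - s.centre.x, dy = p.y - s.centre.y, dz = p.z - s.centre.z;

              return dx*dx + dy*dy + dz*dz <= s.radius * s.radius;
      }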

    Given this reflection—that we know in advance function inventories for subtypes often exist—we thus allocate space for them here within this module partition. Any such model divisions will be context specific and so are otherwise generally difficult to predict in advance. With that said, I have made space for a known pattern:

    1. printer
    2. As a practical consideration, types (regardless of which branch their module belongs to) are often printed to an alternative form as a diagnostic or as human readable output. Technically, by this definition one could categorize printers as functors (they map across types). I have chosen instead to categorize them here with the intuition that they better help to model subtypes rather than to compare their instances.

      Implication: All such printer code is maintained in the printer division of a given module.

      [note:] Keep in mind, for any given type no assumption can be made that graphic modules yet exist to actually print its instances to a screen (the most common case for printing). If one desires to maintain narrative purity within their own library, the printer functions in these divisions would exist in anticipation of such faculties: In practice they would likely preprocess and hand off the appropriate information to the eventual screen or other printer.

      [note:] In the case of this library, as it is built on top of C++ and has print functions available already at a lower level (grammatically), I have allowed these printer divisions full access to screen printing.
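
      Given that allowance, a printer division for the hypothetical R3 type might be as simple as the following sketch (illustrative, not this library's actual printers):

        #include <cstdio>

        struct R3 { double x, y, z; };

        // printer: render an R3 instance in a human readable form for diagnostics.

        void print(const R3 & p)
        {
                std::printf("(%f, %f, %f)\n", p.x, p.y, p.z);
        }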

security

Premature optimization is the root of all evil. —Donald Knuth

Truthfully this opening quote is problematic, and probably overused in public programming discourse. In any case, it still offers enough wisdom here that it's worth perpetuating.

Security is a subtle concept when relating it back to other design considerations such as composability or optimization. On the one hand, modern technology within society has shown that security needs to be built into every level and layer of code. It needs to be privileged higher in the value system of any specified design, more so than most (if not all) other considerations. On the other hand, when it actually is implemented within the code at every level and layer, it acts as a tax or a toll. It's a computational expense, and if not carefully done it furthermore reduces the composability of the code it's meant to protect. At first glance it appears to be a trade-off.

As a solution to this potential problem, I have divided each module division into two parts:

  1. semiotic
  2. This version of the code disregards safety and security considerations altogether. It privileges minimalism, composability, and entropy (reusability). Interestingly enough, if this part is done well, as a side effect many security issues are prevented to begin with.

  3. media
  4. This version of the code privileges safety. Effectively, you can think of it as a wrapper for the unsafe, insecure semiotic version. It's like having a toll to get into the building, but once you're in you're allowed to move around freely.

To reiterate and clarify this security feature: By intention, if others were to use this library they would be tagged as either general users or architect users. General users would only have access to the media versions of this code, while architect users (who build the library itself) would have access to the semiotic versions as well. Architects would compose and build optimized data structures and algorithms from the semiotic space, and then wrap their final forms respectively in a security blanket for general use. General users would then use the media versions to assist in their application code projects.
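
As a schematic sketch of this split (the names and signatures are assumptions for illustration, not this library's actual code), a semiotic version might omit all checks while its media counterpart wraps it with validation:

  #include <cstddef>
  #include <optional>

  // semiotic version: minimal and composable, assumes its inputs are already valid.

  inline int semiotic_at(const int * array, std::size_t index)
  {
          return array[index]; // no bounds check: intended for architect use only.
  }

  // media version: the safe wrapper offered to general users.

  inline std::optional<int> media_at(const int * array, std::size_t size, std::size_t index)
  {
          if (array == nullptr || index >= size) return std::nullopt; // pay the toll once at the door.

          return semiotic_at(array, index);
  }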