
Economic Model Defined 
An economic model is a theoretical construct representing component processes by a set of variables or functions, describing the logical relationships between them. If one has studied traditional or market-based economic modeling, a great deal of time is often spent on things such as price trends, behavioral patterns, inflation, the labor market, currency fluctuations, and so forth.

Rarely, if ever, is anything said about public or ecological health. Why? Because the market is life-blind and decoupled from the actual science of life support and sustainability. It is a proxy system based only around the act of exchange and exchange preferences.

Therefore, the best way to think about a NLRBE is not in the traditional terms of any form of market-oriented economic model common today. Rather, this model can best be thought about as an advanced production, distribution and management system, which is democratically engaged by the public, through a kind of “participatory economics”.

This type of approach facilitates input processes, such as design proposals and demand assessment, while also filtering all actions through what we could call sustainability and efficiency protocols. These protocols are the basic rules of industrial action set by natural law, not human opinion. As noted, neither of these two interests is structurally inherent in the capitalist model.

Goals, Myths & Overview

All economic systems have structural goals and oftentimes these goals are not exactly apparent in the theories set forward in principle. The market system and a NLRBE have very different structural goals.

-Market capitalism's structural goal is growth and maintaining rates of consumption high enough to keep enough people employed at any given time. Likewise, employment itself requires a culture of real or perceived inefficiency and that often means the preservation of scarcity in one form or another.

-A NLRBE's goal is to optimize technical efficiency and create the highest level of abundance possible, within the bounds of Earthly sustainability, seeking to meet human needs directly.

That noted, there are a number of assumptions, myths and confusions that have arisen over time that are worth addressing upfront. The first is the idea that this model is “centrally planned”. What this assumes, based on historical precedent, is that an elite group of people will make the economic decisions for the society.

A NLRBE is not centrally planned. It is a Collaborative Design System (CDS). It is based entirely upon public interaction, facilitated by programmed, open-access systems, that enable a constant, dynamic feedback exchange that can literally allow for the input of the public on any given industrial matter, whether personal or social. 
Given this, another outcry is “but who programs the system?”, which once again assumes that an elitist interest could exist behind the mediating software programs themselves (as will be expanded upon later in this essay). The answer, as odd as it may sound, is everyone and no one. The tangible rules of the laws of nature, as they apply to environmental sustainability and engineering efficiency, are an objective frame of reference. The nuances may change to some degree over time, but the general principles of efficiency and sustainability remain, as they have been deduced by basic physics, along with several thousand years of recorded history by which we have been able to recognize basic, yet critical patterns in nature.

Moreover, the actual programming utilized by this interactive system would be available in an open source platform for public input and review. In fact, the system is predicated entirely upon the intelligence of the “group mind” and the open source/open access sharing virtue will help bring all viable interests to the surface for public consideration, in an absolutely transparent manner.

Another confusion surrounds a concept that has, to many, become the defining difference between capitalism and almost all other historically proposed social models. That has to do with whether the “means of production” is privately owned or not. In short, the means of production refers to the non-human assets that create goods, such as machinery, tools, factories, offices and the like. In capitalism, the capitalist owns the means of production, by historical definition.

There has been an ongoing argument for a century that any system that does not have its means of production owned as a form of private property, using currency as the information mechanism, is not going to be as economically efficient as one that does. This, as the argument goes, is because of the use of the price mechanism.

Price, to its credit, has the ability to create exchange value amongst virtually any set of goods due to its divisibility. This creates a feedback mechanism that connects the entire market system in a certain, narrow way. Price, property and money work together to translate subjective demand preferences into semi-objective exchange values. The notion of “semi” is employed here because it is a culturally relative measure only, absent almost every factor that gives true technical quality to a given material, good or process. 
Arguably, the only tangible technical data that price embodies, crudely, relates to a resource's 'scarcity' and the 'labor energy/complexity' put into the creation of a given good. Keep this in mind, as these two value variables will also be addressed again later in this essay with respect to non-price oriented calculation.

That all noted, the reasonable question becomes: is it possible to create a system that can more efficiently facilitate feedback with respect to consumer preference, demand, labor value and resource or component scarcity, without the price system, subjective property values or market exchange? The answer is yes. The modern solution is to completely eliminate exchange and create a direct control and feedback link between the consumer and the means of production itself. The consumer actually becomes part of the means of production and the industrial complex as a whole becomes a tool that is accessed by the public, at will, to generate goods.

To illustrate this, most today likely own a simple paper printer connected to a home computer. When a file is sent to print from the computer, the user is in control of a miniature version of a means of production. Likewise, in some cities today, there are now 3D printing labs, where people in the community can send their 3D design and use these machines to print what they need in physical form. The model being presented here is a similar idea. The next step in this scaling process is the creation of a strategically automated industrial complex, localized as much as possible, which is designed to produce, through automated means, the average of everything any given region has found demand for. As will be described, this is very feasible given the current state of technology and the ephemeralization trends at hand.

Imagine, for example, a clothing store, except that it is not organized like a "store" as currently understood. It is a multi-purpose textile-printing house. You find the design you are interested in online, along with the materials you prefer and other customizations, and you print that article of clothing “on-demand” at that facility. Consider for a moment how much storage space, transport energy, and overrun waste is eliminated by this approach if virtually everything could be created on-demand, by automated systems which can continually produce a greater variety of goods from increasingly smaller manufacturing configurations.

In truth, the real fallacy of this “private ownership of the means of production” objection is its cultural lag. Today, industry is witnessing a merger of capital goods, consumer goods and labor power. Machines are taking over human labor power, becoming capital goods, while also ever reducing in size to become consumer goods. The result is an increasingly smaller and more optimized industrial complex that can do more and more with less and less.

It is also worth mentioning that labor automation is now making the historically notable 'labor theory of value' increasingly moot as well. Today, the labor energy that goes into a given good, while still a factor for process recognition, does not have much of a quantifiable correlation anymore. Today, machines now make and design machines. While the initial creation of a machine might require a good deal of human planning and initial construction at this time, once set in motion, there is a constant decrease in that labor value transference over time.

Structure and Processes

As will be described in detail by section, Figure 1 shows the linear schematic of the industrial process, moving from design to production to distribution and recycling. Figure 2 shows how an optimization of such efficiency can be considered from a mathematical point of view, as a minimization or maximization of some functional. Because we are talking about efficiency, we can consider the problem as a maximization of the production function P (P → max). Figure 3 is a table of symbols and descriptions, as will be used in the following explanations. It is important to note that not all attributes will be covered in this text. The purpose of this essay, and the formulas suggested, is to give a starting point for calculation, highlighting the most relevant, overarching attributes for consideration.
A full algorithmic calculation of this nature, taking into account all related sub-processes in real-life terms, would require an enormous text/programming treatment and will likely occur in a future edition of this text's appendix, as an ongoing project development.

Collaborative Design Interface 
The starting point for interaction in a NLRBE is the CDI, or collaborative design interface. The CDI could abstractly be considered the “new market”, or the market of ideas and designs. Design is the first step in any production interest and this interface can be engaged by a single person; it can be engaged by a team; it can be engaged by everyone. It is open source and open access and it would come in the form of an online web interface.
The notion of “market” is expressed here not to invoke the notion of trade, but rather the notions of sharing and group decision-making. As with the traditional sales market, there is a swarm type of behavior which makes decisions over time, as a group whole, with respect to what goods will develop (demand) and what goods will perish (lack of demand). In a certain sense, this democratic process is embraced in a NLRBE, but by different means.
Moreover, all submitted designs, in creation or deemed complete, are stored in an open access, searchable database. This database makes all designs available for others to use or build upon. In this way, it is similar to a traditional goods catalog commonly found today, except it contains digital designs that can be sent into production at any time, on demand. 
This design creation and proposal system is how demand itself is assessed. Instead of traditional advertising and the unidirectional consumer good proposal system - where companies work to persuade the consumer as to what they should buy, with the public mostly going with the flow, favoring or not favoring a company's pitched good, component or feature by purchase or not - this system works in an opposite, more involved and democratic manner. 
In this new, open source type design approach, the entire global community has the option of presenting ideas for everyone to see, weighing in on and building upon designs, harnessing the power of collective experience and global knowledge. 
The mechanism of the CDI would come in the form of an interactive interface, such as we see commonly today with computer-aided design (CAD) or computer-aided engineering (CAE) software. In short, these programs are able to digitally create and represent any given product design, containing all information as to how it should be made in final, physical manufacturing.

As an aside, many considering the educational requirements to engage such an interface, might be concerned about use-complexity. Naturally, the more dedicated designer will develop the skills needed to whatever degree interested while, for the more casual user, different degrees of interface complexity and skill orientation can be utilized. 
This more user-friendly interfacing can develop in a similar fashion to how personal computers transitioned from complex proprietary coding interfaces with manually input instructions, to the now ubiquitous, simple graphic interface icon system, which allows users to operate more intuitively. Future CAD/CAE type programs will likely evolve in the same way, making the interactive process more accessible.

In many cases, as the database is always populated with current, already existing designs, the practice will be to build upon others' work. For example, if an engineer is interested in the optimization of a cell phone, they have the option of building upon any existing phone product design in the database, rather than starting from scratch.

The benefit of this cannot be emphasized enough as a collaborative platform. Rather than limit the design input to, say, a boardroom of engineers and marketers, as is common practice today, literally millions of minds can be brought together to accelerate any given idea in this approach. This new incentive system also ensures everyone interested in the good will receive exactly what everyone else is likely to receive in its advanced optimization states, where personal interest becomes directly tied to societal interest.

Also, given the patterns today, likely not everyone would want or need to be a designer. Many people would be satisfied enough by what had been set in motion already by others, with perhaps minor customization along the way. Today, a very small percentage of the population actually creates and engineers the dominant technology and goods we use; and this specialization may naturally continue in the future to some degree, even though it is to the advantage of everyone if more minds came together. If the educational system is oriented away from rote learning and its antiquated basis that originated in the 19th-century social order, we could see an explosion of input and creativity.

All that understood, an incredibly important component of these design and engineering programs today is how they can now incorporate advanced physics and other real world, natural law properties with the proposed design for testing. In other words, the good isn't just viewable in a static visual model with noted properties, it can actually be tested right there, virtually, to a relevant degree.

For instance, all new automobile designs today, long before they are physically built, are run through complex digital testing processes that greatly assist in design integrity. Over time, there is no reason to believe that we will not be able to digitally represent, and set in motion for testing, almost all known laws of nature, applying them in different contexts, virtually.

Optimized Efficiency Standards:

Efficiency standards are standards by which a given design must conform. This evaluation will be calculated automatically, or algorithmically, by the CDS's programming. This can also be thought of as a filtering process.

In short, any proposed design will be digitally filtered through a series of sustainability and efficiency protocols which relate not only to the state of existing resources, but also to the current performance of the total industrial system.

These would include the following “efficiency standards”.

a) Strategically Maximized Durability

b) Strategically Maximized Adaptability

c) Strategic Standardization of Genre Components

d) Strategically Integrated Recycling Conduciveness

e) Strategic Conduciveness for Labor Automation 
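To make this filtering notion more concrete, a minimal software sketch of such a pass follows. The specific checks, field names and threshold values here are illustrative assumptions of this sketch only; the actual protocols would be derived dynamically from current technical data, as described below.

```python
# Hypothetical sketch: run a proposed design through the five efficiency
# standards in sequence. Checks and thresholds are illustrative assumptions.
EFFICIENCY_STANDARDS = {
    "durability": lambda d: d["durability_score"] >= 0.8,
    "adaptability": lambda d: d["replaceable_components"] / d["total_components"] >= 0.9,
    "standardization": lambda d: d["nonstandard_components"] <= 2,
    "recycling_conduciveness": lambda d: d["recyclable_mass_fraction"] >= 0.95,
    "automation_conduciveness": lambda d: d["human_labor_stages"] == 0,
}

def filter_design(design):
    """Return the list of standards a proposed design fails (empty = passes)."""
    return [name for name, passes in EFFICIENCY_STANDARDS.items()
            if not passes(design)]

# An illustrative design proposal submitted through the CDI:
chair = {
    "durability_score": 0.9,
    "replaceable_components": 9,
    "total_components": 10,
    "nonstandard_components": 1,
    "recyclable_mass_fraction": 0.97,
    "human_labor_stages": 1,
}
failures = filter_design(chair)  # flags only the automation standard
```

In this sketch the design is not rejected outright; the returned list is the feedback that tells the designer which standards still need attention.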

As per Figure 4, design efficiency is one of the main factors that can affect the overall efficiency of the manufacturing and distribution process. This design efficiency depends on several key factors, which can be called current efficiency standards ei. Here the index i corresponds to some particular standard.
Each standard will be generally explored as follows, expanding in certain cases with respect to the symbolic logic associated, for the sake of clarity.

a) ‘Strategically Maximized Durability’ means to make the good as strong and lasting as relevant. The materials utilized, comparatively assuming possible substitutions due to levels of scarcity or other factors, would be dynamically calculated, likely automatically by the design system, to be most conducive to an optimized durability standard. 
This durability maximization can be considered as a local optimization issue. It can be analyzed by introducing the factors x1, …, xn which affect it, seeking td = f(x1, …, xn) → max, where xi* are some optimal values of the factors.
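This local optimization can be sketched as a simple exhaustive search over a discretized factor space. The factor names and the toy durability model below are illustrative assumptions; in practice f would be derived from material science data.

```python
from itertools import product

# Hypothetical sketch: durability td as a function of design factors,
# maximized by exhaustive search. Factor names and the form of f are
# illustrative assumptions only.
def durability(wall_thickness_mm, alloy_hardness):
    """Toy durability model: thicker walls and harder alloys last longer,
    with diminishing returns on thickness."""
    return alloy_hardness * (1 - 1 / (1 + wall_thickness_mm))

thickness_options = [1.0, 2.0, 4.0]   # candidate values of factor x1
hardness_options = [0.5, 0.7, 0.9]    # candidate values of factor x2

best = max(product(thickness_options, hardness_options),
           key=lambda p: durability(*p))
# best holds the optimal factor values x* for this toy model
```

A real system would substitute scarcity-aware material data for the toy function, so the same search also respects substitution constraints.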

b) ‘Strategically Maximized Adaptability’ means the design allows the highest possible flexibility for replacing component parts. In the event a component part of a good becomes defective or out of date, the design facilitates the easy replacement of such components to maximize full product life span, always avoiding the need to replace the good as a whole.

c) 'Strategic Standardization of Genre Components’ means all new designs either conform to existing components or replace those which are outdated due to a lack of comparative efficiency. This logic should not only apply to a given product; it should apply to the entire good genre, however possible.

The aim is to minimize the total number of genre components Nc. In other words, the standardization of the process will enable the possibility of lowering Nc to its possible minimum (Nc → min).
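This count can be made concrete by treating each design as its set of component types and taking the union across the genre. The component names below are illustrative assumptions.

```python
# Hypothetical sketch: Nc as the count of distinct component types used
# across a good genre. Standardizing on shared parts lowers Nc.
def genre_component_count(designs):
    """Nc: number of distinct component types across all designs in a genre."""
    return len(set().union(*(set(d) for d in designs)))

# Two phone designs before standardization (illustrative part names):
before = [
    {"screen_a", "battery_x", "case_1"},
    {"screen_b", "battery_y", "case_2"},
]
# The same genre after standardizing on shared screen and battery parts:
after = [
    {"screen_std", "battery_std", "case_1"},
    {"screen_std", "battery_std", "case_2"},
]
nc_before = genre_component_count(before)  # six distinct components
nc_after = genre_component_count(after)    # four after standardization
```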

d) 'Recycling Conduciveness' means every design must conform to the current state of regenerative possibility. The breakdown of any good must be anticipated in the initial design and allowed for in the most optimized way. 

e) 'Strategic Conduciveness for Labor Automation' means that the current state of optimized, automated production is also taken into account, seeking to refine the design to be most conducive to production with the least amount of complexity, human labor or monitoring. Again, we seek to simplify the way materials and production means are used so that the maximum number of goods can be produced with the least variation of materials and production equipment.

This is denoted by HL (human labor) and AL (automated labor). The aim is to minimize the human interaction with the production process.

This can be written as: HL = f(y1, …, yn) → min.

Using this equation, we could also write a simpler condition: HL/AL → min,

where yi are factors that influence human and automated labor.

So, returning to Figure 4, this “Optimized Design Efficiency” function can be described as Ed = f(td, A_design, cr, Nc, HL), where td is durability, A_design is adaptability, cr is recycling conduciveness, Nc is the minimum number of genre components and HL is human labor.
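As a rough illustration of how such a composite function might be computed, the sketch below combines the five attributes into a single score. The normalization bounds and the equal weighting are assumptions of this sketch, not of the model itself.

```python
# Hypothetical sketch of the "Optimized Design Efficiency" function
# Ed = f(td, A_design, cr, Nc, HL). Equal weighting and the normalization
# caps nc_max and hl_max are illustrative assumptions.
def design_efficiency(td, a_design, cr, nc, hl, nc_max=20, hl_max=10):
    """Higher td, A_design and cr raise efficiency; higher Nc and HL lower it.
    td, a_design and cr are assumed pre-normalized to [0, 1]."""
    nc_term = 1 - min(nc, nc_max) / nc_max   # fewer genre components is better
    hl_term = 1 - min(hl, hl_max) / hl_max   # less human labor is better
    return (td + a_design + cr + nc_term + hl_term) / 5

ed = design_efficiency(td=0.9, a_design=0.8, cr=1.0, nc=4, hl=0)
```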

The Industrial Network

The industrial network refers to the basic network of physical facilities that are directly connected to the design and database system just described. The system connects servers, production facilities, distribution facilities and recycling facilities. (Figure 5)

Design Servers:

These computer servers connect the design database to the designers/consumers, while constantly being updated with relevant physical data to guide the process of product creation in the most optimized and sustainable way.

As noted, the engaged CDI (or collaborative design interface) is an open source program that facilitates collective, computer-aided design, running each step through the set of efficiency and sustainability filters (i.e. Figure 4) which assure optimized design. These designs are tested in real time, digitally, and in most cases, the good will exist in whatever state online for others to obtain, on demand, or for use as a preliminary model upon which new ideas can be built.

Production Facilities:

These structures facilitate the actual manufacturing of a given design. These would evolve as automated factories that increasingly are able to produce more with fewer material inputs and fewer machine configurations. Again, if the interest existed to consciously overcome unnecessary design complexities, we can further this efficiency trend with an ever-lower environmental impact and ever lower resource use per task, while maximizing our abundance producing potential.

The number of production facilities, whether homogeneous or heterogeneous, would be strategically distributed topographically based on population statistics, no different than how grocery stores today try to average distances between pockets of people around neighborhoods. This is the “proximity strategy”, which will be revisited in this essay.

Distribution Facilities:

Distribution can either occur directly from the production facility, usually in the case of an on-demand, one-off production for custom use, or be sent to a distribution library for public access en masse, based on regional demand interest.

Some goods will be conducive to low demand, custom production and some will not. Food is the easiest example of a mass production necessity, while a personally tailored piece of furniture would come directly from the manufacturing facility once created.

It is worth reiterating that regardless of whether the good is classified to go to a library or directly to a user, this is still an 'access system'. In other words, at any time, the user of the custom or mass produced good can return the item for reprocessing or restocking.

Recycling Facilities:

Recycling Facilities would likely exist as part of the production facility, allowing access to returned parts for updating and reprocessing. As noted in the design protocol, all goods have been pre-optimized for 'conducive recycling'. The goal here is a zero-waste economy. Whether it is a phone, a couch, a computer, a jacket, or a book, everything goes back to a recycling facility, likely the point of origin, which will directly reprocess any item as best it can.

Of course, an item may be returned elsewhere if needed; the integrated and standardized production and recycling centers, having been conceived of as a complete, compatible and holistic system, would be able to handle returned goods optimally, as is not the case today.

Global Resource and System Management:

These four facilities are also connected, to one degree or another, to a Global Resource Management (GRM) network, which is a sensor and measurement system that provides feedback and information about the current state of raw materials and the environment.

Resource Management, Feedback & Value

As noted, this computer-aided design and engineering process does not exist in a vacuum; it does not process designs with no input as to the current state of the planet and its resources. Connected to the design process, literally built into the noted “Optimize Design Efficiency” function, is dynamic feedback from an Earth-wide accounting system that gives data about all relevant resources which pertain to all productions.

To whatever degree technically possible, all raw materials and related resources are tracked and monitored, in as close to real time as possible. This is mainly because maintaining equilibrium with the Earth's regenerative processes, while also working strategically to maximize the use of the most abundant materials, while minimizing anything with emerging scarcity, is a critical efficiency calculation. Again, this is, in part, the purpose of the Global Resource Management system mentioned prior.

As far as “value” calculation, perhaps the two most important measures, which will undergo constant dynamic recalculation through feedback as industry unfolds, is the level of (a) 'scarcity' and the degree of (b) 'labor complexity'.

(a) 'Scarcity value' can be assigned a numerical value, from 1-100. 1 would denote the most severe scarcity with respect to the current rate of use and 100 the least severe. 50 would be the steady-state dividing line. The scarcity value of any given resource would exist at some value along this line, dynamically updated by the Global Resource Management network.

For example, if the use of wood passes the steady state level of 50, which would mean consumption is currently surpassing the Earth's natural regeneration rate, this would trigger a counter move of some kind, such as the process of 'material substitution' or finding a replacement for wood in any future productions.
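A minimal sketch of this scale and its trigger follows. The text specifies only the 1-100 scale, the steady-state line at 50 and the substitution trigger; the particular mapping from consumption and regeneration rates to a value is an illustrative assumption.

```python
# Hypothetical sketch of the 1-100 scarcity value. 50 is the steady state:
# consumption equal to the regeneration rate. The mapping below is an
# illustrative assumption.
def scarcity_value(consumption_rate, regeneration_rate):
    """Map the consumption/regeneration ratio onto the 1-100 scale.
    A ratio of 1.0 maps to 50; heavier use pushes the value below 50."""
    ratio = consumption_rate / regeneration_rate
    return max(1, min(100, round(50 / ratio)))

def needs_substitution(value):
    """Below the steady-state line, trigger the material-substitution process."""
    return value < 50

# Wood consumed 20% faster than it regenerates (illustrative rates):
wood = scarcity_value(consumption_rate=120.0, regeneration_rate=100.0)
```

With these rates, the value falls below 50 and the Global Resource Management network would trigger a counter move such as substitution.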

As far as a comparative evaluation, in a market system the price mechanism is used to decide which material is more cost efficient, assuming a given price will have already accounted for relevant technical information or, in this case, the issue of scarcity. 
This new approach, rather than use price to compare or assess value, accounts for a given technical quality directly by a comparative quantification. In the case of scarcity concerns, it is best to organize genres or groups of similar use materials and quantify, to the highest degree possible, their related properties and degrees of efficiency for any given purpose. Then, a general numerical value spectrum is applied to those relationships.

For example, there is a spectrum of metals that have different efficiencies for electrical conductivity. These efficiencies can be physically quantified and then compared by value. So, if copper, a conductive metal, goes below the 50 value of equilibrium regarding its scarcity, calculations are triggered by the management program to compare the state of other conductive materials, their scarcity level and their efficiency level, preparing for substitution.
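A sketch of that comparison follows. The conductivities are expressed relative to copper (roughly in line with published figures for aluminium and silver); the scarcity values and the scoring rule are illustrative assumptions.

```python
# Hypothetical sketch of comparative substitution within a genre of
# conductive metals. Scarcity values are illustrative assumptions.
conductors = {
    # material: (conductivity relative to copper, scarcity value, 1-100 scale)
    "copper":    (1.00, 42),   # below 50: over-consumed, substitution check
    "aluminium": (0.61, 78),
    "silver":    (1.05, 15),
}

def best_substitute(materials, min_scarcity=50):
    """Among materials at or above the steady-state line, pick the most
    conductive one; return None if nothing qualifies."""
    viable = {m: props for m, props in materials.items()
              if props[1] >= min_scarcity}
    if not viable:
        return None
    return max(viable, key=lambda m: viable[m][0])

substitute = best_substitute(conductors)
```

Here silver, though the better conductor, is excluded by its own scarcity, so the program prepares aluminium as the substitute.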

This is just one example and naturally this type of reasoning would get extremely complicated depending on the material and purpose problems posed. However, that is exactly why it is calculated by machine, not people. The human mind, either singly or organized into large groups, simply cannot process such data effectively. Also, it is worth pointing out that this type of direct value calculation, based around purpose, conduciveness and sustainability, dramatically eclipses the price mechanism when it comes to true resource awareness and intelligent resource management in calculation.

(b) Likewise, “labor complexity” and its assessment simply means estimating the complexity of a given production and drawing a numerical value based on the degree of process complexity. Complexity, in the context of an automation-oriented industry, can be quantified by defining and comparing the number of ‘process stages.’ Any given good production can be forecast as to how many ‘stages’ of production processing it will take. It can then be compared to other good productions, ideally in the same purpose genre, for a quantifiable assessment. In other words, the units of measurement are these 'stages'.

For example, a chair that can be molded in three minutes, from simple polymers in one process, will have a lower ‘labor complexity’ value than a chair which requires automated assembly down a more tedious production chain, with mixed materials. In the event a given process value is too complex, and hence comparatively inefficient in terms of what is currently possible (by comparison to an already existing design of a similar nature), the design would be flagged and would need to be re-evaluated.
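The chair comparison can be sketched directly in stage counts. The stage lists and the tolerance factor are illustrative assumptions; the unit of measurement is simply the number of stages, as described above.

```python
# Hypothetical sketch: labor complexity measured in 'process stages', with a
# comparative flag against an existing design in the same purpose genre.
def labor_complexity(process_stages):
    """The unit of measurement is the stage count itself."""
    return len(process_stages)

def flag_if_inefficient(new_stages, existing_stages, tolerance=1.5):
    """Flag a design whose stage count exceeds the existing benchmark by
    more than the tolerance factor, prompting re-evaluation."""
    return labor_complexity(new_stages) > tolerance * labor_complexity(existing_stages)

# Illustrative production chains for two chair designs:
molded_chair = ["mold_polymer"]
assembled_chair = ["cut_frame", "weld_frame", "upholster", "assemble", "finish"]

flagged = flag_if_inefficient(assembled_chair, molded_chair)
```

The five-stage design exceeds the one-stage benchmark by well over the tolerance, so it would be fed back to the designer for re-evaluation.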

Such adjustments and flagging would come in the form of feedback from the design interface, during the design stage. There is also no reason not to assume that with ongoing advancement in AI, the system could actually feed back with actual suggestions or even direct solutions to a given efficiency or sustainability problem, in real time.

Design Calculation

Those generalizations noted, a walkthrough of this overall, linear process is expressed below. There will be some repetition here for the sake of clarity. If we were to look at good design in the broadest possible way with respect to industrial unfolding, we end up with about four functions or processes, each relating to the four dominant, linear stages, including design, production, distribution and recycling. Again, each of these processes is directly tied to the Global Resource Management system that provides value feedback that assists in the regulatory apparatus to ensure efficiency and sustainability.

The following propositions apply (Figure 1):

All Product Designs must adapt to:

1) Optimized Design Efficiency

2) Optimized Production Efficiency

3) Optimized Distribution Efficiency

4) Optimized Recycling Efficiency

1) Optimized Design Efficiency:

A product design must meet or adapt to criteria set by [Current Efficiency Standards].

[Current Efficiency Standards] have five evaluative sub-processes, as expressed before:

[Durability] = td

[Adaptability] = A_design

[Standardization] = Nc

[Recycling Conduciveness] = cr

[Automation Conduciveness] = HL

Please note that further breakdown of each of these sub-processes and logical associations can be figuratively made as well to ever-reducing minutiae. However, as noted, this expression is the “top” tier by which all other sub-processes are oriented. It is, again, not the scope of this text to provide all attributes of a working algorithm. It is also not implied here that the parameters expressed are total or absolutely complete.

2) Optimized Production Efficiency

This filter's parameters can change based on the nature of the facilities and how much machine variation in production (fixed automation vs. flexible automation) is required at a given time. For the purpose of expression, two facility types will be distinguished: one for high demand or mass production and one for low demand or short-run, custom goods.

Very simply, a class determination is made which splits the destination facilities based upon the nature of production requirements. The 'high demand' target assumes fixed automation, meaning unvaried production methods ideal for high demand/mass production. The 'low demand' target uses flexible automation, which can do a variety of things but usually in shorter runs.

Again, this schematic assumes only two types of facilities are needed. There could be more facility types based upon production factors, generating more splitting conditions. However, if the design rules are respected, there shouldn’t be too much variation over time as the intent is always to reduce and simplify.

To state the process in linear form (Figure 7):

All product designs are filtered by a [Demand Class Determination] process. The [Demand Class Determination] process filters based on the standards set for [Low Demand] or [High Demand]. All [Low Consumer Demand] product designs are to be manufactured by the [Flexible Automation] process. All [High Consumer Demand] product designs are to be manufactured by the [Fixed Automation] process. Also, both the manufacturing of [Low Consumer Demand] and [High Consumer Demand] product designs will be regionally allocated as per the [Proximity Strategy] of the manufacturing facilities.
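The splitting condition in that linear process can be sketched as follows. The demand threshold and the routing labels are illustrative assumptions; the actual cutoff would be set dynamically per region.

```python
# Hypothetical sketch of the [Demand Class Determination] split. The
# threshold value is an illustrative assumption.
HIGH_DEMAND_THRESHOLD = 1000  # units per period; assumed cutoff

def demand_class(projected_demand):
    """Classify a product design by its projected regional demand."""
    return "high" if projected_demand >= HIGH_DEMAND_THRESHOLD else "low"

def route_design(projected_demand):
    """Route a product design to the matching production method."""
    if demand_class(projected_demand) == "high":
        return "fixed_automation"      # unvaried, mass-production lines
    return "flexible_automation"       # short-run, custom-capable cells

staple_food = route_design(projected_demand=50_000)  # mass production
custom_desk = route_design(projected_demand=3)       # one-off production
```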

3) Optimized Distribution Efficiency

Once process 2 is finished, the product design becomes a 'product' and moves to the [Optimized Distribution Efficiency] filter. In short, all products are allocated based on their prior [Demand Class Determination]. [Low Consumer Demand] products follow the [Direct Distribution] process. [High Consumer Demand] productions follow the [Mass Distribution] process, which would likely be the libraries, mentioned prior. Both the [Low Consumer Demand] and [High Consumer Demand] products will be regionally allocated as per the [Proximity Strategy], as before.

Figure 8: A (left) – Direct Distribution, the low demand case; B (right) – Mass Distribution, the high demand case.

In the case of [Low Consumer Demand], the distribution scheme is direct (Figure 8a). In this case the product goes directly to the consumer without the help of network intermediaries.

In the case of [High Consumer Demand], the distribution scheme is mass (Figure 8b). In this case the product goes to intermediary facilities, such as libraries, to engage potential consumers.

Similar to the production efficiency considerations, in the case of 'Distribution Efficiency', for both low and high demand, the distribution process is optimized in terms of distance to existing facilities. Here the facilities are placed as regional distribution points (libraries), based on the level of demand in the given region (i.e. the Proximity Strategy).
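The distribution step has the same filter-and-route shape as production, sketched below under the same caveat: the string labels and the capacity check are assumptions made for illustration, not details given in the text.

```python
# Illustrative sketch of the Optimized Distribution Efficiency filter.
# Labels and the capacity comparison are hypothetical assumptions.

def distribution_route(demand_class: str) -> str:
    """Route a finished product by its prior demand class determination."""
    if demand_class == "high":
        return "mass distribution"    # e.g. regional access libraries
    return "direct distribution"      # straight to the requesting person


def under_served_regions(library_capacity: dict, regional_demand: dict) -> list:
    """Proximity Strategy: flag regions whose demand exceeds local capacity."""
    return [region for region, capacity in library_capacity.items()
            if regional_demand.get(region, 0) > capacity]
```

A region flagged by `under_served_regions` would, in this sketch, be a candidate for additional library placement based on its level of demand.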

4) Optimized Recycling Efficiency

After distribution, the product then goes through its life-cycle. Once its life-cycle ends, the product becomes “void” and moves to process #4, the [Optimized Recycling Efficiency] filter. In short, all voided products follow the current [Regenerative Protocol]. This protocol embraces the standards employed at that time to ensure the optimized reuse or reincorporation of any given good or component. Naturally, the sub-processes here are vast and complex, and it is the role of engineers, embracing natural law physics, to best understand exactly what parameters will be set.

The Domestic Economy

The prior schematic regarding sustainable and technically efficient processes, optimized dynamically to gain the most stability and maximize the potential of any given economic operation, is both extremely complex in detail and deceptively simple in theory.

The tedium of creating a complete, industry-orienting algorithm that serves as the natural law regulatory filter, by which humanity can assure the most optimized technical practices, is certainly a major intellectual project to undertake. The sub-processes inherent to such a multidimensional calculation would surely run into the thousands.

Yet, at the same time, the unfolding of the overall process is quite elegant in form. The idea of placing each human, if interested, at the helm of industrial creation, facilitating the “group mind” interaction for problem solving and creation, contains a deeply unifying community gesture, coupled with a kind of personal freedom of expression in the creative process which has not been seen before. The very notion of extremely versatile, on-demand production systems which can produce a good for a single person or goods for an entire cultural demographic, is profound in its implications, not to mention the vast positive outcomes inherent when it comes to creating a more peaceful, humane society.

Given the technological trends, it is not far-fetched to imagine a small town which, just as it may today have an electrical grid which unifies that town in its central source of power, now has a production plant network designed to create nearly everything that town may need, on demand. Raw resources are brought into the plant as per conditions and allocation algorithms surrounding the “global resource management system”, which connects all such economic facilities both regionally and globally.

Yet, within this scenario, the role of the human being is often confused. While the pursuit of post-scarcity in this way will create a sustainable and abundance generating paradigm where people can live without the burden of “working for a living”, the debate over “what will people do?” is a question that often arises, along with another inevitable question: “Who is running the machines for no pay!?”

The first question gets to the heart of human values. People have always found interesting things to do and explore, and it is highly doubtful that an era of boredom would arise, given that people would no longer need to fight just to live a high-quality life. Rather, people might very well be elevated to a new type of existence, engaging in higher-order interests that were simply unattainable in the prior model.

The second question is more interesting. In an automated economy, which strategically works to remove humans from any kind of monotonous, difficult or unsafe labor, there will still be some basic need for oversight and management. For many who shun post-scarcity rhetoric, a common fallback argument is that only in a 100% automated utopia, where people literally have no obligations, would such a society be possible. Otherwise, some sub-culture would be required to do the remaining labor and hence some kind of stratified oppression would be inherent.

The problem with this assumption is that it is deeply locked into a market-oriented worldview where time is equated to money. People today have a knee-jerk reaction to assume that in order for anything to actually get done, money must be in play as an incentive.

Yet, statistically, this is simply untrue. In a 1992 Gallup Poll, more than 50% of American adults (94 million Americans) volunteered time for social causes, at an average of 4.2 hours a week, for a total of 20.5 billion hours a year.  A more recent poll in December 2013 showed a steady increase in volunteering from 2001 until 2013. Figures from 2008 in the US also showed an increase in non-religious volunteering, underlining the point that social contributions can exist for their own sake, as well as for religious reasons, and during great economic difficulties. The truth of the matter is that human beings, even in the highly competitive and materialistic orientation of the United States, still decide to do a great deal without an interest in monetary reward.
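As a quick arithmetic check, the cited Gallup figures are internally consistent (assuming a 52-week year):

```python
# Cross-checking the 1992 Gallup figures quoted above.
volunteers = 94_000_000      # American adults who volunteered
hours_per_week = 4.2         # average weekly volunteer time
total_hours = volunteers * hours_per_week * 52  # 52-week year assumed

print(round(total_hours / 1e9, 1))  # 20.5 (billion hours per year)
```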

Open source programming is another example. Linux, which began in 1991 as a simple experiment, grew through community-driven, largely moneyless development into a working operating system kernel within a few years. The Linux kernel now comprises millions of lines of code, the vast majority of it contributed for free by a global community. Wikipedia is yet another example of non-profit, community-generated creation, research and expression. It has been estimated that Wikipedia took 100 million hours of volunteer time to create, and it features a technically advanced and complex backend, demonstrating that well-engineered interrelating systems, when leveraged with large volunteer efforts, can create world-first systems previously considered unrealistic or unfeasible.

So, while money still rules the overall motivation in the current society, people have proven that, given some free time, they will contribute greatly to projects which have no monetary return; the real issue underlying the motivation of such labor is the satisfaction and the feeling of contribution. Today, most jobs do not generate this feeling. Most people walk into a private dictatorship five days a week, under the control of superiors and knowing they can be fired at any moment.

The contribution they make rarely has a direct return to them and the feeling of accomplishment is diminished. Some jobs might even make one wonder what the point of the occupation is in the context of social contribution or personal development. Many jobs exist today simply for the sake of generating or moving money and nothing more. Advertising and Wall Street occupations, for example, are highly valued yet, in truth, do very little to improve society.

This perhaps might explain the “lazy” tendency many feel once off their job at the end of the day, returning home feeling defeated and tired. Over time, many lose spirit and motivation overall and find that their job becomes the only thing that is supposed to have meaning in their life, forgetting the enjoyable passions once inherent to their development.

That considered, in a fully realized NLRBE, it is estimated that perhaps 5% of a region's population (5% of the global population as well), on average, would be needed to assist the fluid operation of this industrial system and this figure would likely continually diminish in the future as technology advances. This participation might best be expressed in the form of what has been termed the “domestic” economy. The domestic economy embodies the helpful actions of people in a non-paid environment. Household work, family and community interests are traditional examples.

In a NLRBE, such labor would be regarded in the same spirit, and the delegation of such labor roles could be distributed amongst a large-scale population, making the actual time commitment minuscule overall. Even by current standards, if one were to ask the average worker whether they would be willing to live, say, the equivalent of a $100,000-a-year lifestyle while volunteering 5% of their time to maintain the system that supports that standard of living, there is little doubt most would agree. The amount of time saved in this type of socioeconomic model, coupled with the vast alleviation of the environmental problems and social conflicts inherent to the market, leaves little room for rational objection.

Likewise, once set free, the creative, collaborative contribution propensity itself, which is the true driver of progress, will no longer be inhibited by the monotony of labor or the income system. It is very difficult to predict the incredible level of productivity and focus a society may achieve once such oppressive factors are removed.

The Decentralization Paradox 
While the rhetoric of a global society with global values underscores this socioeconomic model, it is important to understand the nature of its redundancy and its decentralized layout. John Dalberg-Acton, 1st Baron Acton, once stated, “Power tends to corrupt, and absolute power corrupts absolutely.” This power-fearing perspective is certainly well justified in history and many who hear about a NLRBE often assume this global society is “ruled” by one mainframe, one machine, an elite group of technocrats or something similar.

It is important to remind ourselves that almost all prior societies have lived within great scarcity and hence great conflict. This, coupled with the fact that money and resources have been a means to gain power - usually after a good deal of battle, reinforcing a status and dominance hierarchy - illustrates that we should not be surprised at these reactions. However, such an outlook is also deeply counter-productive on the whole, as it instills the paranoid sense that no one can ever be trusted with any type of control over others.

A NLRBE is, indeed, a global structure in how it processes economic information and assesses output possibilities. Once a good is designed, it runs through the aforementioned efficiency and sustainability filters, which invariably tie back to the status of global resources, along with a global network for design contribution. At the same time, larger order societal decisions, meaning those decisions once made by elected representatives, are also achieved by consensus by the population, directly.

The only real centralization inherent is this digital network connecting the world itself. Given that, we could consider a few possible problems in this circumstance in the same way we think about the Internet today, which is essentially the same infrastructure. “Hacking”, for example, which is the act of disturbing, stealing or corrupting a program or digital information by invading the source code, might be a concern.

However, we have to first ask why anyone would perform such an act in the new model. Since the entire system is designed to provide for everyone, where is the incentive to disturb it? Anyone shutting down such a system is also shutting down their means to contribute and develop. An analogy would be a person today living in an apartment building, where everyone shares the utility infrastructure, deciding to destroy the fuse or electrical breaker box which divides the incoming electricity to power the whole apartment building. Why would they do it if it shuts down their own electricity as well?

It is important to review why people can be so vicious today. Anger is bred by deprivation and some external act is often interpreted as the source of this abuse. So, in retribution, people today “hack” and violate websites and the like to either make a protest point or to get revenge. In a NLRBE, it is hard to fathom where the source of such angst and outrage would materialize. If a person doesn't like the way the system is working in a specific way, they have the capacity to change it by assessing consensus with others. The system is emergent.

However, in the event that this did happen, there is a simple solution: active redundancy. In a monetary-driven society, based around cost efficiency, we see little fail-safe redundancy in place, as it is unaffordable. For example, we see an airplane with two engines, both of which are needed to fly. Why not create an airplane with two main engines and two back-up engines, which sit idle when the plane is in working order but can take over in the event an engine fails?

The main server network which facilitates social connectivity and unification could have five, ten or even twenty levels of redundancy and automatic backup in the event anything went down. It might not be perfect. Some data may be lost. However, again, this isn't a utopia. With respect to who has the “power” to notice a problem and implement this redundancy, technical teams exist to monitor the network, just like any other existing vocation.
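The active-redundancy idea can be illustrated with a minimal failover sketch; the class and its behavior below are hypothetical, standing in for whatever real failover tooling such a network would actually use.

```python
# Minimal sketch of active redundancy: a pool of standby nodes where the
# next healthy node takes over whenever the active one fails.
# All names and the dict-based node model are illustrative assumptions.

class RedundantNetwork:
    def __init__(self, levels: int):
        # e.g. levels=20 gives twenty independently failing backup nodes
        self.nodes = [{"id": i, "healthy": True} for i in range(levels)]

    def active_node(self):
        """Return the first healthy node, or None if every level has failed."""
        for node in self.nodes:
            if node["healthy"]:
                return node
        return None

    def fail(self, node_id: int):
        """Mark one node as down; service continues on the next healthy one."""
        self.nodes[node_id]["healthy"] = False


net = RedundantNetwork(levels=5)
net.fail(0)
net.fail(1)
print(net.active_node()["id"])  # prints 2: service survives two failures
```

Only if every level failed would the network return None, which is the "some data may be lost, but this isn't a utopia" case described above.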

Of course, the question then arises: what if someone on the technical team is corrupted and purposefully messes up the system? Once again, the counter question is: why would they do it? What is the incentive? In the event this did happen, it would not take long for others to notice and the system could be corrected in the same semblance of redundancy, with the person removed. That person would then be questioned by his or her peers and society overall to better understand why this act occurred.

Overall, we trustfully give ourselves over to “authority” all the time. Doctors, mechanics and other specializations always involve a level of trust from those seeking help, and even in a monetary society that generates dishonesty, people are mostly honest the majority of the time. It is simply too cynical to assume that any allocation of control is dangerous. At no time in human history have we not delegated some level of power and responsibility to each other, and in almost all cases, as with dentistry or mechanics, the nature of the "power" delegated is characterized by its technical merit; precisely the kind of oversight advocated within the present context.

In a NLRBE, the reinforcement is to help oneself, which means to help society, not to exploit or abuse. There is literally no reward reinforcement for such negative behavior, as opposed to the natural state of general corruption we endure today.

As far as the physical network itself is concerned, it is decentralized in its orientation in many ways, often more so than what we see today. The topographic layout of Earth makes many things logically obvious as far as structure placement. People, being social, naturally have an interest in some kind of community centralization; certain energy-providing areas, such as for solar/wind/geothermal/hydro, carve out their own locations logically; extraction, production and distribution networks also have an inherent topographical logic, as efficiency mandates we keep such facilities as close to each other as possible, reducing energy waste and transport; etc.

Cities themselves will change in two major ways. For one, the construction and networking of the internal city system will seek to meet the highest state of technical efficiency possible, including sustainable infrastructure, homes, production/distribution networks and the like, taking the systems basis into direct account. Secondly, it is expected that due to the evolution of ephemeralization, a given city will produce all regional goods locally. Management of the city on the level of broad infrastructure, such as where to put a bridge, will also be a regional decision making process, set in motion by the direct democracy, CDS system. Land allocation works the same way, even though that is a larger subject, which is addressed in the essay “Post-Scarcity Trends, Capacity and Efficiency”.

Of course, each city naturally connects to other cities, ideally with advanced transport systems which can cleanly and fluidly move people. Maglev-type train systems are on pace to be the next stage of fast, safe and efficient transport with little to no environmental footprint, as compared to oil-powered planes, buses and cars.

As far as the “engine” of a city, which is its industry, digital networks and sensor systems work to gather important regional and non-regional data. This relates to the “global resource management” network described before, and both regional and global networks of measurement allow all cities/citizens to have a holistic sense of what is going on, affecting production and other important environmental factors.

So, this network might very well be “centralized” in its data and raw resource flow to a city's internal production facilities, but it is decentralized in that a city imports nothing else. It is mostly self-contained. All production occurs internally, importing and exporting no finished goods, only resources. This idea of “self-containment per scale degree” is important and even applies to structures such as houses. The ideal house would be off-the-grid and self-contained in its energy sourcing, with redundant backup energy sources in place should anything become compromised.

Put another way, there is no central “off switch” in such a natural redundancy based system. For example, if a base-load providing power grid is being used and that grid goes down, it would have little effect on houses if those houses are also designed to harvest local energy sources (i.e. solar) and hence be self-contained. Likewise, no one thing can upset the international system. Unlike modern monetary finance and currency structures, which are highly centralized and can wreak havoc globally if things go wrong, a problem in one city has little effect on any other city in a NLRBE.

So, in truth, a properly organized NLRBE is not centralized in any real sense. It is more accurate to say that it is a global decentralized system, with various degrees of inherent redundancy, which, degree by degree, connects itself by information flow and physical channels to acquire proper resources, to be used for each region's local economy.