Here’s an interesting idea I thought of a while ago, while attending a lecture from a member of the Occupy movement. Government is about decision making and scalability, and each solution to the problem carries a cost that grows with the size of the governed populace.
The person from the Occupy discussion was teaching us the principles of anarchy and how the system worked without any de facto or de jure authority. The end result was incredibly inefficient, and I wanted to find a way to characterize that.
Below are the fundamentals of group discussion, as opposed to the optimization techniques which arise from certain arrangements of the larger population into subgroups (e.g. representative democracy). These fundamental group communication models are basic and tend to be inefficient: they are the elements which cannot scale on their own and thus need to be covered by overlay networks.
Anarchy: O(n^2)
This follows the triangular-number scheme. The basic idea is that an anarchy, barring all uncivilized conduct, naturally lends itself to each individual meeting one other individual at a time, discussing matters on their own accord and resolving things thusly. This scheme requires splitting the group into every possible pair, and n choose 2 simplifies to n(n-1)/2, which grows quadratically.
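The pair-counting above can be sketched in a few lines of Python (a toy model; the function name and the brute-force cross-check are my own):

```python
from itertools import combinations

def anarchy_meetings(n):
    """One-on-one meetings needed so that every pair among n
    participants talks exactly once: C(n, 2) = n*(n-1)/2."""
    return n * (n - 1) // 2

# Cross-check the closed form against brute-force pair enumeration.
assert anarchy_meetings(10) == len(list(combinations(range(10), 2)))
print(anarchy_meetings(100))  # -> 4950: a hundred people already need 4950 meetings
```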
Multicast Anarchy/Direct Democracy: O(n)
I must apologize for the terms, which I am probably abusing. The Multicast Anarchy basically changes the dynamic so that each individual lines up and gives a presentation to every other member. The fact that it is a presentation addressing a wide audience makes it somewhat multicast, rather than the strictly one-on-one system of a basic anarchy. Everyone gets one turn, so the whole process runs in linear time.
Ballot Democracy: O(1)
This restricted definition of a democracy involves the concept of voting, where each individual has a limited say which operates completely independently of the others: no individual has the ability to influence any other person’s opinion. Instead, people simply have a set of options, they vote, and the majority rules. (In a non-binary scenario, it may be a plurality or some more complex metric that determines the winner, since it has been demonstrated more than once that first-past-the-post is a woefully inadequate scheme, though that is beyond the scope of this blog post.)
This can be seen as operating in constant time because voting is a highly parallelizable operation when it doesn’t depend on the ideas of others. However, that exchange of ideas is rather important to the process of dissemination, and it’s best not to blindly create law out of the preconceptions of the voters.
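As a sketch (the names here are mine, not any standard API), a ballot is just an independent tally, which is why the number of discussion rounds stays constant even though the counting work itself grows linearly:

```python
from collections import Counter

def ballot_winner(votes):
    """Each voter independently submits one option; the plurality
    (most common option) wins.  No voter's input depends on another's,
    so the tallying is embarrassingly parallel."""
    return Counter(votes).most_common(1)[0][0]

print(ballot_winner(["a", "b", "a", "c", "a"]))  # -> a
```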
Dictatorship: O(1)
Having a single head make all decisions, without any need to confer with others, is the fastest and most scalable approach to government. Naturally, by disregarding all input from the population and acting entirely off the ruler’s whims, the approach scales independently of the size of the body which is to be represented by the supreme dictator.
What follows are the non-fundamental voting schemes, which depend quite heavily on the ones above, and in which the lower-level discussion schemes may be entirely irrelevant (or entirely relevant, depending on the scenario).
Representative Democracy: O(log n)
A representative democracy is a structure which is naturally organized into layers, though the layers are not of consistent relative size (the ideal person-to-representative or representative-to-metarepresentative ratio, if one exists from a network topology perspective, may be the subject of another post). The advantage is that the levels can operate in parallel: organizing people into smaller random groups lets each group decide things in a considerably shorter amount of time and send a representative for its idea up to the next level, rinsing and repeating until a sort of universal consensus is reached. This happens in logarithmic time.
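The layer-counting argument can be sketched as follows (a toy model; the function name and group size are my own choices):

```python
def representative_layers(n, group_size):
    """Rounds of discussion when each group of `group_size` people
    elects one delegate to the next layer, repeating until a single
    decision-maker remains: roughly ceil(log base group_size of n)."""
    layers = 0
    while n > 1:
        n = -(-n // group_size)  # ceiling division
        layers += 1
    return layers

# A million citizens in groups of ten need only six rounds.
print(representative_layers(10**6, 10))  # -> 6
```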
Representative Anarchy: O(r^2 log n)
This is a made-up construct which expands on the mathematical nature while breaking slightly from the strict notation of Big-O. Technically, r^2 log(n) for a constant radix r is considered equivalent to log(n), but the notation here represents a hybridized approach. The real time cost of a representative system is the number of layers (the logarithmic factor) multiplied by the cost of the scheme used for discussing ideas within each layer; if an anarchy is employed within each group, that per-layer cost is proportional to the square of the radix r.
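Under the same toy model (the names are my own), the hybrid cost is just the layer count times the per-layer discussion cost, where the per-layer cost depends on which fundamental scheme runs inside each group:

```python
def layered_cost(n, r, per_layer_cost):
    """Layers of a radix-r delegate hierarchy (about ceil(log_r n)
    of them) times the cost of the scheme used inside each group."""
    layers = 0
    while n > 1:
        n = -(-n // r)  # ceiling division
        layers += 1
    return layers * per_layer_cost

n, r = 10**6, 10
anarchy_per_group = r * (r - 1) // 2   # pairwise meetings within one group
democracy_per_group = r                # one presentation per group member
print(layered_cost(n, r, anarchy_per_group))    # -> 270 (6 layers * 45 meetings)
print(layered_cost(n, r, democracy_per_group))  # -> 60  (6 layers * 10 talks)
```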
Though it doesn’t really deserve separate consideration, it may be useful to determine whether natural discussions are more akin to this kind of representative anarchy or to a true representative democracy (by the definitions of this page and this page only, which are considerably more formal than the traditional definitions).
Statistical Sampling: O(m)
This scheme statistically samples representatives to form a quasi-anarchical or democratic forum. It could technically scale largely independently of the size of the population, which makes it quite efficient, but it is quite jarring to the psyche and the individual mentality. The negative connotation of the word “statistic” alone, and the eternal fear of being reduced to a mere statistic, likely render this solution infeasible in any practical setting.
Note that the cost would instead be proportional to the size m of the sample which is collected; the forum would then operate just as an Anarchy or a Direct Democracy does, with m representing the sample rather than the whole population.
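A minimal sketch of the idea (all names are mine; this is not any real polling procedure), running the O(m) direct-democracy scheme on a random sample rather than on the whole population:

```python
import random

def sampled_vote(population_views, m, seed=None):
    """Draw a random sample of m citizens and run the linear-time
    direct-democracy scheme on the sample alone; the cost now depends
    on the sample size m rather than the population size n."""
    rng = random.Random(seed)
    sample = rng.sample(population_views, m)
    return max(set(sample), key=sample.count)

views = ["yes"] * 90 + ["no"] * 10
# With m = 85, at most ten "no" voters can appear in any sample,
# so the sampled outcome is guaranteed to match the population's.
print(sampled_vote(views, 85))  # -> yes
```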
Issue Oriented/Party Organization: O(m)
If one clusters people who have similar views into a single highly weighted delegate, then the system naturally becomes much more efficient and independent of the raw size of the population. It’s a bit consequentialist, because it clusters people by their end-desires rather than by their motives, which have a much more important role in the process of sharing and discussing information. But it works, and for yes-no issues it produces a considerable speedup, at least in theory.
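The clustering step can be sketched like so (a toy model with names of my own invention): individuals with identical end-positions collapse into one weighted delegate per party, so later discussion scales with the number of parties m rather than the population n.

```python
from collections import Counter

def party_weights(stances):
    """Collapse individuals with identical end-positions into one
    weighted delegate per party; the weight is simply the headcount
    behind that position."""
    return dict(Counter(stances))

print(party_weights(["yes", "no", "yes", "yes"]))  # -> {'yes': 3, 'no': 1}
```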
While a computer scientist may consider O(log n) to be equivalent to O(999 log n), in practice human time is limited, and the precise coefficients do matter.
Which brings to mind an Alan Perlis quote: “for every polynomial-time algorithm you have, there is an exponential algorithm that I would rather run”.
This says quite a lot about the problem with the theory espoused in this blog post. The theories here are completely divorced from the rather important coefficients and from the practical overhead of parallel operations (counting votes and setting up ballots is far from instantaneous). The only real way to determine which form of government, overlay network topology, or awkward amalgam of the above (or entirely different scheme which I have not been witty enough to devise or absent-minded enough to categorize) actually works is to test it empirically. Of course, that’s hard, because political experimentation is sadly quite taboo.
I can’t think of something actually funny, but I’d imagine there are some political schemes which could be crafted to be ridiculous in terms of their Big-O notation. For instance:
Congress: no theoretical average-case performance, because it’s subject to the halting problem
Ha ha ha. No, I’m pretty sure that’s not funny, because it’s not even particularly witty, and it presupposes knowledge of the halting problem (which depends a lot on the readership of this blog, which almost literally doesn’t exist and certainly isn’t very regular). The halting problem rephrases Gödel’s incompleteness theorem in such a way that, for certain classes of algorithms, there is no way short of running them to determine whether they actually terminate. Also, jokes about Congress’s ineptitude should certainly be trite by now.
Whenever two groups disagree on a policy issue, split them up and allocate separate places where each can exist under its own desired conditions. If the political landscape involves more than a single policy issue, then attempt the King Solomon-esque solution of cutting people into little pieces.
Whenever you have a policy issue, allocate certain days for one solution and other days for another. Try them all until you know which one is the better option.
Quantum Bogocracy: O(1)
This is based on the idea of the quantum bogosort algorithm. Whenever two groups disagree, base the decision on a quantum source of entropy, so that a branch is created according to the many-worlds interpretation; then, based on the chosen option’s performance over a certain span of time, decide whether or not to destroy the universe.
Pair up all people who disagree and have each kill the other in a mutually destructive manner, leaving the one group which remains to implement its policy.
Construct a Sierpinski gasket out of the class pyramid; anyone who isn’t dead after being cut into an infinite number of triangles becomes the supreme dictator.
This has been left as an exercise for the reader.