This is the third paper in a cycle on distributed swarms, OODA loops and stigmergy, co-authored with a PhD student of mine. The paper is titled Distributed Swarming and Stigmergic Effects on ISIS Networks: OODA Loop Model, and was published in the Journal of Media and Information Warfare. This is probably the densest and most interesting paper in the series, as we analyse information warfare waged by distributed swarms in the context of network-centric warfare theory, stigmergic adaptation, and John Boyd’s work on the OODA loop concept. For me the most interesting elements of the paper involve our discussion of von Moltke’s concept of Auftragstaktik in the context of maneuver warfare in the information domain.
This is a paper I co-authored with two collaborators, one of whom is a PhD student of mine, titled Encrypted Jihad: Investigating the Role of Telegram App in Lone Wolf Attacks in the West, and published in the Journal of Strategic Security. We examine the role played by Telegram, one of the most popular social media apps offering end-to-end encrypted communications, in the command and control [C2] operations of distributed terrorist organizations. Specifically, I was interested in illustrating how encrypted platforms such as Telegram can be used as part of a complex stigmergic communications strategy relying on memetic impact both within the distributed network and outside of it. In brief, Telegram acts as a standalone communication platform where core C2 vectors are encrypted and obfuscated from counter-terrorism efforts, while all other communication is built for maximum memetic potential, relying on stigmergic impact among otherwise unconnected nodes acting as lone wolves.
These are some loosely organized observations about the nature of network topologies in the wild.
In terms of both agency and information, all entities, be they singular [person], plural [clan/tribe/small company], or meta-plural [nation/empire/global corporation], are essentially stacks of various network topologies. To understand how these entities operate in space, their topologies can be simplified to a set of basic characteristics. When networks are mapped and discussed, it is usually at this 2-dimensional level. However, in addition to operating in space, all entities have to perform themselves in time.
This performative aspect of networks is harder to grasp, as it involves a continuously looping process of encountering other networks and adapting to them. In the process of performative adaptation all networks experience dynamic changes to their topologies, which in turn challenge their internal coherence. This process is fractal, in that at any one moment there is a vast multiplicity of networks interacting with each other across the entire surface of their periphery [important qualification here – fully distributed networks are all periphery]. There are several important aspects to this process, which for simplicity’s sake can be reduced to an interaction of two networks and classified as follows:
1] the topology of the network we are observing [A];
2] the topology of network B, that A is in the process of encountering;
3] the nature of the encounter: positive [dynamic collaboration], negative [dynamic war], zero sum [dynamic equilibrium].
All encounters are dynamic, and can collapse into each other at any moment. All encounters are also expressed in terms of entropy – they increase or decrease it within the network. Centralized networks cannot manage entropy very well and are extremely fragile to it.
Positive encounters are self-explanatory, in that they allow networks to operate in a quasi-symbiotic relationship, strengthening each network. These encounters are dynamically negentropic for both networks, in that they enable both networks to increase coherence and reduce entropy.
Negative encounters can be offensive or defensive, whereby one or both [or multiple] networks attempt to undermine and/or disrupt the internal coherence of the other network/s. These encounters are by definition entropic for at least one of the networks involved [often for all], in that they dramatically increase entropy in at least one of the combatants. They can however be negentropic for some of the participants. For example, WW2 was arguably negentropic for the US and highly entropic for European states.
Zero sum encounters are interesting, in that they represent a dynamic cancelling out of networks. There is neither cooperation nor war, but a state of co-presence without an exchange of entropy in a dynamic time-space range. I believe this is a rare type of encounter, because the absence of entropy exchange can appear only if 1] there is no exchange of information or agency, or 2] the amount of agency/information exchanged is identical on both sides. Needless to say, this process cannot be easily stabilized over a long time period, and either morphs into one of the other two states or the networks stop encountering each other.
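The classification above can be reduced to a toy model in a few lines of Python [the names and the entropy bookkeeping here are my own illustrative assumptions, not a formalism – the point is only to make the three entropy regimes explicit]:

```python
from enum import Enum

class Encounter(Enum):
    """The three dynamic encounter types between networks A and B."""
    POSITIVE = "dynamic collaboration"
    NEGATIVE = "dynamic war"
    ZERO_SUM = "dynamic equilibrium"

def entropy_delta(kind, a_entropy, b_entropy, exchange=1.0):
    """Return the new (A, B) entropy levels after one encounter step.

    The magnitudes are arbitrary; only the signs matter.
    """
    if kind is Encounter.POSITIVE:
        # negentropic for both: each network gains coherence, i.e. sheds entropy
        return a_entropy - exchange, b_entropy - exchange
    if kind is Encounter.NEGATIVE:
        # entropic for at least one combatant; here A offloads entropy onto B
        # [the WW2 case: negentropic for the US, highly entropic for Europe]
        return a_entropy - exchange, b_entropy + 2 * exchange
    # zero sum: co-presence without net entropy exchange
    return a_entropy, b_entropy
```

Running one step of each encounter type on two networks with equal entropy makes the asymmetry of the negative case, and the fragility of the zero-sum case, easy to see.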
These are the slides for what was perhaps my favorite lecture so far in BCM112. The lecture has three distinct parts, presented by myself and my PhD students Doug Simkin and Travis Wall. I opened by building on the previous lecture, which focused on the dynamics of networked participation, and expanded on the shift from passive consumption to produsage. The modalities of this shift are elegantly illustrated by the event-frame-story structure I developed to formalize the process of news production [it applies to any content production]. The event stage is where the original footage appears – it is often user-generated, raw, messy, and of indeterminate context. The frame stage provides the filter for interpreting the raw data. The story stage is what is produced after the frame has done its work. In the legacy media paradigm the event and frame stages are closed to everyone except the authority figures responsible for story production – governments, institutions, journalists, academics, intellectuals, corporate content producers. This generates an environment where authority is dominant, and authenticity is whatever authority decides – the audience is passive and in a state of pure consumption. In the distributed media paradigm the entire process is open and can be entered by anyone at any point – event, frame, or story. This generates an environment where multiple event versions, frames, and stories compete for produser attention on an equal footing.
These dynamics have profound effects on information as a tool for persuasion and frame shifting, or in other words – propaganda. In legacy media propaganda is a function of the dynamics of the paradigm: high cost of entry, high cost of failure, minimum experimentation, inherent quality filter, limited competition, cartelization with limited variation, and an inevitable stagnation.
In distributed media propaganda is memes. Here too propaganda is a function of the dynamics of the paradigm, but those are characterized by collective intelligence as the default form of participation in distributed networks. In this configuration users act as a self-coordinating swarm working towards an emergent aggregate goal. The swarm has an orders-of-magnitude faster production time than the legacy media, which results in orders-of-magnitude faster feedback loops and information dissemination.
The next part of the lecture, delivered by Doug Simkin, focused on a case study of the /SG/ threads on 4chan’s /pol/ board as an illustration of an emergent distributed swarm in action. This is an excellent case study as it focuses on real-world change produced with astonishing speed in a fully distributed manner.
The final part of the lecture, delivered by Travis Wall, focused on a case study of the #draftourdaughters memetic warfare campaign, which occurred on 4chan’s /pol/ board in the days preceding the 2016 US presidential election. This case study is a potent illustration of the ability of networked swarms to leverage fast feedback loops, rapid prototyping, error discovery, and distributed coordination in highly scalable content production.
These are slides from a lecture I delivered in the fifth week of BCM112, building on open-process arguments conceptualized in a lecture on the logic and aesthetics of digital production. My particular focus in this lecture was on examining the main dynamics of the audience trajectory in the process of convergence. I develop the conceptual frame around Richard Sennett’s notion of dialogic media as ontologically distinct from monologic media, where the latter render a passive audience as listeners and consumers, while the former render conversational participants. I then build on this with Axel Bruns’ ideas on produsage [a better term than prosumer], and specifically his identification of the new modalities of media in this configuration: a distributed generation of content, fluid movement of produsers between roles, digital artefacts remaining open and in a state of indeterminacy, and permissive ownership regimes enabling continuous collaboration. The key conceptual element here is that the entire chain of the process of production, aggregation, and curation of content is open to modification, and can be entered at any point.
Recently I have been trying to formulate my digital media teaching and learning philosophy as a systemic framework. This is a posteriori work, because philosophies can be non-systemic, but systems are always based on a philosophy. I also don’t think a teaching/learning system can ever be complete, because entropy and change are the only givens [even in academia]. It has to be understood as dynamic, and therefore more along the lines of rules-of-thumb as opposed to prescriptive dogma.
None of the specific elements of the framework I use are critical to its success, and the only axiom is that the elements have to form a coherent system. By coherence, I understand a dynamic setting where 1] the elements of the system are integrated both horizontally and vertically [more on that below], and 2] the system is bigger than the sum of its parts. The second point needs further elaboration, as I have often found that even highly educated people really struggle with non-linear systems. Briefly, linear progression is utterly predictable [x + 1 + 1 + … = x + n] and comfortable to build models in – i.e. if you increase x by 1, the new state of the system will be x + 1. Nonlinear progression by contrast is utterly unpredictable and exhibits rapid deviations from whatever the fashionable mean is at the moment – i.e. x + 1 = y. Needless to say, one cannot model nonlinear systems over long periods of time, as the systems will inevitably deviate from the limited variables given in the model.
Axiom: all complex systems are nonlinear when exposed to time [even in academia].
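The contrast can be sketched in a few lines of Python [the logistic map here is my illustrative stand-in for any nonlinear rule, not a model of any specific system]. In the linear rule, two starting points a fixed distance apart stay exactly that distance apart forever; in the nonlinear rule, two starting points a billionth apart end up in completely different places:

```python
def linear(x0, steps):
    """x -> x + 1 each step: the final state is always exactly x0 + steps."""
    x = x0
    for _ in range(steps):
        x = x + 1
    return x

def logistic(x0, steps, r=4.0):
    """x -> r * x * (1 - x): chaotic on (0, 1) for r = 4.

    Tiny differences in x0 are amplified at every step, so long-range
    prediction from a model of the initial state is impossible.
    """
    x = x0
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

# Two linear systems starting 1e-9 apart remain 1e-9 apart after any
# number of steps. Two logistic systems starting 1e-9 apart diverge
# until their trajectories are effectively uncorrelated.
a = logistic(0.4, 50)
b = logistic(0.4 + 1e-9, 50)
```

The linear system can be summarized by a closed formula; the nonlinear one can only be run forward, step by step, which is precisely why models of it fail over long time horizons.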
The age of the moderns has configured us to think overwhelmingly in linear terms, while reality is and has always been regrettably non-linear [Nassim Taleb built a career pointing this out for fun and profit]. Unfortunately this mass delusion extends to education, where linear thinking rules across all disciplines. Every time you hear the “take these five exams and you will receive a certificate that you know stuff” mantra you are encountering a manifestation of magical linear thinking. Fortunately, learning does not follow a linear progression, and is in fact one of the most non-linear processes we are ever likely to encounter as a species.
Most importantly, learning has to be understood as paradigmatically opposed to knowing facts, because the former is non-linear and relies on dynamic encounters with reality, while the latter is linear and relies on static encounters with models of reality.
With that out of the way, let’s get to the framework I have developed so far. There are two fundamental philosophical pillars framing the assessment structure in the digital media and communication [DIGC] subjects I have been teaching at the University of Wollongong [UOW], both informed by constructivist pedagogic approaches to knowledge creation [the subjects I coordinate are BCM112, DIGC202, and DIGC302].
1] The first of those pillars is the notion of content creation for a publicly available portfolio, expressed through the content formats students are asked to produce in the DIGC major.
Rule of thumb: all content creation without exception has to be non-prescriptive, where students are given starting points and asked to develop learning trajectories on their own – i.e. ‘write a 500 word blog post on surveillance using the following problems as starting points, and make a meme illustrating your argument’.
Rule of thumb: all content has to be publicly available, in order to expose students to nonlinear feedback loops – i.e. ‘my video has 20 000 views in three days – why is this happening?’ [first year student, true story].
Rule of thumb: all content has to be produced in aggregate in order to leverage nonlinear time effects on learning – i.e. ‘I suddenly discovered I taught myself Adobe Premiere while editing my videos for this subject’ [second year student, true story].
The formats students produce include, but are not limited to, short WordPress essays and comments, annotated Twitter links, YouTube videos, SoundCloud podcasts, single image semantically-rich memetic messages on Imgur, dynamic semantically-rich memetic messages on Giphy, and large-scale free-form media-rich digital artefacts [more on those below].
Rule of thumb: design for simultaneous, dynamic content production of varying intensity, in order to multiply interface points with the topic problematic – i.e. ‘this week you should write a blog post on distributed network topologies, make a video illustrating the argument, tweet three examples of distributed networks in the real world, and comment on three other student posts’.
2] The second pillar is expressed through the notion of horizontal and vertical integration of knowledge creation practices. This stands for a model of media production where the same assessments and platforms are used extensively across different subject areas at the same level and program of study [horizontal integration], as well as across levels and programs [vertical integration].
Rule of thumb: the higher the horizontal/vertical integration, the more content serendipity students are likely to encounter, and the more pronounced the effects of non-linearity on learning.
Crucially, and this point has to be strongly emphasized, the integration of assessments and content platforms both horizontally and vertically allows students to leverage content aggregates and scale up in terms of their output [non-linearity, hello again]. In practice, this means that a student taking BCM112 [a core subject in the DIGC major] will use the same media platforms also in BCM110 [a core subject for all communication and media studies students], but also in JOUR102 [a core subject in the journalism degree] and MEDA101 [a core subject in media arts]. This horizontal integration across 100 level subjects allows students to rapidly build up sophisticated content portfolios and leverage content serendipity.
Rule of thumb: always try to design for content serendipity, where content of topical variety coexists on the same platform – i.e. a multitude of subjects with blogging assessments allowing the student to use the same WordPress blog. When serendipity is actively encouraged it transforms content platforms into so many idea colliders with potentially nonlinear learning results.
Adding the vertical integration allows students to reuse the same platforms in their 200 and 300 level subjects across the same major, and/or other majors and programs. Naturally, this results in highly scalable content outputs, the aggregation of extensively documented portfolios of media production, and most importantly, the rapid nonlinear accumulation of knowledge production techniques and practices.
On digital artefacts
A significant challenge across the academy as a whole, and media studies as a discipline, is giving students the opportunity to work on projects with real-world implications and relevance, that is, projects with nonlinear outcomes aimed at real stakeholders, users, and audiences. The digital artefact [DA] assessment framework I developed along the lines of the model discussed above is a direct response to this challenge. The only limiting requirements for a DA are that 1] artefacts should be developed in public on the open internet, therefore leveraging non-linearity, collective intelligence and fast feedback loops, and 2] artefacts should have a clearly defined social utility for stakeholders and audiences outside the subject and program.
Rule of thumb: media project assessments should always be non-prescriptive in order to leverage non-linearity – i.e. ‘I thought I was fooling around with a drone, and now I have a start-up and have to learn how to talk to investors’ [second year student, true story].
Implementing the above rule of thumb means that you absolutely cannot structure and/or limit: 1] group numbers – in my subjects students can work with whoever they want, in whatever numbers and configurations, with people in and/or out of the subject, degree, university; 2] the project topic – my students are expected to define the DA topic on their own, the only limitations provided by the criteria for public availability, social utility, and the broad confines of the subject area – i.e. digital media; 3] the project duration – I expect my students to approach the DA as a project that can be completed within the subject, but that can also be extended throughout the duration of the degree and beyond.
Digital artefact development rule of thumb 1: Fail Early, Fail Often [FEFO]
#fefo is a developmental strategy originating in the open source community, and first formalized by Eric Raymond in The Cathedral and the Bazaar. FEFO looks simple, but is the embodiment of a fundamental insight about complex systems. If a complex system has to last in time while interfacing with nonlinear environments, its best bet is to distribute and normalize risk taking [a better word for decision making] across its network, while also accounting for the systemic effects of failure within the system [see Nassim Taleb’s Antifragile for an elaboration]. In the context of teaching and learning, FEFO asks creators to push towards the limits of their idea, experiment at those limits and inevitably fail, and then to immediately iterate through this very process again, and again. At the individual level the result of FEFO in practice is rapid error discovery and elimination, while at the systemic level it leads to a culture of rapid prototyping, experimentation, and ideation.
Digital artefact development rule of thumb 2: Fast, Inexpensive, Simple, Tiny [FIST]
#fist is a developmental strategy developed by Lt. Col. Dan Ward, Chief of Acquisition Innovation at USAF. It provides a rule-of-thumb framework for evaluating the potential and scope of projects, allowing creators to chart ideation trajectories within parameters geared for simplicity. In my subjects FIST projects have to be: 1] time-bound [fast], even if part of an ongoing process; 2] reusing existing easily accessible techniques [inexpensive], as opposed to relying on complex new developments; 3] constantly aiming away from fragility [simple], and towards structural simplicity; 4] small-scale with the potential to grow [tiny], as opposed to large-scale with the potential to crumble.
In the context of my teaching, starting with their first foray into the DIGC major in BCM112 students are asked to ideate, rapidly prototype, develop, produce, and iterate a DA along the criteria outlined above. Crucially, students are allowed and encouraged to have complete conceptual freedom in developing their DAs. Students can work alone or in a group, which can include students from different classes or outside stakeholders. Students can also leverage multiple subjects across levels of study to work on the same digital artefact [therefore scaling up horizontally and/or vertically]. For example, they can work on the same project while enrolled in DIGC202 and DIGC302, or while enrolled in DIGC202 and DIGC335. Most importantly, students are encouraged to continue working on their projects even after a subject has been completed, which potentially leads to projects lasting for the entirety of their degree, spanning 3 years and a multitude of subjects.
In an effort to further ground the digital artefact framework in real-world practices in digital media and communication, DA creators from BCM112, DIGC202, and DIGC302 have been encouraged to collaborate with and initiate various UOW media campaigns aimed at students and outside stakeholders. Such successful campaigns as Faces of UOW, UOW Student Life, and UOW Goes Global all started as digital artefacts in DIGC202 and DIGC302. In this way, student-created digital media content is leveraged by the University and by the students for their digital artefacts and media portfolios. To date, DIGC students have developed digital artefacts for UOW Marketing, URAC, UOW College, Wollongong City Council, and a range of businesses. A number of DAs have also evolved into viable businesses.
In line with the opening paragraph I will stop here, even though [precisely because] this is an incomplete snapshot of the framework I am working on.
Haven’t been able to post for a while due to plenty of boring work – the worst combination. The flaneur spirit crumbles when faced with repetitive and intellectually unchallenging tasks. However, meanwhile in the real world the Greece fiasco turned into farce, and The Economist captured that just brilliantly in their May 1 issue, with Angela Merkel appearing as a natural in the Colonel Kurtz role on the cover below.
While the Greeks were burning banks, and keeping in line with the Apocalyptic theme, the Euro almost collapsed at the beginning of this week:
May the 6th was an interesting day in that context, because while the euro was heading for the Acropolis, gold broke above 1200:
And, what a curious coincidence, something even stranger happened on that same day:
The financial markets had a 20-minute period of complete collapse, which the media immediately explained as a human error (haha). Other, less imbecile explanations are to be found here and here. I find it fascinating how in a highly leveraged complex system a relatively small event (sorry Greece) can cause tremendous and unpredictable repercussions which, apart from forming a somewhat black swan, cause system-wide readjustments. This again goes to show that in a complex networked environment: 1] the notion of periphery is meaningless; 2] connectivity acts as a magnifying glass for network events; 3] the longer structural instabilities are ignored/covered up, the bigger the eventual ripple-effects of the collapse.
Interesting weeks ahead.
John Robb over at Global Guerrillas has an interesting post on the root problem in dealing with entropic complexity (entropic because of the inevitability of collapse), influenced by the work of Quigley and Tainter. As he narrates it, the key issue is the uniqueness of each system at the level of its smallest nodes – the entities/actors enacting the system. In other words, whether it is the international wheat market, the English Premier League, or the Australian banking system, while there are certain structural similarities once systems reach a certain level of complexity (network power laws, etc.), at the most local level each system is absolutely unique, and differs even from ‘similar’ systems next door. Furthermore, the local level is the fastest-changing part of a system, in that it is the closest to the inputs (of course all levels are local, as actor network theory argues, but that is all too often not understood), and consequently, when viewed over time, a chasm grows between the fluidity of the local and the structural integrity of the wider system. As Robb words it:
The need for evolutionary advances at the local level will always outstrip the pace of evolutionary change at the center. When the mismatch grows too large, the entire system collapses.
Of course, Robb forgets that every system is always local at every layer of its network: the center and the periphery are equally situated in a local setting, and the problem he describes is not one of miscommunication between local and global, but of a breakdown of translation between equally local layers. The solution Robb proposes is one of resilient local communities existing in some sort of semi-autonomy from the wider system. This of course has long been a political dream of both the far left and the far right. Interestingly though, the Austrian economic school, and Murray Rothbard in particular, have long argued for the independent city-state as the optimal politico-economic entity on a global scale – I don’t think Robb is aware of that though.
There is a lot to be said for the inevitability of a complex system’s collapse, starting from the second law of thermodynamics, going through the costs of rising complexity usually purchased at the expense of system stability, and ending with the crucial point where a system’s capacity to adapt to a changing environment fizzles out precisely because of the need for stability. This inevitability seems to hold true for all kinds of complex systems, from bio-ecologies, to organizational models and societies. Couple of fascinating posts on this:
In his The Collapse of Complex Business Models, Clay Shirky makes an interesting argument about the future of the current business model of media production. Building on Joseph Tainter’s The Collapse of Complex Societies, and using TV companies as an example, Shirky argues that the active participants of complex systems are in the end simply unable to integrate a new rule-set into the complex structure of the system, creating the at first paradoxical situation where ‘the most powerful/connected/etc’ are the least capable of innovating to survive. As a media company, how do you compete with content such as Charlie bit my finger when your content production costs are astronomical by comparison, and you cannot afford to lower them if you want to remain in existence as an organisation? The answer is – you can’t. Which goes a long way toward explaining Murdoch’s desperate attempts to shore up the defenses and patch up the walls of his crumbling empire.
John Robb’s The Simplification of Complex Societies takes that argument to the complexity of the present global society, and unsurprisingly concludes that collapse is imminent, albeit with the caveat that there is a way out, as illustrated by China’s successful transition from Maoist barbarity through allowing totally unregulated innovation at its societal periphery (arguably the largest and most successful peasant revolution in human history). While predicting the imminent collapse of capitalism is a century-and-a-half-old hobby, there is a lot to be said about the growing layers of complexity involved in international governance, financial and otherwise. Plenty to ponder.