Author Topic: An Architecture for Effects Based Course of Action Development (CAESAR II/Eb)


Offline squarepusher

What distinguishes this from other papers on CAESAR are the easily understandable influence net models. They're less abstract than the ones that have been covered before.

This document is from 2000, and it covers CAESAR II/Eb (even then they had already switched the name to Pythia). Overall, this is far more understandable than 'CAESAR III: Inferring Adversary Intent & Estimating Behavior'.

Download here

An Architecture for Effects Based Course of Action Development

Alexander H. Levis
George Mason University
C3I Center, MSN 4D2
Fairfax, VA 22030

A prototype system to assist in developing Courses of Action and evaluating them with respect to the effects they are expected to achieve has been developed and is called CAESAR II/EB. The key components of the system are an influence net modeler and an executable model generator and simulator. The executable model is exercised using the plan that is derived from the selected Course of Action and the probabilities of achieving the desired effects are calculated. The architecture of CAESAR II/EB is presented and an illustrative example is used to show its operation.

Since Desert Storm, the concept of integrated Planning and Execution has become accepted, and systems and procedures are being implemented to achieve it (e.g., concepts are being tested in Advanced Warfighting Experiments by the Services). Integrated Planning and Execution enables dynamic battle control (sometimes referred to as dynamic planning). Bosnia, and especially operation Allied Force in Kosovo, have focused broad attention on effects-based planning and effects assessment (see Washington Post, Sept. 20-22, 1999). This leads to closer interaction of intelligence and planning: intelligence is not only an input to the process but a key component of the effects assessment feedback loop. Given the potential complexity of future situations and the many consequences of the responses, an approach is needed that (a) relates actions to events and events to effects; (b) allows for the critical time phasing of counter-actions for maximum effect; and (c) provides the ability to carry out trade-off analyses of alternative COAs in near real time. Such an approach, based on research and development carried out over the last five years, is now feasible. The approach is described in this paper.

The first step is to develop and select a Course of Action that will lead to a desired outcome. A Course of Action is composed of a timed sequence of actionable events that are expected to cause the desired effects. In current practice, probabilistic models that relate causes to effects are used to identify the set of actionable events that yield the greatest likelihood of achieving the desired outcomes and effects. Note that these models do not include timing information. The selected set of actionable events is provided to planners who use experience to select, assign, and schedule resources to perform tasks that will cause the actionable events to occur. The schedule of tasks with the assigned resources constitutes a plan. Outcomes, in terms of effects, are critically dependent on the timing of the actionable events.
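The paper's definition of a COA as a timed sequence of actionable events can be made concrete with a small data-structure sketch. This is an illustrative Python model only; the event names and fields are invented and do not come from CAESAR II/EB:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ActionableEvent:
    name: str
    time: float  # scheduled occurrence time, e.g. hours into the operation

@dataclass
class CourseOfAction:
    name: str
    events: list  # list of ActionableEvent

    def ordered(self):
        # The timed sequence that planners turn into a schedule of tasks.
        return sorted(self.events, key=lambda e: e.time)

# Hypothetical COA: diplomacy first, sanctions two days later.
coa = CourseOfAction("incremental", [
    ActionableEvent("sanctions", 48.0),
    ActionableEvent("diplomatic_mission", 0.0),
])
assert [e.name for e in coa.ordered()] == ["diplomatic_mission", "sanctions"]
```

The key point the paper makes is exactly what this structure carries beyond a plain set of events: the timing, on which the achieved effects critically depend.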

The problem requires the synthesis of a number of approaches that have been emerging in the last few years from basic research efforts by DOD and industry. Indeed, the rapid improvement in computational capability and the availability of design tools have made the process of going from an idea to a proof of principle much more rapid.

The process diagram in Figure 1 identifies four principal functions of Effects Based Operations and three feedback mechanisms that enable these functions to be accomplished. This conceptualization expands the conventional C2 process to include not only the traditional Battle Damage Assessment (BDA) feedback loop, referred to here as Action Assessment, but also two other feedback loops: Dynamic Battle Control and Effects Assessment. There is also a fourth loop not considered here, the real time shooter assessment loop, often referred to as Execution Control. The distinction between Execution Control and Dynamic Battle Control is that the latter involves the controllers and sometimes the planners. The Dynamic Battle Control loop allows for changes after the plan has been disseminated, while the longer loop involves assessment of how well the actions being taken are achieving the desired effects or how well the goals are being met. Each one of these loops precipitates a different response. The Dynamic Battle Control loop affects the execution of the plan through real and near real time retasking of assets. The Action Assessment loop affects the development of the next day's plan. The Effects Assessment loop leads to reconsideration of the Course of Action being followed and possibly to the selection of an alternative COA to meet the changing circumstances.

More specifically, the forward process includes COA development, COA selection, Planning, and Execution. As Fig. 1 shows, the first three stages require the close interaction of Intelligence and Planning, while the last two require the integration of Planning and Execution. The latter is already occurring in the case of air operations, while the former is beginning to take form.

Once the forward process has been completed, the execution of the resultant plan induces the feedback process. Once we begin to take actions and other events occur, the process and tools must track our progress in achieving the desired effects. Measures (triggers) must be developed for changing COAs. The tools and the process must facilitate the ability to make changes to plans in a dynamic manner, while the plans are being executed. The actionable events are specific; the first feedback loop, dynamic battle control/dynamic planning, involves local adjustments to the specific actions of resources as they perform planned tasks. The assessment is conducted by operational controllers attempting to ensure that the tasks and actionable events occur according to the plan. The second feedback loop, action assessment, addresses the measurement and evaluation of whether the actionable event occurred and to what extent. For example, in conventional air warfare, this would be equivalent to measuring whether the bombs hit their targets and the extent of damage they have inflicted. Adjustments are made to future plans to account for actionable events that were scheduled but did not occur, or for observations about the immediate impact of the actionable events. The third feedback loop assesses progress toward the overall desired effects. Given that the actionable events have occurred, have the effects been achieved? Given that the blue forces have achieved a planned level of bomb damage through the air campaign, have they forced the adversary to change his policies? Changing the policy is a desired effect. If events are not unfolding as originally envisioned, it may be necessary to change or adapt the COA. Ultimately, this feedback is used to assess whether the goal has been met when certain effects have been achieved.

The first observation is that the internal loop, if it is fast enough, permits dynamic planning. The latter forces the integration of planning and execution, since the concept of dynamic planning breaks down the paradigm of a fixed plan to which ad hoc changes are being made. Implementation of dynamic planning results in a fluid, evolving integrated plan that is being modified as it is being executed.

The presence of the two outside feedback loops in Fig. 1 distinguishes traditional planning and execution from Dynamic Effects Based Command and Control (DEBC2). In the same way that dynamic planning integrates planning and execution, DEBC2 integrates intelligence with planning. The establishment of cause-effect relationships between actionable events and effects, or in the reverse direction, the inference of the occurrence of events from the observation of effects, is an activity that is carried out by intelligence analysts. By closing the two loops, the paradigm requires intelligence to become an integral part of the dynamics of the planning process, rather than only providing inputs to it.
This process can now be expressed in terms of specific activities that need to be performed and the tools and techniques that support them. This is illustrated in Fig. 2 using the IDEF0 formalism. The first activity is the analysis of the situation using several modeling techniques. This activity is carried out by situation analysts who are usually intelligence analysts. The second activity is the development and selection of alternative courses of action. In the case of DEBC2, the proposed system should be capable of being used to generate a variety of contingency COAs and plans and also of being used (with scenarios) to evaluate these COAs and plans in terms of their likelihood of achieving desired effects. It should also be capable of being used to generate plans in near real time for unanticipated circumstances. The third activity is to generate plans for the alternative Courses of Action. The approved plan is disseminated to the units that carry out the tasks in the plan and to operational controllers who monitor the execution. In the fourth activity, the execution of the plan is controlled using the capability to exercise all three feedback loops shown in Figure 1.

The first activity produces a set of models that are at the heart of providing the capability for dynamic effects based command and control. The development of the first of these models starts with the process shown schematically in Fig. 3. The goals are set by the National Command Authority at the strategic level and by the Commander for the operational level. It is then determined that, to reach the goals, certain effects must be achieved. This determination can be accomplished using probabilistic modeling tools (e.g., Influence net modeling) such as SIAM [1], as shown in Fig. 4. An influence net model allows the intelligence analyst to build complex models of probabilistic influences between causes and effects and between effects and actionable events. This is shown in Fig. 5, which also implies the existence of a library of models that can be used as modules to create new influence models appropriate for the specific situation.

The Influence net model is then used to carry out sensitivity analyses to determine which actionable events, alone and in combination, appear to produce the desired effects. It should be noted that Influence nets are static probabilistic models; they do not take into account temporal aspects in relating causes and effects. However, they serve an effective role in relating actions to events and in winnowing the large number of possible combinations. The result of this exercise is the determination of a number of actionable events that appear to produce the desired effects, together with an estimate of the extent to which the goal can be achieved.
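This kind of sensitivity analysis can be illustrated on a toy influence net. The sketch below uses a noisy-OR combination rule as a stand-in for SIAM's causal-strength calculus; the strengths, baseline, and event names are invented for illustration and are not taken from the paper:

```python
from itertools import product

# Toy influence net: one effect node influenced by three actionable events.
# Strengths are assumed values, not the paper's data.
STRENGTHS = {"diplomacy": 0.3, "sanctions": 0.4, "covert_action": 0.6}
BASELINE = 0.05  # leak probability: the effect may occur with no action taken

def effect_probability(active):
    """P(effect) under a noisy-OR model given the set of active events."""
    p_not = 1.0 - BASELINE
    for event in active:
        p_not *= 1.0 - STRENGTHS[event]
    return 1.0 - p_not

# Exhaustive sensitivity analysis: evaluate every combination of events.
results = {}
for combo in product([False, True], repeat=len(STRENGTHS)):
    active = {e for e, on in zip(STRENGTHS, combo) if on}
    results[frozenset(active)] = effect_probability(active)

# Under noisy-OR the model is monotone, so taking all three actions
# yields the highest probability of the desired effect.
best = max(results, key=results.get)
assert best == frozenset(STRENGTHS)
```

Note that, exactly as the paragraph above states, nothing in this computation depends on *when* the events occur; timing enters only after conversion to an executable model.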

Once the influence net of the situation has been developed, the situation analyst converts it into an executable model that allows the introduction of temporal aspects (Fig. 6). An automatic algorithm that performs this conversion has been developed, tested, and demonstrated. A Colored Petri Net model is developed using the structural and probabilistic information (the influences) contained in the Influence net model (Wagenhals et al., 1998). The current probabilistic equilibrium models (Influence nets) used for situation assessment contain a great deal of information in the form of beliefs about the relationships between events and the ultimate outcome or effect. They have an underlying rigorous mathematical model that supports analysis. However, they provide only a single probability value for a given set of actionable events and do not capture the effect of the sequence or timing of those events. Additional information needs to be inserted to account for the temporal and logical sequencing of actionable events. A particular sequence of actionable events represents an alternative Course of Action. Note that in a threat environment proper sequencing is critical; the reversal of two operations can endanger lives and affect critical operations. Consider a trivial example: putting on protective equipment and then stepping into a hazardous environment, vs. stepping into the hazardous environment and then putting on the protective equipment. While this case is obvious, such reversals are not easily observed in a complex scenario with many concurrent tasks. The executable model brings these issues to the fore.

Recent research by the GMU System Architectures Laboratory has shown that it is possible to enhance these models so that the impact of timing of the inputs on the outcomes/effects can be determined. This impact can be represented by the timed sequence of changes in the likelihood of the outcomes/effects determined by the timing of the actionable events. The sequence of changes in probability is called the probability profile. It is a key measure of the effectiveness of a COA that can be used to evaluate COAs during their development and to determine when and how to change the COA during execution.
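The probability profile described here can be sketched in miniature: recompute the effect probability each time a scheduled event takes hold. The noisy-OR update below is a simplified stand-in for the Coloured Petri Net computation, with invented strengths:

```python
def noisy_or(active, strengths, baseline=0.05):
    """P(effect) given the set of actionable events that have occurred."""
    p_not = 1.0 - baseline
    for e in active:
        p_not *= 1.0 - strengths[e]
    return 1.0 - p_not

def probability_profile(schedule, strengths, baseline=0.05):
    """schedule: list of (time, event) pairs. Returns the step-wise
    profile [(time, P(effect))] as each event occurs in time order."""
    profile = [(0.0, noisy_or(set(), strengths, baseline))]
    active = set()
    for t, e in sorted(schedule):
        active.add(e)
        profile.append((t, noisy_or(active, strengths, baseline)))
    return profile

# Hypothetical COA: diplomacy at hour 6, sanctions at hour 24.
strengths = {"diplomacy": 0.3, "sanctions": 0.4}
prof = probability_profile([(24.0, "sanctions"), (6.0, "diplomacy")], strengths)
assert prof[-1][1] > prof[0][1]  # probability rises as events take hold
```

A profile like `prof` is exactly the COA-level measure of effectiveness the paper describes: two COAs with the same final probability can still differ in how early the probability rises.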

The executable model, when properly initialized with a scenario, can be used in simulation mode to test the various COAs to determine their effectiveness by generating the timed probability profile for the particular COA. (Fig. 7) The problem and the assumptions can be shown on the future Display Wall at the Joint Task Force level in which the situation is presented (say, the relevant Common Operating Picture) along with alternative Courses of Action and their assessment. A Commander can then make an informed choice and direct the planning staff to prepare the detailed plan for the chosen COA.  

Carrying out simulations using the executable model is not the only way in which COA analysis and evaluation can be conducted. State Space Analysis of the Colored Petri net model of the influence net can be conducted to reveal all of the probability sequences that can be generated by any timed sequence of actionable events. The result of the state space analysis is a State Transition Diagram that is mathematically a lattice. This state transition diagram can be easily converted to a plot showing the range of probability values that can exist at each step in any probability profile. This technique allows the analysts to see, at a glance, all of the potential effects that the timing of the actionable events can have. The analyst can then select the profile that gives the best results. Once the untimed profile has been selected, procedures using a temporal logic application called TEMPER (Zaidi and Levis, 1997) can be used to determine the temporal relationships between the actionable events that will generate the selected probability profile. The set of models composed of the influence net, the Colored Petri Net, the Timed Point Graphs from the Temporal Logic formulation, and the State Transition Diagram is called the Common Planning Problem. It is these models, created in the first activity, that enable the forward and feedback Dynamic Effects Based Command and Control process illustrated in Figure 1.
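For a small model, the essence of this state-space analysis can be mimicked by brute force: enumerate every untimed ordering of the actionable events and record the probability sequence each generates. A real Coloured Petri Net state space would be computed by a CPN tool; the noisy-OR model and strengths below are assumptions for illustration:

```python
from itertools import permutations

def noisy_or(active, strengths, baseline=0.05):
    p_not = 1.0 - baseline
    for e in active:
        p_not *= 1.0 - strengths[e]
    return 1.0 - p_not

# Invented strengths for three hypothetical actionable events.
strengths = {"a": 0.2, "b": 0.5, "c": 0.3}

# Enumerate every ordering and the probability sequence it generates
# (a stand-in for the state transition diagram / lattice).
profiles = set()
for order in permutations(strengths):
    active, seq = set(), []
    for e in order:
        active.add(e)
        seq.append(round(noisy_or(active, strengths), 6))
    profiles.add(tuple(seq))

# Every ordering ends at the same probability (all events have occurred),
# but the intermediate values differ with the ordering - which is exactly
# the range of values the plot of the untimed profiles displays.
assert len({p[-1] for p in profiles}) == 1
assert len({p[0] for p in profiles}) > 1
```

Plotting, at each step, the minimum and maximum values across `profiles` gives the band of attainable probability values from which the analyst selects a target profile.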

In the second activity of the process, the operational planners and the situation analysts use the models of the common planning problem to select candidate COAs. The concept for this procedure is shown in Fig. 8. The analyst uses the State Transition Diagram to construct the plot of the untimed probability profiles. He selects candidate profiles using a set of metrics and determines the temporal relationships of the actionable events that will generate these sequences using the temporal logic algorithms. These COAs are run in the executable model to generate the timed probability profile for final selection. In the example of Figure 9, COA 1 is preferred over COA 2 because it has higher probability values at all time points and reaches the highest probability the fastest.

Having selected a COA, a detailed executable plan is developed in the third activity of Figure 2. The existence of the executable model (structured in an object oriented manner so that it can be instantiated at different levels of abstraction) gives the opportunity to test the plans in simulation mode and also to monitor their execution by inserting actual events as they occur. This is a required capability for dynamic planning; the state of the system must be known in order to insert new tasks, eliminate existing ones, or redirect ongoing ones.

The fourth activity involves the continual assessment of the execution of the actionable events, the assessment of their effects, and the impact they have on achieving the goal. The three loops correspond, very approximately, to measures of performance, measures of effectiveness, and measures of force effectiveness. The models of the Common Planning Problem can be used in the assessments associated with each feedback loop. During the execution of a plan, there are two major factors that can impact the expected effectiveness of that plan. First, the timing of the actionable events may change as the resources perform the tasks in the plan. The impact of these timing changes in terms of the timed probability profile can be quickly examined using the executable model of the Common Planning Problem. If anticipated timing changes have an adverse effect on the probability profile, adjustments to the timing can be determined that will bring the profile within acceptable levels. The second type of change involves the occurrence or non-occurrence of anticipated events in the influence net. In the planning mode, events were assumed to occur with some probability; in the assessment mode, events occur with probability one or zero, depending on whether or not they occurred. This changes substantially the computational model incorporated in the Colored Petri Net but not the structure of the model. The impact of these observations on the timed probability profiles can be observed by updating the elements of the Common Planning Problem.
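The switch from planning mode (events occur with assumed probabilities) to assessment mode (observed events are pinned to one or zero) can be sketched as follows. The uncertain-event noisy-OR formula, event names, and numbers are all invented for illustration:

```python
def effect_prob(event_probs, strengths, baseline=0.05):
    """P(effect) with uncertain events: event e occurs with probability
    event_probs[e] and, if it occurs, contributes via noisy-OR."""
    p_not = 1.0 - baseline
    for e, p in event_probs.items():
        p_not *= 1.0 - p * strengths[e]
    return 1.0 - p_not

strengths = {"strike": 0.5, "sanctions": 0.3}

# Planning mode: assumed likelihoods of the actionable events occurring.
planning = {"strike": 0.7, "sanctions": 0.7}
# Assessment mode: the strike was observed to occur, the sanctions were not.
observed = {"strike": 1.0, "sanctions": 0.0}

# Same model structure, different inputs - the assessed probability of the
# effect shifts once observations replace assumptions.
assert effect_prob(observed, strengths) != effect_prob(planning, strengths)
```

This mirrors the paper's point: the observation step changes the numbers flowing through the model, not the structure of the Common Planning Problem.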

All parts of this process have been prototyped and executed using the suite of tools called CAESAR (Computer Aided Evaluation of System Architectures). Several case studies have been run and demonstrated, ranging from a small influence net that illustrates the concepts to a large influence net (about 100 nodes) representing a complex situation. The next section contains a description of the small illustrative example.


The operation of CAESAR II/EB is illustrated through a hypothetical “day in the life” of such a system. Assume that a crisis emerges. Country B has invaded a neighboring country and a key issue is whether the leader of the country believes that he can succeed in this undertaking. The crisis action team is constituted and begins to evaluate the situation and consider options. An existing influence net that describes the decision making process of Country B is retrieved from the library of models and the analyst modifies it directly to reflect the specifics of the crisis. There are many actionable events, ranging from diplomatic efforts by country A all the way to a declaration of war by a coalition of nations. The analyst carries out a sensitivity analysis of these alternative actionable events and determines that three particular actionable events may be sufficient at this stage, namely: a diplomatic mission by country A to country B; sanctions by the international community (through the United Nations); and a covert mission by country A that causes severe damage to the leader’s arsenal. The influence net, with initial probabilities of occurrence of the actionable events of 0.5, 0.5, and 0.0, respectively, is shown in Fig. 9. The result of the analysis is that the probability that country B will withdraw is only 0.3. However, if all three actions take place with probability 1, then the probability of the outcome rises to 0.9, which is the highest value that can be attained in this influence net.

The influence net is then converted automatically by CAESAR II/EB into a Colored Petri net, as shown in Fig. 10. However, temporal information must be entered. This information is of two types: First, the temporal characteristics of the system as represented by the influence net such as communication delays, procedural delays, etc. The second type is the time sequencing of the actionable events. Even though there are only three events here, there is a large number of alternatives since we allow concurrency of events. Using the analyst’s and planner’s experience, the number of event sequences can be reduced substantially. Given that the outcome of the sensitivity analysis was to carry out all three actions and given that the covert action should follow the diplomatic efforts, two sequences were chosen as the alternative courses of action: (a) the incremental approach: first country A’s diplomatic mission; then the international sanctions, and finally the covert action; and (b) the forceful approach: concurrent diplomatic efforts followed by covert action if diplomacy is not successful.
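The two candidate sequences can be written as timed schedules and their probability profiles compared. The paper does not publish the parameters behind its 0.3 and 0.9 figures, so the strengths, baseline, and times below are invented; the noisy-OR update is a stand-in for the Coloured Petri net computation:

```python
def noisy_or(active, strengths, baseline=0.1):
    p_not = 1.0 - baseline
    for e in active:
        p_not *= 1.0 - strengths[e]
    return 1.0 - p_not

# Assumed influence strengths for the three actionable events.
strengths = {"diplomatic_mission": 0.35, "sanctions": 0.35, "covert_action": 0.6}

# (a) incremental: one action at a time; (b) forceful: concurrent diplomacy
# and sanctions, with the covert action held in reserve. Times are in hours.
incremental = [(0, "diplomatic_mission"), (72, "sanctions"), (144, "covert_action")]
forceful = [(0, "diplomatic_mission"), (0, "sanctions"), (144, "covert_action")]

def profile(schedule):
    active, out = set(), []
    for t, e in sorted(schedule):
        active.add(e)
        out.append((t, noisy_or(active, strengths)))
    return out

p_inc, p_force = profile(incremental), profile(forceful)
# The forceful approach reaches the second probability level at t=0
# rather than t=72, so its profile dominates in time.
assert p_force[1][0] < p_inc[1][0] and p_force[1][1] >= p_inc[1][1]
```

Under these assumed numbers the comparison comes out the way the paper's Fig. 12 does: the concurrent schedule reaches a high probability earlier, before the covert option is ever exercised.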

The Colored Petri net (Fig. 11) is used in the simulation mode to produce the two probability profiles shown in Fig. 12. Clearly, approach (b) is preferable; it shows a substantially higher probability of achieving the goal without ever resorting to the covert mission.


An approach to Course of Action development and selection for effects based operations has been described, CAESAR II/EB, a decision support tool prototype, has been presented, and an example has been used to illustrate its operation.


This work was supported in part by the US Office of Naval Research under grant no. N00014-00-1-0267 and by the US Air Force Office for Scientific Research under grant no. F49620-95-0134. The author would like to acknowledge the contribution of the System Architectures Laboratory staff: Lee Wagenhals, Insub Shin, and Daesik Kim, in the development of CAESAR II/EB.

Rosen, J. A., and Smith, W. L. (1996). “Influence Net Modeling with Causal Strengths: An Evolutionary Approach,” Proc. Command and Control Research Symposium, Naval Postgraduate School, Monterey, CA, pp. 699-708.

Wagenhals, L. W., Shin, I., and Levis, A. H. (1998). “Creating Executable Models of Influence Nets with Coloured Petri Nets,” Int. J. STTT, Springer-Verlag, Vol. 1998, No. 2, pp. 168-181.

Zaidi, A. K., and Levis, A. H. (1997). “TEMPER: A Temporal Programmer for Time-sensitive Control of Air Operations,” Paper GMU/C3I-190A-P, C3I Center, George Mason University, Fairfax, VA.


1 SIAM is a COTS product developed by SAIC (Rosen and Smith, 1996) to support the intelligence community and is used as a module in the CAESAR II suite of tools. Other probabilistic modeling tools such as Hugin, Analytica, and the Effects Based Campaign Planning and Assessment Tool (CAT) under development at AFRL/IF can support the modeling of actionable events and effects.


Guest
Excellent find.  Will have to read this in depth.

See this also in reference to this: (save locally)
Approved for public release; distribution is unlimited.
Air & Space Power Journal - Chronicles Online Journal

Influencing Global Situations:

A Collaborative Approach

Julie A. Rosen, PhD and Wayne L. Smith

1710 Goodridge Drive
McLean, VA 22102
Phone: (703) 556-7354 (703) 448-6522

e-mail: [email protected]
[email protected]


The authors present an approach to investigating the human decision cycle. Of particular interest for this paper is the decision cycle employed by individuals and organizations during crisis and potential conflict. The collaborative approach described here is especially beneficial in today's world of rapidly evolving, global situations within which U.S. security policies and operational plans are generated. The need for collaborative investigation processes such as the authors' innovative approach, called Influence Net modeling, is discussed. To illustrate the concepts and "mechanics" of the collaborative process, examples are taken from an automated system, called SIAM, which was developed to assist Influence Net modeling.

1. Motivation for this Investigation

With the end of the bi-polar political world, decision makers in the U.S. national security arena are faced with an ever-increasing number of situations that have the potential to become crises. In this paper, the term crisis includes situations of economic instability, ideological or cultural contrasts, as well as the more traditional (and oftentimes military-based) political and diplomatic security concerns. These crisis situations may occur while the involved parties are at peace; however, crises left untended or inaccurately estimated tend towards armed conflict situations that affect U.S. national and global stability interests.

U.S. security decision makers, including military planners, no longer face a single national government opponent whose power derives primarily from its military's capabilities. Today, world "actors" capable of generating crisis and instability, perhaps unintended, also include individuals representing multi-national organizations and multi-national states; examples of the former include economic consortia and terrorist cartels; examples of the latter include pan-Islamic countries and the ASEAN nations. The behavior of this set of actors, and their attendant actions, expand the more traditional list of state-sponsored conflict situations. In addition, as technological advances make the "global economy" a reality, conflicts formerly considered "internal disputes" possess the ability to disrupt, even destroy, the processes governing everyday lives of the citizens of many nations. In recognition of these events, the U.S. security arena has expanded the military's roles and missions to include the following:

    * Urban conflict: the insertion and extraction of forces, such as employed in Somalia;
    * Distributed forces: insertion of forces, possibly deep insertion, such as currently deployed throughout the Bosnian theater; and
    * Major regional conflict (MRC): force-on-force deployment to a single theater or multiple, concurrent campaigns.

In addition to the increasing number of crisis situations, the characteristics of today's "actors" differ from the traditional single power studied in great detail during the previous 50 years. Significant effort and cost have been invested in examining the doctrine, policies, and capabilities possessed by the national government of the former Soviet Union. Although well documented, the results of this extensive investigation do not apply to many situations that will arise in the future. Tomorrow's adversaries may not be satisfied with the traditional bi-polar political status quo. In greater contrast, the alignment of multi-national states and non-political organizations will reduce (or eliminate) the significance of politically-based motivations underlying the behavior and actions that can result in crisis or conflict. Motivations such as personal advancement, economic superiority, and the expansion of cultural or religious ideology will take their place.

This diversity of characteristics among (potential) opponents continues to generate situations inconsistent with previous national policy making and planning strategies. In response to this evolving global scene, today's security missions must address situations that precede armed conflict. Examples of operations other than war (OOTW) situations for which national security policy and military planning are required today include:

    * Supporting the non-proliferation of weapons of mass destruction (WMD) by multiple state-sponsored organizations;
    * Pre-empting disruptive/destructive actions of terrorist organizations;
    * Mitigating the adverse effects of multi-national "black market" economic organizations' activities;
    * Supporting humanitarian efforts conducted throughout the globe; and
    * Maintaining peacekeeping missions in regions around the world that, left untended, may move towards conflict.

In short, today's troublesome actors and situations possess a diversity and complexity unparalleled in our nation's history. The impact of this changing world scene is recognized in part by the U.S. defense community, as evidenced in a recent publication on defense strategy: "Future joint warfighting capabilities [include] near real-time knowledge of the enemy and [we must] communicate that to all forces in near-real time..."

Based on the above discussion, one goal for today's decision makers must be to

Establish a process to identify and evaluate a continuum of options
tailored to the behavior of states, groups, and individuals.

However, the characteristics of potential situations are not the only parameters that define U.S. security concerns. Budget realities that headline today's news also must be considered. As the 21st century approaches, the national security community increasingly is mandated to reduce the size of its infrastructure. Combat forces of the next decade will be significantly diminished: the size of U.S. forces as well as the numbers available from our traditional allies. Not only are the warfighting forces "taking the hit"; the planning and intelligence communities similarly are undergoing a reduction in force. In addition to the human factor, national security facilities are reducing their focus, with attendant consolidation mandating the closure of bases, both CONUS and OCONUS. Similar financial constraints upon our allies are reducing the likelihood that "host-country basing" will be available when regional crises arise.

The resulting reduction in national security infrastructure is occurring at the same time that the world is seeing an increase in the potential (and diversity) of situations that require those very resources. Unless addressed properly, applying the remaining forces can result in significant risk to U.S. citizenry, in general, and military personnel, specifically. Therefore, the decision maker's goal, identified above, must be expanded to include:

Establish a process to identify and evaluate a continuum of options
that reduces cost and risk for a spectrum of crises.

2. Statement of the Problem

As scenarios for crisis and conflict arise, members of the U.S. national security community are tasked to examine the behavior and capabilities possessed by both our allies and opponents. Traditionally, two categories of investigation and analysis have been employed to identify influence strategies and their operational implementations:

   1. Seminars, workshops, and informal communications that extract knowledge from experts in the field of study. Sometimes this information is captured in a paper report that presents the results of the study to the decision maker; typically, this capture is performed by a single member of the study group. However, whether or not the results of the knowledge elicitation are documented, the experts' underlying source material, assumptions, justifications, and reasoning are maintained very rarely. Such information is crucial not only for the current decision maker, but also for future decision makers and their analysts who require historical, empirical evidence as the situation evolves.

   2. Mathematical and computer-based models/simulations that attempt to estimate current and future states of "physics-based" phenomena. As with the first category, the results of this type of investigation usually are "watered down" for presentation to the consumer or other analysts. The input parameters and internal "rules" of such models are glossed over in the presentation of these results. Many times only the results of such simulations are sufficient to estimate the status of a situation. However, as with the seminar technique, reducing the documentation and presentation of the model's underlying reasoning may lead to confusion and misinterpretation by the decision maker. The problem is exacerbated when such models are revisited by future decision makers and their analysts.

Too often in the past, the traditional techniques for examining a situation have produced assessments that are not borne out over time. For example, workshop-style analysis indicates that a leader is not believed to possess aggressive intentions, yet an unstable situation develops because that leader is not in control of events. Sometimes, both techniques are employed concurrently, producing conflicting results. For example, observable evidence and "physics-based" models/simulations show that an adversary possesses the technical capability to conduct aggression, but the adversary "backs off" when an outside influence is applied. In this case, the motivation, perception, and intentions of the adversary underlying the resulting behavior may not have been accounted for correctly, if at all.

Moreover, in today's world, technological advances that "speed up" the time line towards crisis, and the proliferation of this technology to more and more actors, mean that (potential) crisis situations will evolve rapidly. In addition, understanding these situations depends on a greater number of parameters that are not "physics-based." The diversity of human motivation and perception must be addressed by today's analysts responsible for identifying influence strategies and plans. Such diversity means that experts from an increasing number of domains must be included in the analysis process; for example, psychologists, historians, economists, international industrialists, diplomats, and philosophers.

However, as the community of analysts diversifies, problems of communication among these experts also increase. Differences in terminology, knowledge, assumptions, and inference/reasoning practices may lead to confusion and irreconcilable disagreement about the anticipated behavior and actions of a troublesome actor. Therefore, not only are a greater number of experts required to identify alternate influence strategies, but the decision maker must understand the interaction of parameters across domains of expertise in order to evaluate the alternate strategies. This requirement expands the original goals (above) to include the following:

Establish a process to identify and evaluate a continuum of minimal cost/risk options for a spectrum of crises that also allows experts to collaborate and document their facts, assumptions, and inferences.

3. Influence Net Modeling - A Collaborative Approach

In an attempt to address these goals of collaborative analysis, the authors have developed a technique for analyzing the causal relations of complex situations. This technique, known as Influence Net modeling, is a combination of two established methods of decision analysis: Bayesian inference net analysis, originally employed by the mathematical community, and influence diagramming techniques, originally employed by operations researchers. As illustrated in the following sections, Influence Net modeling incorporates both an intuitive, graphical method for model construction, and a foundation in Bayesian mathematics for the rigorous analysis of such models.

The domain experts, themselves, create "influence nodes," which depict events that are part of (possibly complicated) cause-effect relations within the situation under investigation. These experts also create "influence links" between cause and effect that graphically illustrate the causal relation between the connected pair of events; this cause-effect relation can be either promoting or inhibiting, as identified by the link "terminator" (an arrowhead or a filled circle). The resulting graphical illustration is called the "Influence Net topology;" a sample topology is pictured in Figure 1.

Notice that this technique allows one influence event to have multiple, possibly conflicting, effects. Similarly, an event may have multiple influences acting upon it. That is, the opinions of experts from diverse fields can be synthesized to account for the combination of influences from distinct domains. Additionally, the Influence Net model incorporates the multi-generation effect of complicated influences. An event may have both a direct and an indirect influence on another event identified in the model. In this way, the accumulated impact of a single, initiating event is accounted for in the expert-constructed model. (Initiating events for the sample topology are located around the perimeter of the net.) At the other end of the Influence Net model, there may be multiple, and possibly conflicting, conclusions of the situation. These Influence Net "roots" may describe ultimate objectives of the influence strategy, or they may depict different "final states" for the situation, including states not addressed by the influence strategy. (The single "root" event for the sample topology is located in the center of the net.)

Figure 1. Sample Influence Net topology.
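The node-and-link topology just described can be sketched as a tiny data structure. This is an illustrative sketch only; SIAM's internal representation is not published, so the class and field names below are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """An influence event in the net."""
    name: str
    belief: float = 0.5                 # 0.0 = absolutely false, 1.0 = absolutely true
    parents: list = field(default_factory=list)  # incoming influence links

@dataclass
class Link:
    """A causal connection from a cause (parent) to an effect (child)."""
    parent: Node
    promoting: bool                     # arrowhead terminator if True, filled circle if False

def add_link(parent: Node, child: Node, promoting: bool) -> Link:
    link = Link(parent, promoting)
    child.parents.append(link)
    return link

# A two-node fragment: an initiating event influencing a "root" conclusion.
sanctions = Node("Sanctions are imposed", belief=0.8)
withdraw = Node("Leader decides to withdraw peacefully")
add_link(sanctions, withdraw, promoting=True)
```

An initiating event has an empty `parents` list; a "root" conclusion is simply a node that no other node lists as a parent.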

4. Influence Net Collaborative Modeling - Implemented In An Automated Decision Support System

The topology of the Influence Net model, constructed for a specified situation by the domain experts themselves, is only one result of this collaborative technique. The likelihood of the identified influence events, as well as the importance of their causal connections, must be quantified in order to perform analysis of the efficacy of alternate influence strategies. The Influence Net modeling technique allows domain experts to assign "beliefs" to the likelihood of initiating influences and "strengths" to each of the causal connections.

The Influence Net modeling technique has been implemented in an automated decision support system called Situational Influence Assessment Module (SIAM); illustrations presented in the succeeding paragraphs are taken from this software application. The "node belief slider bar" of Figure 2 illustrates how experts assign beliefs with the SIAM system. In addition, source material, expert judgment, and inference reasoning underlying the assignment of this event can be documented by the domain experts themselves. This information is stored in the areas designated as "Description," "Comments," and "Source" in Figure 2. Using these areas, the domain experts/analysts are able to maintain a "reasoning trail" as the situation evolves.

Figure 2. Node belief assignment - a sample.

The experts' judgment concerning the truth or falsity of the event is graphically depicted with color in the SIAM system: Four shades of red are available to indicate the judged degree of falsity; four shades of blue to indicate the judged degree of truth; and gray indicates there is no expert opinion either way. The color-coded illustration of the complete version of our sample Influence Net model is depicted in Figure 3.

5. Influence Net Collaborative Modeling - Graphical Construction for Quantitative Analysis

As previewed above, the Influence Net modeling technique can be used in quantitative analysis as well as in producing the model's graphical topology. Quantitative analysis supports the decision maker's need to examine "what if" scenarios for their crisis potential. Furthermore, when the decision maker can identify critical influence events that transform a stable situation into a crisis, then courses of action that effectively mitigate the crisis can be examined. Toward this end, the domain experts are asked to assign quantitative "strengths" of the cause-effect relations illustrated by the graphical "links."

Figure 3. Likelihoods of events are illustrated through color.

In the SIAM system, these "causal strengths" are assigned with slider bars similar to the slider bars used to assign event beliefs. Figure 4 illustrates the method through which the domain experts assign these two strengths for each causal connection. As with other elements of the Influence Net model, the source material and commentary justifying assignment of the causal connection is captured in the areas designated as "Comments" and "Source."

Notice that the cause and effect events of the relation are shown on the ends of the directional link. In addition to the graphical node illustrations of the influence events, their descriptions are included to remind the domain experts of the more complete definition of the events. Also note that, regardless of the estimated beliefs in these nodes, the nodes shown in this figure are gray filled. That is, when assigning the strength (or importance) of the causal relation, we are interested only in the causality itself - not in the truth of the events in today's situation. This causal strength will remain the same even if the situation were to change one's belief in the truth of the influencing events. To assign these two causal strengths, the domain experts answer two questions:

   1. If the likelihood of the influencing event (also called "parent") were absolutely true, what would be the impact on the occurrence of the effect (also called "child")? Would the effect event be more likely? Less likely? No impact at all?

   1. If the likelihood of the "parent" were absolutely false, what would be the impact on the occurrence of the "child"? Would it be more likely? Less likely? No impact at all?

Initially this task might seem confusing. After all, if the influencing event is known to be true, then why address the "false" strength, and vice versa? The reason both sides of this relation are examined is that Influence Net modeling is used to investigate the effects of changes in a situation. For example, if the current leader of a stable government is replaced by an aggressive personality, would a crisis result? Analogously, if today's conditions in an unstable nation were properly addressed through economic development and international aid, would a potential crisis be prevented?

If the expert-assigned causal strengths indicate that the relationship is a "reversing" influence, then the filled circle terminator is drawn. On the other hand, if the causal strengths imply that the influencing event produces an effect that "runs in the same direction," then an arrowhead terminator is used. This combined influence "direction" is indicated in the "Link Information" area of Figure 4. When the causal relationship information is completed, the graphical link is displayed in the Influence Net topology, as illustrated in Figure 3.
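The two questions and the terminator rule can be condensed into a few lines of code. Encoding each answer as a signed strength in [-1, 1] is an assumption made here for illustration; it is not necessarily SIAM's internal scheme.

```python
def link_terminator(impact_if_true: float, impact_if_false: float) -> str:
    """Pick the link terminator from the experts' two answers.

    Each impact is a signed strength in [-1, 1]: positive means the child
    event becomes more likely, negative means less likely, 0 means no impact.
    """
    # An influence "runs in the same direction" when making the parent more
    # likely pushes the child up relative to making the parent less likely;
    # otherwise the influence is "reversing."
    if impact_if_true >= impact_if_false:
        return "arrowhead"
    return "filled circle"

print(link_terminator(0.8, -0.2))   # a promoting influence
print(link_terminator(-0.6, 0.3))   # a reversing influence
```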

Once a consensus on the topology of the Influence Net model is achieved, the decision maker (and the gathered domain experts) must be able to perform "what if" analysis to identify effective influence strategies. The Influence Net modeling technique incorporates a mathematically robust algorithm to compute the cumulative effects of all influences on a specified event. This algorithm, called Belief Propagation, automatically "rolls-up" the complex, and possibly contradictory, influences to determine the likelihood of the event's occurrence. The resulting likelihoods are displayed in the SIAM system using the same color coding discussed above to illustrate the user-assigned beliefs in the initiating events; i.e., four shades of red to indicate the degree of falseness in the event; four shades of blue to indicate the degree of truth; and gray fill to indicate that the overall impact of influences on the event indicates neither true nor false.

Using this algorithm (and its automated implementation in SIAM), the impact of modifying the model's topology can be investigated as the situation evolves. For example, as more information is obtained, the likelihood of an initial event may change from unknown to true. The combined impact of this added knowledge can be identified in real-time using the Belief Evaluation option of the SIAM system. Similarly, as additional cause-effect relationships are identified through the collaborative process, these causal connections can be added to the model through the graphical construction already discussed; their quantitative impact then can be determined through the Belief Propagation algorithm. Therefore, not only can a group of experts from diverse fields of study graphically construct a model of complex causality, but the underlying algorithm facilitates the quantitative examination required to perform sensitivity analyses.
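A toy version of the "roll-up" conveys the recursive flavor of the computation. The aggregation rule below (interpolating each link between its "parent false" and "parent true" impacts, then averaging the shifts) is a deliberate simplification assumed for illustration; SIAM's actual Belief Propagation algorithm is more sophisticated and is not published.

```python
def rolled_up_belief(node, baseline=0.5):
    """Recursively combine all influences on a node into a single belief."""
    links = node.get("parents", [])
    if not links:                       # initiating event: expert-assigned belief
        return node["belief"]
    shift = 0.0
    for parent, if_true, if_false in links:
        p = rolled_up_belief(parent, baseline)
        # interpolate between the "parent false" and "parent true" impacts
        shift += if_false + p * (if_true - if_false)
    shift /= len(links)                 # average the possibly conflicting shifts
    return min(1.0, max(0.0, baseline + 0.5 * shift))

# Two conflicting influences on the same event nearly cancel each other.
leader_hostile = {"belief": 0.9}
sanctions = {"belief": 0.8}
withdraw = {"parents": [(leader_hostile, -0.7, 0.2),   # reversing influence
                        (sanctions, 0.8, -0.1)]}       # promoting influence
print(round(rolled_up_belief(withdraw), 3))
```

Because the recursion re-evaluates the whole net, changing one initiating belief automatically updates every downstream event, which is what makes the real-time "what if" analysis described above possible.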

Figure 4. Link "strength" assignment - a sample.

In addition to the algorithm's automated "roll-up" of expert-provided beliefs and strengths, Belief Propagation accommodates manual overrides at any point in the network model. For example, suppose the computed likelihood of an influencing event does not "agree with expert intuition." One explanation for this apparent disparity is that the Influence Net model may be incomplete; that is, Influence Net modeling can be employed to identify gaps in knowledge about the situation's influencing relations. Another explanation is that the human mind (even the mind of a domain expert) cannot "juggle" the complex combination of possibly conflicting causal relations. An automated Influence Net system such as SIAM can "keep track" of all combinations of cause-effect relationships.

Using the Belief Propagation algorithm's override capability (as implemented in the SIAM application), the decision maker and supporting domain experts can manually constrain the belief of an influence event. In this fashion, the constrained belief might represent a confidence that is more in agreement with "expert intuition." A second use for the override capability is to identify intermediate events that have significant influence impact on the modeled situation. That is, if the decision maker could alter events to agree with the manual override belief, then would the constrained event produce a sufficient influence?

In addition to the quantitative manipulation supported with an override capability, Influence Net modeling as implemented in the SIAM system provides graphical feedback to the modelers. As illustrated in Figure 5, the overridden event is shadowed with a yellow border. Note that the color fill of this node differs from the belief/color fill obtained when using the default Belief Propagation algorithm; compare the color depicting the constrained belief with that filling the same node pictured in Figure 3. Also note that any influencing events "blocked" by the overridden node are "shaded;" i.e., any event that must "go through" the constrained node in order to influence the ultimate objective ("root") event is "grayed out." This shading is employed to inform the modeler that, although the event is included in the graphical topology of the model, its influence impact is "ignored" during quantitative analysis.
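A minimal sketch of the override and shading behavior: a clamped node returns its constrained belief and never consults its parents, so every influence that must "go through" it is ignored, as described above. The node layout and the simple averaging stand-in for Belief Propagation are assumptions made for illustration.

```python
def belief(node, overrides=None):
    """Compute a node's belief, honoring any manual overrides by name."""
    overrides = overrides or {}
    if node["name"] in overrides:       # a clamped node: its own influences
        return overrides[node["name"]]  # are "shaded out" of the computation
    parents = node.get("parents")
    if not parents:                     # initiating event
        return node["belief"]
    # toy aggregation (simple average) standing in for Belief Propagation
    return sum(belief(p, overrides) for p in parents) / len(parents)

a = {"name": "A", "belief": 0.9}
b = {"name": "B", "belief": 0.1}
mid = {"name": "mid", "parents": [a, b]}
root = {"name": "root", "parents": [mid]}

print(belief(root))                     # fully propagated
print(belief(root, {"mid": 0.0}))       # clamped: A and B no longer matter
```

Removing the entry from `overrides` restores the propagated value, mirroring the removal of a manual override in SIAM.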

Figure 5. Graphical feedback of manual overrides.

After the modeler has identified the source of the model's apparent disparity with expert intuition, the manual override constraining the selected influence event's belief can be removed. Once overrides are removed from the model, the influence events' color coding again reflects the results of the automated Belief Propagation algorithm. This technique allows the decision maker and domain experts to vary the number and strengths of influencing events until a consensus is reached regarding the most effective course of action.

6. Influence Net Collaborative Modeling - Comparative Assessment Techniques

The construction and modification of Influence Net models have been the central points of discussion thus far. However, the creation and manipulation of influencing events and their causal relationships constitute only one aspect of this collaborative modeling process. Clearly, the identification of influencing events and documentation of related source material is required to construct an Influence Net model. But once consensus has been reached concerning the model's topology, there are several comparative analysis techniques that can be employed to quantify the impact of influencing events. The Belief Propagation algorithm, just discussed, is one of these techniques. The results generated by this algorithm indicate the overall impact of the model's events on individual events. In this section, we examine the relative contribution of individual events to influencing the situation considered as a whole.

The two techniques discussed in this section - "Driving Parents" and "Pressure Points" - can corroborate "gut" feelings and apparently disparate information that imply a particular conclusion. However, it should be cautioned that Influence Net modeling does not provide "fail safe" proof that an event's occurrence can be "predicted." This modeling technique, including the automated assessment techniques presented below, produces analysis results that indicate a relative ranking of influence impact. That is, no single "right" answer should be derived from the use of Influence Net modeling. Its primary benefit is in the capture of data and inferential knowledge through supporting the collaboration of human experts.

By their nature, automated analysis techniques require implementation as part of an automated system, such as SIAM. Using an automated support utility, these techniques allow decision makers and domain experts to select any event in the Influence Net topology for an in-depth look into its most influential relationships. Specifically, the comparative assessment techniques are employed to examine:

    * The relative impact of contributing influencing events (Driving Parents),

    * The sensitivity of end-state events to contributing initial events (Pressure Points), and

    * Side effects (possibly unintended) of combinations of influencing events.

Clearly, modelers could modify individual influence event beliefs and causal relation strengths by hand to perform the sensitivity analysis needed to identify the events with greatest impact. However, this method would quickly prove time consuming and would require an in-depth understanding of the cumulative impact of complex influencing interactions.

7. Influence Net Assessment Techniques - Driving Parents

The Driving Parents technique identifies the relative impact of events immediately influencing a specified event in the Influence Net topology. The directly influencing events, that is the "parents," of a selected event are examined to determine their individual impact on a selected "child" event. This examination employs the beliefs and causal connection strengths currently active in the model. That is, for the "child" event of interest, the expert-provided parent-child connecting link strengths and the parent event's current belief are employed to evaluate a quantitative impact for the parent-child relation. This evaluation is conducted for each parent event of the selected child. Then the individual impact values are normalized over all parent-child pairs. Note that this technique is not in the category of assessments called "sensitivity analyses." Rather, Driving Parents provides the modelers with a way to partition the complicated Influence Net model into rank-ordered areas of influence, based on the current estimate of the situation.

Figure 6 illustrates the results of the Driving Parents ranking for the indicated selected child node; this screen illustration was generated by the SIAM system, as executed from the Assess menu option. Each of the immediate (parent) influences on the selected child node is illustrated in the "Parent Node" column. These direct influences are listed in descending order of their relative normalized impacts. The "percentage of the influence pie" is shown in the column headed "Relative Impact." In this fashion, the modelers can focus on one particular section of the Influence Net topology. However, since the impact of a parent depends, in turn, on the strengths and beliefs of possibly distant influences, this technique should not be considered the final step in the assessment process.
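A sketch of how such a ranking might be computed: each parent's raw impact is estimated from its current belief and its link strength, then normalized so the impacts sum to 100% (the "influence pie"). The raw-impact formula used here is an assumption; SIAM's exact computation is not published.

```python
def driving_parents(child):
    """Rank the immediate influences on a child by normalized impact."""
    # Raw impact of each parent: |link strength| scaled by current belief.
    # (This formula is an illustrative assumption.)
    raw = {name: abs(strength) * belief
           for name, belief, strength in child["parents"]}
    total = sum(raw.values())
    return sorted(((name, impact / total) for name, impact in raw.items()),
                  key=lambda pair: pair[1], reverse=True)

child = {"parents": [("King Fahd permits entry of foreign units", 0.8, 0.9),
                     ("Sanctions are imposed", 0.9, 0.4),
                     ("Coalition forms", 0.5, 0.6)]}
for name, share in driving_parents(child):
    print(f"{name}: {share:.0%}")       # each parent's slice of the influence pie
```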

8. Influence Net Assessment Techniques - Pressure Points

As indicated above, Driving Parents evaluations employ the current settings of the modeled situation. One of the greater benefits of automated decision support utilities is the ability to "let the machine do the crunching." In particular, automated sensitivity analyses can be performed by relatively low-end computer processors in near-real time. Unlike Driving Parents, the Pressure Point assessment technique is in the category of evaluation methods called sensitivity analysis. Although Driving Parents helps the modelers focus on the more likely areas of influence, individual initial events with the greatest potential to influence an event must be identified in order to determine effective courses of action.

Figure 6. Driving Parents assessment results.

Pressure Points analysis is employed to identify the critical initial events with the greatest potential to increase or decrease the likelihood of occurrence of a specified event. For example, which one or two initiating influences are more likely to cause the "root" objective to occur? If a manageable number of such influences can be identified, then the decision maker has the beginnings of a course of action, without spreading available resources beyond their effectiveness. In this sense, Influence Net modeling supports the "what if" analysis necessary to identify potential actions.

In addition to supporting the decision maker's allocation of resources, the sensitivity analysis results generated through Pressure Points assessment can be employed by analysts responsible for information gathering. Specifically, these results help identify where gaps in currently available knowledge have the greatest potential to invalidate forecasting efforts. With such results, today's decreasing data gathering resources can be assigned to best "cover" the unknowns.

Rather than examining only the current estimate of the "state of the world," Pressure Points assessment considers the range of possibilities allowed if the situation were to be modified in defined ways. The sensitivity of a selected event to an initial influence event is determined from the complete set of influence paths connecting the initial influence and the selected event. Using all possible paths connecting the two events, the quantitative effect of the initial influence on the selected event is evaluated. This evaluation is performed over the complete range of beliefs in the initial influence. That is, as the belief in the initial influence is varied through the scale from absolutely false to absolutely true, the resulting effect on the selected event is monitored.

If the selected event's likelihood changes significantly as the initial influence's belief traverses this dramatic span, then the initial influence is said to have great potential for influencing the selected event. This strong potential can result when the initial influence has multiple, reinforcing paths through which the selected event is affected. On the other hand, if some of these multiple influencing paths "cancel out" the remaining paths, then the overall effect of the initial influence will be slight. (It is noted that a lack of sensitivity of a selected event to an initial influence also results if the initial influence is "buried" in the Influence Net's topology, confirming human intuition.)
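The sweep described above can be sketched directly: vary the initial influence's belief from absolutely false (0.0) to absolutely true (1.0), record the selected event's response, and report the span of movement together with the overall direction. The linear propagation functions used in the example are stand-ins for a full roll-up of the net.

```python
def pressure_point_sensitivity(propagate, steps=11):
    """Sweep an initial influence's belief from 0.0 (absolutely false) to
    1.0 (absolutely true) and measure how far the selected event moves.

    propagate(x) -> the selected event's belief when the initial influence's
    belief is x (i.e., a roll-up of every path connecting the two events).
    """
    xs = [i / (steps - 1) for i in range(steps)]
    ys = [propagate(x) for x in xs]
    span = max(ys) - min(ys)            # total potential to move the event
    # "reversing": raising the initial influence lowers the selected event
    direction = "reversing" if propagate(1.0) < propagate(0.0) else "same"
    return span, direction

# Toy net: two paths partially cancel, leaving a net reversing influence.
print(pressure_point_sensitivity(lambda x: 0.6 - 0.3 * x + 0.1 * x))
```

A large span marks the initial influence as a pressure point; a span near zero indicates either canceling paths or an influence "buried" deep in the topology.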

The variability of the selected event with respect to each of its initial influences is illustrated in Figure 7 under the column "Sensitivity." As shown in this illustration, this parameter can be used to rank the initial influences by their potential to affect the selected event. Note here that the term "sensitivity" is intended to imply a potential to change the likelihood of the selected event; this term does not indicate whether the selected event's likelihood will increase or decrease as the initial influence changes. In the SIAM implementation, this direction of the initial influence's effect is indicated by the contents of the column "Influence."

Since there may be several paths of influence between the initial and selected events, the simple terms "promote" and "inhibit," used to describe individual causal connection strengths, are not sufficient for this purpose. The overall effect of the initial influence on the selected event is said to be "reversing" if an increase in the belief of the initial influence produces a decrease in the selected event's belief. That is, the initial influence's combined effects reverse the outcome, when considering all possible connections with the selected event. If the initial influence's effect "runs in the same direction" as the selected event's likelihood, then no entry is shown in this column.

In addition to the overall sensitivity of a selected event to its initial influences, decision makers need to be informed of the degree to which a situation may be improved or degraded, when compared with the current state. Towards this end, the SIAM system displays these indicators under the columns labeled "Promoting Potential" and "Inhibiting Potential." Summed together, these two potentials for change equal the total sensitivity of the selected event to the initial influence. In essence, an initial influence's sensitivity potentials are defined as:

Promoting Potential - the overall capacity to increase the likelihood of the selected event above the current state

Inhibiting Potential - the overall capacity to decrease the likelihood of the selected event below the current state
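The two definitions translate directly into code. Relative to the current state, the promoting potential is how far the sweep can raise the selected event's likelihood and the inhibiting potential is how far it can lower it; by construction the two sum to the total sensitivity. The linear propagation function is again an illustrative stand-in for a full roll-up of the net.

```python
def potentials(propagate, current_x, steps=11):
    """Split the total sensitivity into promoting and inhibiting potentials.

    propagate(x) -> the selected event's belief when the initial influence's
    belief is x; current_x is the initial influence's present belief.
    """
    xs = [i / (steps - 1) for i in range(steps)]
    ys = [propagate(x) for x in xs]
    now = propagate(current_x)
    promoting = max(ys) - now           # capacity to raise the likelihood
    inhibiting = now - min(ys)          # capacity to lower it
    return promoting, inhibiting

pro, inh = potentials(lambda x: 0.3 + 0.4 * x, current_x=0.5)
print(pro, inh)                         # their sum equals the total sensitivity span
```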

Figure 7. Pressure Points assessment results.

These additional assessment results identify initial influences that pose a significant risk of degrading the overall situation while providing minimal chance of improving it. On the other hand, initial influences with relatively high potential for promoting a desired selected event and relatively low potential for inhibiting it are a prime focus for applying influencing courses of action.

As a side comment, the decision maker may desire that a selected event be false. In the sample scenario illustrated in Figure 3, a U.S. decision maker would desire that the ultimate conclusion "Saddam decides to withdraw from Kuwait peacefully" be true. Suppose a second "root" conclusion were added to this topology: "Saddam decides to invade Saudi Arabia." Again from the U.S. decision maker's perspective, it is desired that this conclusion be inhibited, not promoted. Therefore, Pressure Point analysis would be performed in search of influencing actions that have significant inhibiting potential.

9. Influence Net Modeling - Identifying Unintended Side Effects

Once critical initial events are identified, then the decision maker has the option to apply influencing "actions" to the pressure points. (In the SIAM system, such "action" influences are designated with a star-shaped icon, rather than the standard diamond-shaped icon used to depict situation conditions. These action influences and their causal connections are constructed in the same manner as described in Sections 4 and 5.) In this manner, the addition of action influences to the Influence Net topology supports the collaborative analysis required to determine the efficacy of implementing the identified course of action. Moreover, the addition of such influencing events provides the decision maker with a view into unexpected side effects resulting from these actions.

Consider the most significant pressure point identified in Figure 7: "King Fahd permits entry of foreign military units." By applying appropriate influence on King Fahd's decision cycle, there is considerable potential to promote Saddam Hussein's belief in the resolve of the U.S. government "to push Iraq out of Kuwait." However, this same action might adversely affect another event in the situation. For example, suppose our sample Influence Net model had included the influence of King Fahd's decision cycle on the OPEC trading partners. It is conceivable that actions to promote King Fahd's decision to allow foreign military units into his realm would adversely affect oil prices, anger the OPEC trading partners, and possibly increase Saddam Hussein's stature among Arab world nations. These events, in turn, adversely affect influences on Saddam Hussein's decision to withdraw from Kuwait peacefully.

10. Influence Net Modeling - In Conclusion

Since "real world" causality crucially depends on complex, and sometimes conflicting events, collaboration of a group of domain experts is essential to identify effective, low risk, and cost efficient courses of action. Providing an environmentvirtual or realto discuss current situations and "what if" analysis of these situations is critically valuable to the ultimate consumer's decision making cycle. In addition to the documentation of factual source data, this environment must encourage and capture the essential human inference reasoning process.

Advances in computer processing, software development practices, and availability of relatively inexpensive modeling software tools have the potential to dramatically improve the collaboration process. Rather than previously employed procedures that summarized information gathered from domain experts, today's workshop environments offer real-time access to knowledge of diverse and complementary fields of study. This interactive collaboration, conducted in one room or through electronic communications across the globe, assists decision makers and their supporting experts in sorting and evaluating information required to understand complex "real-world" situations.

Influence Net modeling encourages this interactive collaboration through the use of graphical model construction and assessment. Construction of the graphical Influence Net topology by the experts themselves encourages "face-to-face" discussion, which can lead to consensus and, eventually, credibility in the experts' model. A reasonably complete and accurate model is critical if the decision maker is to select actions that will have maximum effectiveness when applied to the situation as a whole.

Automating Influence Net modeling facilitates the collaboration process by documenting the data, expert reasoning, and assessment results. When the experts no longer are available, or the situation evolves, the captured (electronic) model forms the basis for additional study. Finally, automated systems, such as SIAM, can be employed to produce graphical results that, when incorporated into a publication or presentation, offer the consumer the "picture that's worth one thousand words." Any decision maker or supporting expert in today's fast-paced world is well aware of time pressures to "sell" a plan of action. Graphical modeling and presentation techniques more closely match the human brain's ability to communicate effectively. Rather than automation and science replacing human knowledge, Influence Net collaboration brings the benefits of human interaction back into the spotlight.

Dr. Julie A. Rosen

Dr. Rosen is a senior scientist for Science Applications International Corporation (SAIC). She serves as the Principal Investigator for several U.S. government projects involving the Situational Influence Assessment Module (SIAM) software application, in particular, and Influence Net modeling technology, in general. Her duties on these projects include: contract management, scenario development, software system requirements definition, and software design and implementation of the algorithms used to perform mathematical and statistical reasoning under uncertainty in input data.

Prior to her recent efforts in Influence Net modeling, Dr. Rosen served as Principal Investigator for a project with the US Navy (N871) Strategic Deterrence JMA Assessment Program. In this role, her responsibilities included the initial development of an automated decision support utility that allows policy makers and planners to evaluate the effectiveness and cost efficiency of US Navy systems.

Earlier in her career, Dr. Rosen served as Senior Scientist on several projects for which a proprietary data fusion algorithm was implemented in software systems. This probabilistic algorithm, and accompanying implementation software applications, were developed to automatically fuse contact reports from various single source sensors into an integrated picture of the battlefield for use by field commanders.

As a Senior Consultant at Booz, Allen & Hamilton, Dr. Rosen was responsible for the analysis of the impact of several communications protocols on the performance of an Air Force-fielded network system. The parameters of interest in this work included the communication protocols employed, as well as the degree of error correction coding and diversity. As a staff member of other studies conducted for various DoD agencies, she examined the performance of an optical communications system as it depended on proposed synchronization and coding schemes.

Dr. Rosen holds M.A. and Ph.D. degrees in Mathematics from the University of Maryland. She currently is a staff member with Science Applications International Corporation (SAIC).

Mr. Wayne L. Smith

Mr. Smith is an engineer for Science Applications International Corporation (SAIC). He is the lead engineer for several efforts related to the Situational Influence Assessment Module (SIAM) software application, in particular, and Influence Net modeling technologies, in general. In this role, Mr. Smith has been responsible for the full range of systems and software engineering efforts, including: requirements elicitation and definition, architecture definition, object-oriented design and implementation, software testing, and installation.

Prior to the initiation of the SIAM project, Mr. Smith worked in the area of simulation and analytic modeling of underwater warfare. His work in this area has been in support of the SSN-21 program, the Centurion New Design SSN program, and Low Frequency Active Acoustic (LFAA) programs.

Earlier in his career, Mr. Smith served as Principal Investigator and technical leader of the team that developed and maintained the Interpretive Simulation Program (ISP). The ISP is used by the Navy to perform its Independent Verification and Validation (IV&V) of the Tomahawk Land Attack Missile (TLAM) Operational Flight Software (OFS) and to validate planned TLAM missions. He also supported modeling and validation testing for the TLAM OFS, including performance evaluation of digital terrain data, terrain correlation, Kalman filtering, and inertial navigation.

While a graduate student at the Rensselaer Polytechnic Institute, Mr. Smith served as Course Coordinator and Research Assistant. In this capacity he supported the NASA/USRA University Advanced Design Program, which included an internship at the NASA Lewis Research Center to augment aerospace design education at RPI. Mr. Smith researched and published results on advanced propulsion concepts, trajectory simulation techniques for transatmospheric flight, and multi-cycle engine requirements for single-stage-to-orbit vehicles. Mr. Smith also taught propulsion theory, orbital dynamics, and trajectory simulation.

Mr. Smith holds B.S. and M.S. degrees in Mechanical Engineering from the Rensselaer Polytechnic Institute. He currently is a staff member with Science Applications International Corporation (SAIC).


The conclusions and opinions expressed in this document are those of the author cultivated in the freedom of expression, academic environment of Air University. They do not reflect the official position of the U.S. Government, Department of Defense, the United States Air Force or the Air University.

This article has undergone security and policy content review and has been approved for public release IAW AFI 35-101.

Offline squarepusher

  • Member
  • *****
  • Posts: 2,013
BTW - just want to mention this - I have been enrolled in a higher vocational degree program in software engineering for a few years, and only now have I been able to connect the dots as to what they're trying to do. They basically teach all this stuff, but they never let you in on why you have to learn it. Nor do they tell you that all this tech is basically by and for military purposes.

Under the banner of 'quantitative corporate science', they teach you 'Operations Research' - basic network analysis, Bayes, Laplace's probability theory (which also pops up in this document - in Figure 9 you can see that each 'event' has been given a Laplace probability score) and so on - but they never tell you WHY you're actually studying all of this. When I point out to people that nearly all of this stuff was created by and for military purposes, they tend to act surprised.
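
As an aside, the Bayes/Laplace machinery behind these influence net models is simple enough to show in a few lines. Here is a toy sketch I put together (all numbers invented, not taken from the CAESAR document): Bayes' theorem updating belief in an 'effect' given an observed 'event', plus the noisy-OR rule often used in influence net tools to combine several influencing events.

```python
# Toy sketch of the probability math behind influence nets.
# All probabilities below are made up for illustration.

def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Bayes' theorem: posterior P(H | E) from a prior and two likelihoods."""
    numerator = p_evidence_given_h * prior
    marginal = numerator + p_evidence_given_not_h * (1 - prior)
    return numerator / marginal

def noisy_or(parent_probs, link_strengths):
    """Noisy-OR: each active parent event independently 'tries' to
    cause the child effect with its own link strength."""
    p_not_caused = 1.0
    for p, s in zip(parent_probs, link_strengths):
        p_not_caused *= 1 - p * s
    return 1 - p_not_caused

# Prior belief in the effect: 0.2. The observed event is four times
# more likely if the effect is really coming (0.8 vs 0.2).
print(bayes_update(0.2, 0.8, 0.2))   # 0.5

# Two influencing events with moderate link strengths.
print(noisy_or([0.7, 0.5], [0.6, 0.4]))
```

That is the whole trick: each arc in the net carries a strength, and the simulator multiplies its way through the graph.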

Secondly, software engineering education used to revolve around learning basic programming skills. This they have pretty much done away with. The only thing they teach you at these schools to any great degree is creating Unified Modeling Language schematics - use cases, sequence diagrams, activity diagrams, 'performance targets', and so on. Most of the students think this sucks and that it is actually an impediment to creating a program - but the school doesn't care - they still force this stuff down your throat.

Using Anti_Illuminati's research into the field, I have since found the document where it's stated pretty clearly that the group behind Unified Modeling Language, the OMG, was pretty much steered by Ptech. Now that I see the overarching gameplan, I can see why they're rolling this out - they want this stuff ingrained into all future programs because they want EVERYTHING to be interoperable - and of course, I presume they want all of these interoperable programs to be backdoored by government agencies at will. And the students themselves have to be 'standardized' as well - since they don't teach you 'real' computer science at these schools except for basically working in a group, creating all these UML schematics, and then later on not even implementing half of it.

So they're 'standardizing' the programmers and moving them ever further away from independently creating a program on their own. Then you have a specific edition of the programming IDE that all of the people at these schools use, titled 'Visual Studio Enterprise Architect', and lastly you have the heavy emphasis on Operations Research (which is basically the 'doctrine' that all corporations use today, created in World War II by the Allied Forces). And game theory is basically the engine driving Operations Research.

Hell, during one of these 'quantitative business science' courses they even brought up 'game theory', the 'zero-sum game', the 'minimax theorem' and all that lovely stuff that John von Neumann (another computer pooh-bah who basically invented most, if not all, of game theory) used when he had to create predictions on how many people would be killed by the atomic bombs, and how to maximize the shock effect. The guy felt no remorse, by the way, for what he did - later on he did much of the same research for the hydrogen bomb.
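
For anyone curious, the minimax idea they teach is easy to demonstrate. A toy sketch (payoff numbers invented): in a zero-sum game, player 1 picks the row that maximizes his worst-case payoff, player 2 picks the column that minimizes player 1's best-case payoff, and when the two values coincide the game has a saddle-point solution.

```python
# Minimal sketch of the maximin/minimax idea from zero-sum game theory.
# Rows are player 1's strategies, columns are player 2's; each entry
# is the payoff to player 1 (which player 2 pays). Numbers invented.

payoffs = [
    [3, 1, 4],
    [2, 2, 2],
    [0, 1, 5],
]

# Player 1's maximin: best payoff he can guarantee himself,
# assuming player 2 always answers with the worst column.
maximin = max(min(row) for row in payoffs)

# Player 2's minimax: the smallest payoff she can hold player 1 to.
cols = list(zip(*payoffs))
minimax = min(max(col) for col in cols)

print(maximin, minimax)  # when equal, the game has a saddle point
```

Here both values come out equal (the middle row and middle column form the saddle point), so neither side can do better by deviating - which is the essence of von Neumann's minimax theorem.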

Beginning in the late 1930s, von Neumann began to take more of an interest in applied (as opposed to pure) mathematics. In particular, he developed an expertise in explosions—phenomena which are difficult to model mathematically. This led him to a large number of military consultancies, primarily for the Navy, which in turn led to his involvement in the Manhattan Project. The involvement included frequent trips by train to the project's secret research facilities in Los Alamos, New Mexico.[1]

Beginning in the spring of 1945, along with four other scientists and various military personnel, von Neumann was included in the target selection committee responsible for choosing the Japanese cities of Hiroshima and Nagasaki as the first targets of the atomic bomb. Von Neumann oversaw computations related to the expected size of the bomb blasts, estimated death tolls, and the distance above the ground at which the bombs should be detonated for optimum shock wave propagation and thus maximum effect.

So - the father of 'game theory' was directly involved in plotting 'planned genocides' of Japanese civilians. Remember folks - this is one of the fathers of modern computers!!! What does that tell you about this profession? Why did IBM of all companies use their IBM Hollerith machines inside the Nazi death camps? It's because computers were tailor-made and engineered for this kind of stuff - killing mass amounts of people. That 'serial number' on the arm of a concentration camp victim was the equivalent of a record in a modern relational database - the serial number referred to a punch card that carried all of the victim's data - his age, his sex, his genetic composition, his measurements, and so on.
Infowars Wiki - Help make this become the official wiki of - contribute!


  • Guest

Smart Grid is evil - it is "The Grid Weapon 2.0".

It is designed by EDS as outlined by the OMG, utilizing the CORBA middleware standard.

The "Agile Methodology", as it is essentially now referred to, is the idea that PROMIS has evolved into a system that infects all it touches. Its applications present users and developers with RBAC policy-controlled frameworks designed to limit their ability to "disrupt administrative control." This is a centralized control mechanism that prevents local operations centers from ever being able to compromise administrative control.

Meaning that the very programming language and core of this system is designed to grant control only to those with express administrative authority as designated by DISA/DoD policy - meaning only your average 3-star general with the proper above-top-secret clearance for the specific project, and his superiors, have the clearance required to control this system.

CORBA by design prevents anything else from ever being possible.
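
For reference, the "RBAC policy controlled frameworks" mentioned above are an instance of role-based access control, which is a very simple permissions model. A minimal sketch (role and permission names invented for illustration):

```python
# Minimal role-based access control (RBAC) sketch.
# Roles, permissions, and actions are invented for illustration.

ROLE_PERMISSIONS = {
    "operator":      {"view_status"},
    "administrator": {"view_status", "modify_policy", "override_local"},
}

def can(role, action):
    """True if the role's permission set includes the requested action."""
    return action in ROLE_PERMISSIONS.get(role, set())

print(can("operator", "override_local"))       # False: locals are locked out
print(can("administrator", "override_local"))  # True
```

The point is that every request is checked against a role table defined centrally, so whoever writes the role table decides what local users can ever do.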

So no one other than COINTELPRO, CIA, and NSA shills can touch it.

As I said before it is designed expressly to wage false flag cyber terror and facilitate martial law and information control as a function of this weapon.
"Hackers Breach Virginia Health Database, Demand $10M Ransom"
OMFG this is a total false flag.

Totally impossible without this being an inside job...

State medical systems are written on CORBA.
CORBA prevents that, because CORBA is PTECH.


They are pushing very hard to kill the internet...
Funny this should come right after Anti_Illuminati and I started running our show exposing how the entire internet has been backdoored for Nick and Jay Rockefeller's personal gain...

[ISN] 'Chinaman' dethrones 'Hacker' on cyber-terror hit parade

Declan McCullagh 06.22.01

Forget the supposed menace of teen hackers casually bypassing the security of U.S. military computers.

The real worry isn't a teen like Analyzer -- the alias for an Israeli youth who penetrated dozens of Defense Department computers -- but foreign governments, according to a hearing organized by the U.S. Congress' Joint Economic Committee.

On Thursday, Sen. Robert Bennett (R-Utah) dismissed malicious hackers as "nothing more than a nuisance" during a hearing entitled "Wired World: Cyber Security and the U.S. Economy."

Even tech-savvy terrorists still pose only "a limited cyber threat" compared with enemy nations, said Lawrence Gershwin, a science and technology specialist at the CIA's National Intelligence Council. He said Russia and China had active programs, as does the U.S.

"For the next 5 to 10 years or so, only nation states appear to have the discipline, commitment and resources to fully develop capabilities to attack critical infrastructures," Gershwin said.

The tone was remarkably different from the official line in 1998, when Deputy Secretary of Defense John Hamre described Analyzer's attacks as highly disturbing, "organized and systematic" intrusions into unclassified military networks.

In June, an Israeli court sentenced Analyzer -- whose name is Ehud Tenenbaum -- to probation instead of jail time. He's currently the chief technologist for the 2XS security firm.

This hearing comes after years of high-level discussions, commissions and debate in Washington about the possibility of so-called cyber attacks that could be launched against U.S. private or government sites. Warnings of a looming electronic "Pearl Harbor" prompted former President Clinton to sign Presidential Decision Directive 63, which created a critical infrastructure protection plan.

A draft (PDF file) of the plan published last year warns: "In the next war, the target could be America's infrastructure and the new weapon could be a computer-generated attack on our critical networks and systems. We know other governments are developing that capability. We need, therefore, to redesign the architecture of our national information infrastructure."

That's a broad and not very well-defined concept that includes, according to the document, shielding "defense facilities, power grids, banks, government agencies, telephone systems and transportation systems" against everything from Osama bin Laden to a rogue Word macro virus.

Some government officials have even called for the military to be involved in protecting civilian networks -- presumably Internet peering points and backbone providers -- against electronic intrusions, a prospect that worries civil libertarians.

The CIA's Gershwin said that U.S. adversaries "have access to the technology needed to pursue computer network operations.... Both the technology and access to the Internet are inexpensive, relative to traditional weapons, and require no large industrial infrastructure."

Peggy Lipps, a director at the BITS Financial Services Security Laboratory, stressed that more international cooperation among police and more laws were needed.

"Physical jurisdiction is irrelevant in coping with crimes conducted across borders," Lipps said. "Several efforts are underway to address the international dimension of critical infrastructure protection, and the Congress should be made aware of their implications."

Andrew Osterman in Washington contributed to this report.

By Thomas C Greene in Washington
Posted: 23/06/2001 at 18:59 GMT

After years of failure trying to generate mass paranoia with the 'pitiless teenage hacker', the US government this week trotted out its new and improved cyber-terror strawman: the one-billion-strong 'Yellow Menace'.

The classic disaffected teenager is "nothing more than a nuisance," US Senator Robert Bennett (Republican, Utah) scoffed during a Congressional Joint Economic Committee hearing entitled "Wired World:  Cyber Security and the US Economy" which convened on Thursday.

Apparently the government is taking no chances with the sort of ridicule it grew accustomed to in the Vatis/Hamre/Clarke era, and has decided to leapfrog over the next logical evolutionary step on the threat escalator (i.e., the 'Islamic Digital Terrorist' or 'Mad-skillz Mafioso') straight to adversary nations whose military establishments are creating vast divisions of deadly Cyberspace Troopers.

The shift in rhetorical focus was neatly summed up by CIA Science and Technology National Intelligence Officer Lawrence Gershwin, who told Congress that for the foreseeable future, "only nation states appear to have the discipline, commitment and resources to fully develop capabilities to attack critical infrastructures."

So that's it then. We're going to miss the pimply young monosexuals with which the Clinton Administration's military apparatus was so obsessed, though of course we look forward to meeting our new national Nemesis as Uncle Sam gradually defines him to the press....


Internet pedophiles are propagating so fast that US law enforcement is completely overwhelmed, and Congress is therefore toying with the idea of rolling back essential civil protections so they can be hunted down properly.

US Representative Nancy Johnson (Republican, Connecticut) has introduced a bill called the "Child Sex Crimes Wiretapping Act of 2001," which would give Feds and cops a heap more freedom to tap the telephones of 'sexual predators' discovered luring children in chat rooms.

According to Johnson's bit of feel-good imbecility, the discovery of Internet-based crimes such as child enticement and trading child pornography would qualify a suspect for a fast-track telephone wiretap. This sounds like something the Feds will adore, so no doubt they'll have to assign even more FBI agents to hang around in chat rooms pretending to be thirteen-year-old girls, as this delightful satire describes.

Incredibly, the House Judiciary Subcommittee on Crime approved the bill this week, though it's anyone's guess how it would fare in full committee or on the floor. A number of critics -- US Representative Robert Barr (Republican, Georgia) chief among them -- have already expressed doubts.

"I appreciate the concern [for due process of law], and I respect it; but I hope it won't stand in the way of giving our law enforcement the power to combat this epidemic," Johnson is quoted by Newsbytes as saying.

Epidemic? Oh, right; we forgot that pedophiles didn't exist before the Internet....


The US Department of Justice (DoJ) has seen fit to submit a supplementary brief in the appeal of 2600, which is being sued by entertainment industry lobbyists for making the DeCSS utility which descrambles DVDs available via its Web site.

The DoJ simply adores the Digital Millennium Copyright Act (DMCA) under which 2600 is being punished, and its chief concern is persuading the appellate court that the Act is a really fine piece of legislation.

The defendant is arguing, among other points, that the DMCA violates fair use and other provisions of the Audio Home Recording Act. The DoJ, in this case, is primarily concerned with defending the DMCA.

The Act is "reasonable and supported by substantial evidence in the record before Congress," DoJ says, and concludes that "it is therefore Constitutionally sound."

Furthermore, the DoJ FUD-Meisters add, the DMCA was not rash or overreaching, because it's solely responsible for preventing every scrap of copyrighted content on the Internet from vanishing without a trace.

"Congress was under no obligation to wait until the Internet withered from lack of content. Rather, Congress acted wisely to prevent that harm by fortifying a new medium of communication against very real and crippling technological assaults," the Department writes.

'Crippling technological assaults'. It seems we've ended right where we began this edition of the Roundup. No doubt we'll soon be hearing that the People's Liberation Army is involved...

Offline squarepusher

  • Member
  • *****
  • Posts: 2,013
This is such an explosive article that I do not even know where to begin, so let's just post it in full. This ties in directly to the document linked in this thread, 'An Architecture For Effects Based Course Of Action Development (CAESAR II/Eb)'. This is why this is such a big deal - Iraq is the current battle laboratory for Alexander Levis' temporal, behavior-inferring artificial intelligence systems such as CAESAR III and Pythia.

Alexander Levis: Pentagon Asks Academics for Help in Understanding Its Enemies

From Science Magazine, April 26, 2007

A new program at the U.S. Department of Defense would support research on how local populations behave in a war zone

The Iraq War was going badly in Diyala, a northern province bordering Iran, in late 2005. A rash of kidnappings and roadside explosions was threatening to give insurgents the upper hand. Looking for insights on how to quell the violence, the U.S. Department of Defense invited a handful of researchers funded by the agency to build computer models of the situation combining recent activity with cultural, political, and economic data about the region collected by DOD-funded anthropologists.

The output from one model, developed by sociologist Kathleen Carley and her colleagues at Carnegie Mellon University in Pittsburgh, Pennsylvania, connected a series of seemingly disparate incidents to local mosques. Results from another model, built by computer scientist Alexander Levis and his colleagues at George Mason University (GMU) in Fairfax, Virginia, offered a better strategy for controlling the insurgency: Getting Iraqis to take over the security of two major highways, and turning a blind eye to the smuggling of goods along those routes, the model found, would be more effective than deploying additional troops. The model also suggested that a planned information campaign in the province was unlikely to produce results within an acceptable period of time.

Researchers and DOD officials say these insights, however limited, demonstrate a role for the social and behavioral sciences in combat zones. And a new program called Human Social Culture Behavior Modeling will greatly expand that role. John Young Jr., director of Defense Research and Engineering and architect of the program, has asked Congress for $7 million for fiscal year 2008, which begins on 1 October, as a down payment on a 6-year, $70 million effort. Agency officials expect to direct an additional $54 million in existing funds to social science modeling over the next 6 years. Under the new program, the agency will solicit proposals from the research community on broad topic areas announced periodically, and grants will be awarded after an open competition.

Officials hope that the knowledge gained from such research will help U.S. forces fight what the Bush Administration calls a global war on terror and help commanders cope with an incendiary mix of poverty, civil and religious enmity, and public opposition to the U.S.–led occupation of Iraq. “We want to avoid situations where nation states have unstable governments and instability within populations, with disenfranchised groups creating violence on unsuspecting citizens,” says Young. “Toward that goal, we need computational tools to understand to the fullest extent possible the society we are dealing with, the political forces within that government, the social and cultural and religious influences on that population, and how that population is likely to react to stimuli—from aid programs to the presence of U.S. troops.”

The approach represents a broader and more scientific way to achieve military objectives than by using force alone, according to Young. “The military is used to thinking about bombs, aircraft, and guns,” he says. “This is about creating a population environment where people feel that they have a voice and opportunity.”

My comment: Wow - that is actually admitting that you are feeding them delusions and a placebo representing some kind of 'input' - however insignificant.

 Such tools would not replace the war games that military commanders currently use to simulate combat between conventional defense forces. Instead, the models would give military leaders knowledge about other options, such as whether improving economic opportunity in a disturbed region is more likely to restore order than imposing martial law and hunting down insurgents.

My comment: Wow again. Apply this to America and Iraq is really a battle lab for the US Northcom takeover.

Once developed in academic labs, the software would be installed in command and control systems. The plan has drawn mixed reactions from defense experts. “They are smoking something they shouldn’t be,” says Paul Van Riper, a retired lieutenant general who served as director of intelligence for the U.S. Army in the mid-1990s. Human systems are far too complex to be modeled, he says: “Only those who don’t know how the real world works will be suckers for this stuff.”

And he would be right. But psychopathic pieces of shit like Levis don't care - they're getting rich off this stuff, and they revel in flaunting their academic 'genius' merely for the sake of flaunting it.

 But retired general Anthony Zinni, former chief of U.S. Central Command and a vocal critic of the Administration’s handling of the Iraq War, sees value in the program. “Even if these models turn out to be basic,” he says, “they would at least open up a way for commanders to think about cultural and behavioral factors when they make decisions—for example, the fact that a population’s reaction to something may not be what one might expect based on the Western brand of logic.”

The new program is not the first time the military has tried to integrate cultural, behavioral, and economic aspects of an adversary into its battle plans. During the Cold War, for example, U.S. defense and intelligence agencies hired dozens of anthropologists to prepare dossiers on Soviet society. Similar efforts were made during the U.S. war in Vietnam, with little success. But proponents say that today’s researchers have a much greater ability to gather relevant data and analyze the information using algorithms capable of detecting hidden patterns.

A few such projects are already under way. At the University of Maryland, College Park, computer scientist V. S. Subrahmanian and his colleagues have developed software tools to extract specific information about violent incidents from a plethora of news sources. They then use that information to tease out rules about the enemy’s behavior. For example, an analysis of strikes carried out by Hezbollah, the terrorist group in Lebanon, showed that the group was much more likely to carry out suicide bombings during times when it was not actively engaged in education and propaganda. The insight could potentially help security forces predict and counter suicide attacks. “This is a very coarse finding, not the last word by any means,” cautions Subrahmanian, adding that a lot more data and analysis would be needed to refine that rule as well as come up with other, more useful ones. Last year, the researchers applied their tools to provide the U.S. Army with a detailed catalog of violence committed against the United States and each other by tribes in the Pakistan-Afghanistan region.

Other modeling projects are addressing more fundamental questions. With funding from the Air Force Office of Scientific Research, mathematical economist Scott Page of the University of Michigan, Ann Arbor, and his colleagues are modeling societal change under the competing influences of an individual’s desire to act according to his or her values and the pressure to conform to social norms.

Wow again - you know how generically this is formulated, right? This could apply to an Alex Jones or a Ron Paul all the same. They basically admit they DON'T WANT ANY INDIVIDUAL WITH DIVERGENT POINTS OF VIEW. That's how utterly paranoid and out of their minds these guys are.

The work could shed light on which environments are most supportive of terrorist cells, information that could help decide where to focus intelligence-gathering efforts and how to bust those cells. The research could also help estimate, by looking at factors such as rise in unemployment and growing social acceptance of violent behavior, when a population may be plunging into chaos.

For when it's finally time to drop the hammer in the US I suppose...

That in turn could help commanders and policymakers decide when and how to intervene. Accomplishing those goals is a tall order, Page admits. “Despite tons and tons of data from U.S. elections,” he says, “we are still not very good at predicting how people will vote.”

Why do they want to know who people vote for? It's not like they run diddly squat. It's not like the New World Order doesn't have them all in their pockets.

Building comprehensive and realistic models of societies is a challenge that will require enormous amounts of empirical data, says GMU’s Levis, a former chief scientist of the U.S. Air Force. But it is doable, he says, adding that the field will benefit greatly from linking social science researchers and computer scientists. “The goal here is to win popular support in the conflict zone,” he says.

And there you have it. Prima facie evidence that Iraq is the battle lab for the Pentagon's (and Levis') research into behavior modification, inferred behavior modeling, and tons of other totally invasive measures to predict behavior, quell dissent and, eventually, 'predict' crime.
