PrisonPlanet Forum

***THE MAIN BOARDS - Welcome to the Prison Planet Educational Forum and Library*** => General Discussion for the Prison Planet Educational Forum and Library => Topic started by: Dig on August 04, 2010, 02:23:52 am

Title: Solar Tsunami is cover story for Haarp Top Secret Plasma Weapon Test
Post by: Dig on August 04, 2010, 02:23:52 am
Is there a reason that 9 months ago the Nazi fraudster decepticons at NASA went batshit crazy with all of their solar tsunami armageddon psyops?

Just look at the news reports at the end of 2009 fueled by NASA's crack investigators who have robbed the American people out of $Trillions and who have created the eye in the sky with the NRO and other anti-constitutional Bilderberg monsters.

Mystery of the Solar Tsunami—Solved

Sometimes you really can believe your eyes. That's what NASA's Solar Terrestrial Relations Observatory (STEREO) is telling researchers about a controversial phenomenon on the sun known as the "solar tsunami."

Years ago, when solar physicists first witnessed a towering wave of hot plasma racing across the sun's surface, they doubted their senses. The scale of the wave was staggering: It rose up higher than Earth itself and rippled out from a central point in a circular pattern millions of kilometers in circumference. Skeptical observers suggested it might be a shadow of some kind—a trick of the satellite's eye—but surely not a real wave.

"Now we know," says Joe Gurman of the Solar Physics Laboratory at NASA's Goddard Space Flight Center. "Solar tsunamis are real."

The twin STEREO spacecraft confirmed their reality in February 2009 when sunspot 11012 unexpectedly erupted. The blast hurled a billion-ton cloud of gas (a coronal mass ejection, or CME) into space and sent a tsunami racing along the sun's surface. STEREO recorded the wave from two positions separated by 90 degrees, giving researchers an unprecedented view of the event.

"It was definitely a wave," says Spiros Patsourakos of George Mason University, lead author of a paper reporting the finding in Astrophysical Journal Letters. "Not a wave of water, but a giant wave of hot plasma and magnetism."

The technical name is "fast-mode magnetohydrodynamical wave," or "MHD wave" for short. The one STEREO saw reared up about 100,000 kilometers high, raced outward at 250 km/second (560,000 mph), and packed as much energy as 2.4 million megatons of TNT (10^29 ergs).
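The figures quoted here can be cross-checked with a quick unit conversion (the conversion factors below are standard values, not from the article):

```python
# Cross-check of the quoted figures; conversion factors are standard values.
ERG_PER_MEGATON_TNT = 4.184e22   # 1 megaton TNT = 4.184e15 J = 4.184e22 erg
KM_PER_MILE = 1.609344

ergs = 2.4e6 * ERG_PER_MEGATON_TNT   # "2.4 million megatons of TNT"
mph = 250.0 * 3600.0 / KM_PER_MILE   # 250 km/s in miles per hour

print(f"{ergs:.2e} erg")   # ~1.0e29 erg, matching the quoted 10^29 ergs
print(f"{mph:,.0f} mph")   # ~559,000 mph, matching the quoted 560,000
```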

Solar tsunamis were discovered in 1997 by the Solar and Heliospheric Observatory (SOHO). In May of that year, a CME came blasting up from an active region on the sun's surface, and SOHO recorded a tsunami rippling away from the blast site.

"We wondered," recalls Gurman, "is that a wave, or just a shadow of the CME overhead?"

SOHO's single point of view was not enough to answer the question—neither for that first wave nor for many similar events recorded by SOHO in years that followed.

The question remained open until after the launch of STEREO. At the time of the February 2009 eruption, STEREO-B was directly over the blast site, while STEREO-A was stationed at a right angle —"perfect geometry for cracking the mystery," says co-author Angelos Vourlidas of the Naval Research Laboratory in Washington, D.C.

The physical reality of the waves has been further confirmed by movies of the waves crashing into things. "We've seen the waves reflected by sunspots," says Vourlidas. "And there is a wonderful movie of a solar prominence oscillating after it gets hit by a wave. We call it the 'dancing prominence.'"

Solar tsunamis pose no direct threat to Earth, but they are important to study. "We can use them to diagnose conditions on the sun," notes Gurman. "By watching how the waves propagate and bounce off things, we can gather information about the sun's lower atmosphere available in no other way."

"Tsunami waves can also improve our forecasting of space weather," adds Vourlidas. "Like a bull's-eye, they 'mark the spot' where an eruption takes place. Pinpointing the blast site can help us anticipate when a CME or radiation storm will reach Earth."

And they're pretty entertaining, too. "The movies," he says, "are out of this world."

Related Links:

Do Solar Tsunamis Exist? (detailed descriptions of related images)

Astrophysical Journal Letters: "'Extreme Ultraviolet Waves' are Waves: First Quadrature Observations of an Extreme Ultraviolet Wave from STEREO"

Solar Terrestrial Relations Observatory (STEREO)

Solar and Heliospheric Observatory (SOHO)

NSO Telescope Spots Solar Tsunami

"Solar Tsunami" Reflection from a Coronal Hole
Title: NASA Nazi Gaia worshippers laying cover story for Gaia induced false flag
Post by: Dig on August 04, 2010, 02:30:30 am

The 9 month incubation period has produced the perfect NUKE FALSE FLAG COVER STORY FOR THE GAIA PSYCHOS...

Solar Tsunami to Strike Earth
Published August 03, 2010

NASA's Solar Dynamics Observatory snapped this X-ray photo of the Sun early in the morning of Sunday, August 1st. The dark arc near the top right edge of the image is a filament of plasma blasting off the surface -- part of the coronal mass ejection. The bright region is an unassociated solar flare.

Earth is bracing for a cosmic tsunami Tuesday night as tons of plasma from a massive solar flare head directly toward the planet.

The Sun's surface erupted early Sunday morning, shooting a wall of ionized atoms directly at Earth, scientists say. It is expected to create a geomagnetic storm and a spectacular light show -- and it could pose a threat to satellites in orbit, as well.

"This eruption is directed right at us and is expected to get here early in the day on Aug. 4," said Leon Golub of the Harvard-Smithsonian Center for Astrophysics. "It's the first major Earth-directed eruption in quite some time."

Way to go Nazi NASA Gaia psychopaths. You could not make up such bullshit about the amazing coincidence of a directed plasma ray to target Earth if you were Timothy Leary.

The only plasma coming here is caused by these guys...


Title: Re: NASA Nazi scientists laying cover story for ionic cannon test today (?)
Post by: Dig on August 04, 2010, 02:33:54 am
Particle beam

A particle beam is an accelerated stream of charged particles or neutrons (often moving at very near the speed of light) which may be directed by magnets and focused by electrostatic lenses, although they may also be self-focusing (see Pinch).

Subatomic particles such as electrons, positrons, and protons can be accelerated to high velocities and energies, usually expressed in terms of center-of-mass energy, by machines that impart energy to the particles in small stages or nudges, ultimately achieving very high energy particle beams, measured in terms of billions and even trillions of electron volts. Thus, in terms of their scale, particles can be made to perform as powerful missiles for bombarding other particles in a target substance or for colliding with each other as they assume intersecting orbits.

High energy beams are created in particle accelerators, in which a charged particle is drawn forward by an electrostatic (not magnetic) field with a charge opposite to the particle (like charges repel one another, opposites attract); as the particle passes the source of each field, the charge of the field is reversed so that the particle is now pushed on to another field source. Through a series of fields in sequence, the particle accelerates until it is moving at a high speed. A natural analogy to particle beams is lightning, where electrons flow from negatively charged clouds to positively charged clouds or the earth.

Low and medium energy beams are quite common. Traditional cathode-ray tube televisions and computer displays use them to scan out each image, and some radiation therapy methods use them to treat cancer.

Particle beams as weapons

Though particle beams are perhaps most famously employed as weapon systems in science fiction, the U.S. Advanced Research Projects Agency started work on particle beam weapons as early as 1958[1], two years before the first scientific demonstration of lasers. The general idea of particle-beam weaponry is to hit a target with a stream of accelerated particles moving at near the speed of light and therefore carrying tremendous kinetic energy. On striking, the particles transfer that kinetic energy to the atoms of the target, much as a cue ball transfers its energy to the racked balls in billiards, exciting the target's atoms and superheating the target object in a short time, leading to explosion of either the surface layer or the interior of the target.

Currently, the materials for such weapons are "high-risk" and may not be developed for some time[1]. The power needed to project a high-powered beam of this kind surpasses the production capabilities of any standard battlefield powerplant, so such weapons are not anticipated in the foreseeable future. Particle beams could possibly be used from fixed locations, or in space, for example as part of the Strategic Defense Initiative (dubbed "Star Wars") or similar initiatives, but the problems related to the power source still stand, pending future developments in that field.

[1] Roberds, Richard M. (1984). "Introducing the Particle-Beam Weapon". Air University Review, July-August 1984.
Title: Re: NASA Nazi scientists laying cover story for ionic cannon test today (?)
Post by: Dig on August 04, 2010, 02:35:07 am
Introducing the Particle-Beam Weapon
Dr. Richard M. Roberds

It is not that the generals and admirals are incompetent, but that the task has passed beyond their competence. Their limitations are due not to a congenital stupidity--as a disillusioned public is so apt to assume--but to the growth of science.

Captain B. H. Liddell Hart, speaking on weapon-development decisions, 1935

Considerable debate has been stirred by President Reagan's recent suggestion that the United States embark on a program that would use advanced-technology weaponry to produce an effective defense against Soviet ICBMs. On the one hand, critics argue that the idea of a defensive system that would neutralize the ICBM threat is naive and, at best, would require large expenditures in the development of a very "high-risk" technology. Furthermore, they suggest, even if such a system could be developed, it would be too costly and would also be vulnerable to simple and cheap countermeasures. On the other hand, others argue that we must continue to explore such high-technology options until they have been either proved scientifically unachievable or developed into effective systems. If it were possible to build and effectively deploy such weapons, the payoff in terms of national security would be tremendous. And certainly, if this weaponry is achievable, it must be the United States, not the Soviet Union, that first develops it.

The advanced technology that has raised the possibility of defeating an ICBM attack is referred to collectively as directed-energy weapons, which gain their unprecedented lethality from several fundamental characteristics. Among their more important features are their ability to fire their "bullets" at or near the speed of light (186,000 miles a second), which would effectively freeze even high-speed targets in their motion; their ability to redirect their fire toward multiple targets very rapidly; their very long range (thousands of kilometers in space); and their ability to transmit lethal doses of energy in seconds or even a fraction of a second. No conventional ammunition is required; only fuel for the power generator is needed.

There are three principal forms of directed-energy weapons: the directed microwave-energy weapon, the high-energy laser, and the particle-beam weapon. Only the last two types have received substantial government support.

Much has been written on the high-energy laser (HEL), and this category of directed energy weapon appears to be well understood by members of the defense community. Laser weapons have been under active development for twenty years and easily constitute the most advanced of the directed-energy devices.

In contrast, the particle-beam weapon (PBW) has been the "sleeper" among directed-energy weapons until very recently. Enshrouded in secrecy, it began as a project sponsored by the Advanced Research Projects Agency (now the Defense Advanced Research Projects Agency, better known as DARPA) as early as 1958, two years before the first scientific laser demonstration in 1960. Code-named Seesaw, the project was designed to study the possible use of particle beams for ballistic missile defense. Today, while its development lags that of the high-energy laser, the particle-beam weapon is viewed by some military technicians as the follow-on weapon to the laser, because of its higher potential lethality.

The successful development of a particle-beam weapon would require significant technology gains across several difficult areas. But even though the technical understanding to support the full-scale development of a PBW will not be available for several years, the technology issues that pace its development are not difficult to understand. The purpose of this article is to provide a basis for understanding the fundamental technology connected with particle-beam weapons, with the hope of assisting DOD leaders and other members of the defense community in making sound decisions about the development and possible deployment of PBWs in the days ahead.
What Is a Particle-Beam Weapon?

The characteristic that distinguishes the particle-beam weapon from other directed energy weapons is the form of energy it propagates. While there are several operating concepts for particle-beam weapons, all such devices generate their destructive power by accelerating sufficient quantities of subatomic particles or atoms to velocities near the speed of light and focusing these particles into a very high-energy beam. The total energy within the beam is the aggregate energy of the rapidly moving particles, each particle having kinetic energy due to its own mass and motion.

Currently, the particles being used to form the beam are electrons, protons, or hydrogen atoms. Each of these particles can be illustrated through a schematic of the hydrogen atom, the smallest and simplest of all atoms. (See Figure 1.) The nucleus of the hydrogen atom is a proton, which weighs some 2000 times as much as the electron that orbits the single-proton nucleus. Each proton has an electric charge of a positive one, while each electron carries a charge of a negative one. In the case of hydrogen, the single electron and proton combine to form a neutrally charged atom.
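The "some 2000 times" mass ratio mentioned above is easy to verify (the CODATA constants below are hardcoded for illustration; they are not given in the article):

```python
# Verifying the proton-to-electron mass ratio ("some 2000 times").
# CODATA mass values are hardcoded here; they are not from the article.
M_PROTON_KG = 1.67262192e-27
M_ELECTRON_KG = 9.1093837e-31

ratio = M_PROTON_KG / M_ELECTRON_KG
print(round(ratio))  # ~1836, roughly the "2000 times" quoted in the text
```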

The particle beam itself is analogous to a natural phenomenon with which we are all familiar--the lightning bolt. The analogy is so close that particle-beam pulses are referred to as "bolts." The particles in a lightning bolt are electrons (an electric current) flowing from a negatively charged cloud to a positively charged cloud or section of the earth. While the electric field in lightning that accelerates the electrons is typically 500,000 volts per meter, these electron velocities are still less than those desired in a particle-beam weapon. But the number of electrons (electric current) in the lightning bolt is nominally much greater. In any case, the phenomenon and its destructive results are very much the same.

Neither the proton nor the electron shows any conclusive advantage over the other as the appropriate "ammunition" of a PBW. The determining factor of whether to use electrons or protons so far has been simply the specific particle accelerator concept planned for use in a beam weapon. Some accelerating schemes call for the acceleration of electrons, while others use protons.

The use of a hydrogen-atom beam, however, is not based on the choice of a particular acceleration scheme. Because it is neutrally charged, the hydrogen atom has been selected specifically as the likely particle to be used in the initial space weapon. Neutral atoms would not be susceptible to bending by the earth's magnetic field as would a charged-particle beam. Neither would the beam tend to spread due to the mutually repulsive force between particles of like-charge in the beam. (In the atmosphere, a charged-particle beam will neutralize itself by colliding with air molecules, effectively creating enough ions of the opposite charge to neutralize the beam.)
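The advantage of a neutral beam can be made concrete with a rough gyroradius estimate, r = p/(qB), for a charged particle in the Earth's magnetic field. The beam energy and field strength below are illustrative assumptions, not figures from the article:

```python
import math

# Why a charged beam is a problem in space: the gyroradius r = p/(qB)
# of a proton in the geomagnetic field. All numbers are illustrative.
MC2_PROTON_MEV = 938.272      # proton rest energy in MeV
E_CHARGE = 1.602176634e-19    # elementary charge, C
C_LIGHT = 2.99792458e8        # speed of light, m/s

T_mev = 100.0                 # hypothetical 100-MeV proton beam
B_tesla = 3e-5                # typical near-Earth field strength

# Relativistic momentum: (pc)^2 = T^2 + 2*T*mc^2, in MeV
pc_mev = math.sqrt(T_mev**2 + 2 * T_mev * MC2_PROTON_MEV)
p_si = pc_mev * 1e6 * E_CHARGE / C_LIGHT   # convert to kg*m/s
r_km = p_si / (E_CHARGE * B_tesla) / 1000.0
print(round(r_km))  # a gyroradius of only ~49 km: severe bending over long ranges
```

Over engagement ranges of thousands of kilometers, a circular path of that radius would make a charged beam useless, which is the text's rationale for the neutral hydrogen atom.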

The mechanism by which a particle beam destroys a target is a depositing of beam energy into the material of the target, which might be any material object. As the particles of the beam collide with the atoms, protons, and electrons of the material composing the target, the energy of the particles in the beam is passed on to the atoms of the target much like a cue ball breaks apart a racked group of billiard balls. The result is that the target is heated rapidly to very high temperatures--which is exactly the effect that one observes in an explosion. Thus, a particle beam of sufficient energy can destroy a target by exploding it (although that is not the only means of destruction).

In describing a particle beam, it is conventional to speak of the energy of the beam (in electron-volts), the beam current (in amperes), and the power of the beam (in watts). (See Figure 2.) The specific meaning of these terms as they pertain to a particle beam is derived from the close analogy between a particle beam and an electric current.

The electron-volt is a unit of measure for energy. It is the kinetic energy of an electron that has been accelerated by one volt of electric potential. Nominally, all the particles in a beam will have been accelerated to the same velocity, or energy, so it is possible to characterize the energy of a particle beam in terms of the energy of a typical particle of the beam, usually millions of electron-volts (MeV). Hence, a 20-MeV particle beam would be a beam of particles, each with a nominal energy of 20 million electron-volts.

A measure of the number of particles in the beam (beam intensity) may be made from the magnitude of the electric current (amperes) in the beam. To be able to assign a current to the beam, it is necessary to assume that each particle has an amount of electric charge equivalent to an electron (even if it is a neutral atom). This assumption enables an electric current to be ascribed to the particle beam, and an indication of the number of particles in the beam is inferred by the current magnitude expressed in amperes.

The power of a particle beam is the rate at which it transports its energy, which is also an indication of the rate at which it can deposit energy into a target. Again, the analogy with an electric circuit serves us well. The power developed in an electric circuit is the mathematical product of the voltage (E) and the current (I); its unit of measure is the watt. Since the unit of energy for a particle in a beam is the electron-volt (E), and the beam has an electric current (I) ascribed to it, the power of the particle beam in watts is simply the energy in electron-volts multiplied by the beam current in amperes.
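The circuit analogy above reduces to simple arithmetic. The 20-MeV, 1-ampere beam below is a hypothetical example, not a figure from the article:

```python
E_CHARGE = 1.602176634e-19  # coulombs per elementary charge

def beam_power_watts(energy_ev, current_a):
    # Power (W) = particle energy (eV) x beam current (A),
    # per the electric-circuit analogy described in the text.
    return energy_ev * current_a

def particles_per_second(current_a):
    # One elementary charge is ascribed to each particle, as the text assumes.
    return current_a / E_CHARGE

# Hypothetical 20-MeV, 1-A beam:
print(beam_power_watts(20e6, 1.0))          # 20 MW of beam power
print(f"{particles_per_second(1.0):.2e}")   # ~6.24e18 particles per second
```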
Types of Particle-Beam Weapons

There are two broad types of particle-beam weapons: the charged-particle beam weapon and the neutral-particle beam weapon. The charged-particle variety would be developed for use within the atmosphere (endoatmospheric) and has a set of technological characteristics that are entirely different from the neutral particle beam weapon that would be used in space (exoatmospheric). Primarily, the extremely high power and precisely defined beam characteristics required for a particle beam to propagate through the atmosphere distinguish an endoatmospheric device from a beam weapon designed to operate in space. The development of a power supply and particle accelerator with sufficient power and appropriately shaped pulses for endoatmospheric weapons depends on very "high-risk" technology and is likely years away.1

The technological problems associated with exoatmospheric weapons are considerable also, but they are not as difficult as those associated with endoatmospheric weapons. Here, the greatest challenge is in the area of directing the beam: the weapon must be able to focus its energy to strike a target that may be thousands of kilometers away. There are two aspects to this challenge. First, the weapon must create a high-intensity, neutral beam with negligible divergence as it leaves the accelerator. Second, the weapon must have a system for aiming its beam at the target. This system must be able to detect pointing errors in a beam (which is itself very difficult to detect because of its lack of an electric charge) and, when necessary, redirect a missed "shot" toward the target.

Because of these two different sets of demands, the endo- and exoatmospheric devices represent two different types of weapon systems in appearance and operation. Nevertheless, there are certain fundamental areas of development that are common to both types of PBWs.
Development Areas for PBWs

The realization of an effective particle-beam weapon depends upon technology developments in five areas. Three of these concern hardware developments, while two others are related to advances in the understanding of beam weapon phenomena. (See Figure 3.)


One of the phenomenological aspects under study is lethality. Lethality refers to the general effectiveness of a weapon in engaging and destroying a target. There is no doubt that a particle beam is capable of destroying a military target. However, knowledge is needed of the precise effect that a particle beam would have when it impinges upon targets of various types, composed of different materials and components. The problem is made more difficult by the fact that the particle beam can vary according to particle type, particle energy, and beam power. To gain such an understanding, beam/target interaction is the subject of continuing technological investigations and studies.

In assessing the unique value of a particle beam as a potential weapon system, it is important to consider six characteristics that would give the beam weapon a high degree of lethality.

Beam velocity. The particles "fired" by a PBW will travel at nearly the speed of light (186,000 miles per second). The advantage of such a high-velocity beam is that computing the aim point for a moving target is greatly simplified. The effect of this extremely high velocity is essentially to fix a target, even if the target attempts evasive action. For example, if the weapon were required to shoot at a reentry vehicle (RV) some 50 kilometers distant and traveling at the high speed of 20,000 feet per second, the RV would travel only about 5 feet from the time the weapon fired until it was struck by the beam. It is this aspect of PBWs that makes feasible the task of "shooting a bullet with a bullet," as the ABM targeting problem is sometimes characterized.
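The arithmetic behind this engagement example can be sketched as follows; with the numbers given it comes out to roughly three feet, the same order as the figure quoted:

```python
C_MILES_PER_S = 186000.0   # speed of light, as quoted in the text
FEET_PER_MILE = 5280.0
FEET_PER_KM = 3280.84

range_km = 50.0            # engagement range from the example
rv_speed_fps = 20000.0     # reentry-vehicle speed from the example

# Time for the beam to cross the range, then distance the RV covers in it.
beam_flight_s = (range_km * FEET_PER_KM / FEET_PER_MILE) / C_MILES_PER_S
rv_travel_ft = rv_speed_fps * beam_flight_s
print(round(rv_travel_ft, 1))  # a few feet of target motion during beam flight
```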

Beam dwell time. Beam dwell time refers to the time that a beam remains fixed on a target. In an endoatmospheric weapon, the power of the beam would be sufficient to destroy the target instantaneously (in millionths of a second) upon impact, and no beam dwell time would be required. In space, where the required power of the beam is considerably less, some very short beam dwell time may be necessary.2

Rapid-aim capability. The particle beam may be redirected very rapidly from one target to another by means of a magnetic field. This field would itself be generated by an electric current. Varying the current would change the magnetic field intensity, which would deflect the charged particles in the desired direction. Within certain limits, no physical motion of the weapon would be required as it engages enemy targets. This capability to very rapidly aim and redirect the beam would enhance significantly the weapon's capability to engage multiple targets.

Beam penetration. The subatomic particles that constitute a beam have great penetrating power. Thus, interaction with the target is not restricted to surface effects, as it is with a laser. When impinging upon a target, a laser creates a blow-off of target material that tends to enshroud the target and shield it from the laser beam. Such beam/target interaction problems would not exist for the particle beam with its penetrating nature. Particle beams would be quite effective in damaging internal components or might even explode a target by transferring a massive amount of energy into it (the catastrophic kill mechanism). Furthermore, there would be no realistic means of defending a target against the beam; target hardening through shielding or materials selection would be impractical or ineffective.

Ancillary kill mechanisms. In addition to the direct kill mechanism of the beam, ancillary kill mechanisms would be available. Within the atmosphere, a secondary cone of radiation, symmetrical about the beam, would be created by the beam particles as they collide with the atoms of the air. This cone would be composed of practically every type of ionizing radiation known (i.e., x-rays, neutrons, alpha and beta particles, and so on). A tertiary effect from the beam would be the generation of an electromagnetic pulse (EMP) by the electric current pulse of the beam. This EMP would be very disruptive to any electronic components of a target. Thus, even if the main beam missed, the radiation cone and accompanying EMP could kill a target. While the EMP and the radiation cone would not be present in an exoatmospheric use of the weapon, there are other possible options in space that are not available in the atmosphere. Many intriguing possibilities come to mind. For example, using lower levels of beam power, the particle beam could expose photographic film in any satellite carrying photographic equipment, or it could damage sensitive electronic components in a satellite.

All-weather capability. Another advantage of a particle beam over the high-energy laser in an endoatmospheric application would be an all-weather capability. While a laser can be thwarted completely by such weather effects as clouds, fog, and rain, these atmospheric phenomena would have little effect on the penetrating power of a particle-beam weapon.

propagation of the beam

The successful development of a PBW depends on the ability of the beam to propagate directly and accurately to the target. As we ponder its similarity to lightning, we might consider the jagged, irregular path of a lightning bolt as it darts unpredictably through the sky. Such indeterminacy would never do for the particle beam of a weapon, which must have an extremely precise path of propagation as it traverses the kilometers to the enemy vehicle. This aspect, in fact, may be the Achilles' heel of the endoatmospheric weapon. However, the space weapon, which at this time is envisaged to be a neutral stream of hydrogen atoms, would not suffer from the beam instability problems that may possibly plague a beam of charged particles traveling through the air.

Another problem of propagation is possible beam spreading. An increase in beam diameter would result in a decrease of the energy density (intensity) of the beam as it travels toward the target. Over short ranges, a slight beam divergence can be tolerated, but the very long ranges that would be required of the space weapon place a tremendous restriction on the amount of beam divergence that is acceptable.

Use of a neutral beam in space would ensure that the beam would not spread due to mutual repulsion of the beam particles. Divergence would come strictly from that imparted by the accelerator. In the atmosphere, however, even if the beam particles were neutral, air molecules would strip the surrounding electrons quickly from the beam's neutral atoms, turning the beam into a charged-particle beam. The charged particles within the beam would then tend to repel one another, producing undesirable beam divergence. But as the beam propagates through the air, it would also strip electrons from the surrounding air molecules, creating a region of charged particles (ions) intermingling with the beam. The result of this phenomenon is to neutralize the overall charge of the beam, thereby reducing the undesired effect of mutual repulsion among the charged particles in the beam that is a cause of beam spreading. Another force that tends to prevent beam spreading is a surrounding magnetic field, created by the current of the charged particle beam. This field wraps itself around the beam and produces a conduit that inhibits beam divergence. (See Figure 4.)

The propagation of a charged-particle beam through the atmosphere is, in fact, the pacing issue for the endoatmospheric weapon. It has been theoretically calculated that specific threshold values of the beam parameters (beam current, particle energy, beam pulse length, etc.) are required for a beam to propagate through air with reliability. While the values of these parameters are classified, no particle-beam accelerator is currently capable of creating a beam with the required parameters.

Two crucially important experimental programs are exploring the phenomena of atmospheric beam propagation. The first program, underway at the Lawrence Livermore National Laboratory, involves experiments with an accelerator called the Advanced Test Accelerator (ATA), the construction of which was completed in the fall of 1982. The second program, a joint Air Force/Sandia National Laboratories program, similarly is aimed at investigating beam propagation through the use of a radial-pulse-line accelerator (RADLAC). Continuation of the U.S. program to explore the development of an endoatmospheric weapon will depend on a positive prognosis from these two experimental studies of atmospheric beam propagation.

fire-control/pointing-and-tracking technology

The fire-control/pointing-and-tracking system of a PBW must acquire and track the target, point the weapon at the target, fire the beam at the proper time, and assess target damage. If the beam misses the target, the system must sense the error, repoint the weapon, and fire again. Much of the technology for this part of the weapon is not unique to a PBW, and its development has benefited considerably from the HEL weapon program, which has involved study of this problem for several years. Moreover, recent advances in radar technology and electro-optics, combined with projected developments in next-generation computers, portend a heretofore unimagined capability in this area of technology.

This is not to say that serious development problems do not remain in the area of the fire-control system. Many of the pointing and tracking problems will be entirely unique to a particle-beam weapon and cannot be solved by a transfer of technology from the laser program. Nevertheless, none of these problems are such that they will demand exploration of basic issues in physics and the advancement of the state of the art, as will some other aspects of the beam weapon's development.

accelerator technology

The accelerator is the part of the weapon system that creates the high-energy particle beam. It is composed of a source of ions (electrons, protons, or charged atoms), a device for injecting the particles into the accelerating section, and the accelerating section itself. The accelerating section of all conventional linear accelerators is made up of a series of segments (modules) that sequentially apply an accelerating electric field to the charged particles. While the voltage in each segment may be relatively low, the repeated application of an accelerating voltage by the large number of modules ultimately produces very high particle energies.
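The staged-acceleration principle described above is simple to sketch; the per-module voltage and module count below are invented for illustration and are not from the article:

```python
# Many modest accelerating kicks applied in sequence add up to a very
# high final energy. Both numbers here are illustrative assumptions.
VOLTS_PER_MODULE = 2.5e5   # hypothetical 250 kV per accelerating segment
N_MODULES = 80_000

# A singly charged particle gains VOLTS_PER_MODULE electron-volts per segment.
final_energy_ev = VOLTS_PER_MODULE * N_MODULES
print(f"{final_energy_ev / 1e9:.0f} GeV")  # 20 GeV from small repeated pushes
```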

The first subatomic particle accelerators were constructed in the 1930s for scientific investigations in the field of elementary-particle physics. The accelerators used for the first-generation PBW system will be embellished variations of present-day linear accelerators (linacs), such as the two-mile-long Stanford Linear Accelerator Center (SLAC), which is a state-of-the-art device capable of producing electrons with an energy of 30 GeV (30 billion electron-volts).

The SLAC represents a class of accelerators known as radio frequency (rf) linear accelerators. The great majority of linacs in operation today are rf linacs. Although such devices can accelerate particles to energies high enough for use as a weapon, they are limited severely in their current-carrying capability and would not be candidates for the endoatmospheric weapon system, since beam power is a product of current and voltage.

The space weapon, however, does not call for the tremendously high beam power required for the endoatmospheric weapon. Its accelerator could be based on the design of a state-of-the-art rf linac.3 The major demand for a space weapon is to create a high-intensity (high "brightness") beam of neutral atoms with very precise collimation as it exits the accelerator. It is in this area of divergence that the greatest technical problems exist. If the beam were to diverge from a pencil point to only the diameter of a penny after twelve miles of travel, this would represent a divergence of one part in a million (one meter for each 1000 kilometers traveled). A divergence much greater than this would not be acceptable for a space weapon that is to have a range of thousands of kilometers.
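The one-part-in-a-million figure can be sanity-checked directly. A quick sketch, assuming a penny diameter of roughly 19 mm (my assumption; the article gives no exact value):

```python
# Rough check of the one-part-in-a-million divergence figure.
# PENNY_DIAMETER_M is an assumed value; the article does not specify it.

PENNY_DIAMETER_M = 0.019   # ~19 mm
METERS_PER_MILE = 1609.34

def divergence(spread_m: float, range_m: float) -> float:
    """Small-angle beam divergence: transverse spread per unit distance."""
    return spread_m / range_m

d = divergence(PENNY_DIAMETER_M, 12 * METERS_PER_MILE)
print(f"{d:.1e}")  # about 1e-06, i.e. one part in a million
```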

A second type of linear accelerator is called the induction linac. The world's first induction linac, the Astron I accelerator, was built at the Lawrence Livermore Laboratory in 1963. It was designed to produce high electron-beam currents that could be used in a magnetic-confinement scheme for controlled thermonuclear fusion. The Advanced Test Accelerator (ATA) is an induction linac that grew out of this early accelerator technology. The ATA is designed to generate a 50-MeV beam with 10,000 amperes of current in pulses of 50-nanosecond (50 billionths of a second) duration.4
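Taking the quoted ATA parameters at face value, the implied peak beam power and per-pulse energy follow from simple multiplication; for singly charged particles, beam power in watts is particle energy in electron-volts times beam current in amperes. My own back-of-envelope check, not a calculation from the article:

```python
# Peak beam power and pulse energy from the quoted ATA parameters.
# For singly charged particles, power (W) = energy (eV) x current (A).

def peak_power_w(energy_ev: float, current_a: float) -> float:
    return energy_ev * current_a

def pulse_energy_j(power_w: float, duration_s: float) -> float:
    return power_w * duration_s

p = peak_power_w(50e6, 10_000)  # 5e11 W, i.e. 500 GW of peak beam power
e = pulse_energy_j(p, 50e-9)    # ~25,000 J delivered in each 50-ns pulse
print(p, e)
```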

The fundamental principle of operation (successively applying a high voltage across a series of accelerating segments) is the same for both the rf and induction linacs. However, the mechanism for generating the electric voltage within the segments of the two types is quite different. Compared to the rf linac, the induction linac does not impart as much instability to the beam when a modest current limit is exceeded. Therefore, of the two types of accelerators, the induction linac is the more likely candidate for an endoatmospheric beam weapon (which will require very high beam currents).

In examining the Air Force charged-particle-beam technology program, we find that its main thrust is the exploration of nonconventional acceleration techniques (neither rf nor induction linacs), with two main purposes in mind. The first is to develop a means of producing a particle beam with parameters closely resembling those required for successful propagation through the atmosphere, so that beam propagation can be studied in depth and propagation theory refined. To date, the RADLAC I accelerator developed under this program has produced a 10-MeV beam of electrons with a 30,000-ampere current.5 A more powerful RADLAC II is under construction.

The second purpose is to develop an accelerator with higher accelerating fields that would permit the building of a shorter device. The nominal accelerating gradient in conventional accelerators is about 2 to 10 MeV per meter of accelerator length. Thus, to produce a 1-GeV beam, a linear accelerator would need to be 100 to 500 meters in length--far too long and cumbersome, particularly if the device were to be carried aboard an aircraft. The Air Force hopes eventually to build a device that will generate a very powerful particle beam with an accelerator of more reasonable length.
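The length estimate is just final beam energy divided by accelerating gradient; the 100- and 500-meter figures correspond to gradients of 10 and 2 MeV per meter respectively. A minimal sketch of that arithmetic:

```python
# Accelerator length implied by a given accelerating gradient.

def linac_length_m(final_energy_mev: float, gradient_mev_per_m: float) -> float:
    return final_energy_mev / gradient_mev_per_m

# A 1-GeV (1000-MeV) beam at the high and low ends of the nominal gradient:
print(linac_length_m(1000, 10))  # -> 100.0 (meters)
print(linac_length_m(1000, 2))   # -> 500.0 (meters)
```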

power supply technology

Possibly the most difficult technical problem in developing an atmospheric particle-beam weapon is the development of its electrical power supply. To operate an endoatmospheric PBW requires that a tremendous amount of electrical energy be supplied over very short periods of time. Since power is energy divided by time, large amounts of energy over short spans of time translate into extremely high power levels. Building a power supply to produce high power in short bursts involves a very advanced field of technology known as pulsed-power technology.

Basically, a pulsed-power device can be divided into three component areas: the primary power source that provides electrical energy over the full operating time of the weapon (prime power source), the intermediate storage of the electrical energy as it is generated (energy storage), and the "conditioning" of the electrical power into bursts or pulses of suitable intensity and duration (pulse-forming network) to fire the weapon. Each of these three areas represents a technological challenge.

Any electricity-producing device, such as a battery or generator, is a primary power source. The requirement of the particle-beam weapon, however, is for a prime power source that can produce millions to billions of watts of electrical power, yet be as lightweight and compact as possible. A conventional power station could provide the needed power levels, but it would be neither small nor lightweight. There is also a need for mobility in many of the envisaged applications; a power station would not meet this requirement. Some typical prime-power candidates are advanced-technology batteries, turbine-powered generators, or an advanced magnetohydrodynamic (MHD) generator using superconducting circuitry. Whatever the primary source might be, a sizable advance in the present power-generating state of the art will be required, particularly for the endoatmospheric weapon.

Once electrical energy is generated for the weapon, it will likely have to be stored in some fashion. A typical storage method involves charging a series of large capacitors (often called a capacitor bank). Other more exotic methods are possible, e.g., spinning a huge mechanical flywheel or simply storing the energy in the form of a high-energy explosive that is released in a contained explosion. Actually, there are numerous schemes for storing and releasing the required energy; their advantages and disadvantages depend on their particular application (i.e., the type of accelerator that is used and whether the weapon is endo- or exoatmospheric).
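For the capacitor-bank option, the stored energy is easy to estimate: each capacitor holds E = (1/2)CV², and a bank of N identical units holds N times that. The numbers below are purely illustrative assumptions of mine, not figures from the article:

```python
# Sizing sketch for a capacitor bank. All numbers are illustrative:
# each capacitor stores E = (1/2) * C * V^2, and a bank of N identical
# units stores N times that.

def bank_energy_j(n_caps: int, capacitance_f: float, voltage_v: float) -> float:
    return n_caps * 0.5 * capacitance_f * voltage_v ** 2

# 100 capacitors of 50 microfarads each, charged to 100 kV, hold 25 MJ.
print(bank_energy_j(100, 50e-6, 100e3))
```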

The pulse-forming network would be designed to release the stored energy in the desired form. In the atmospheric weapon, a single shot or "bolt" would most likely consist of a very short-duration pulse repeated thousands of times per second. Ideally, the prime power source would be able to generate energy at least as fast as it was dispatched. If not, the weapon would have to remain quiescent while its generator rebuilt a charge for another series of bolts.
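The quiescent-recharge condition above amounts to comparing the prime source's power against the average power demand, which is pulse energy times repetition rate. A sketch under assumed, illustrative numbers (25-kJ pulses at 1 kHz):

```python
# Average power demand implied by repetitive pulsing. Illustrative
# assumptions only: 25-kJ pulses fired 1,000 times per second.

def average_power_w(pulse_energy_j: float, reps_per_second: float) -> float:
    return pulse_energy_j * reps_per_second

def must_recharge(prime_power_w: float, pulse_energy_j: float,
                  reps_per_second: float) -> bool:
    """True if the prime source cannot keep up and the weapon must pause."""
    return prime_power_w < average_power_w(pulse_energy_j, reps_per_second)

avg = average_power_w(25_000, 1_000)            # 25 MW sustained demand
print(avg, must_recharge(10e6, 25_000, 1_000))  # a 10-MW source falls short
```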

THE development of a particle-beam weapon by the United States is a logical follow-on to the current high-energy laser development program. The weapon's potential lethality against high-speed, multiple targets, coupled with its capacity for selective destruction, would make the PBW particularly suitable for the space defense role. While some of the technological and operational issues to be resolved appear formidable at this time, it is far too early to discount the eventual operational effectiveness of such a weapon. Several scientists have argued that the PBW cannot be built or effectively deployed, sowing doubts in others. Yet those so concerned might do well to recall that in 1949, Vannevar Bush--a highly respected national leader with a Ph.D. in electrical engineering who had served as head of the U.S. Office of Scientific Research and Development during World War II--argued that technical problems made the development of an effective ICBM virtually impossible without astronomical costs.6 Nine years later, in 1958, the United States had its first operational ICBM, the Atlas.

The PBW offers a possibility for defending effectively against a launched ICBM, and even a glimmer of hope toward this end is worthy of pursuit. Should the United States terminate its exploration of particle-beam technology, we would be opening the door for the Soviets to proceed at their own pace toward building such a weapon. We can ill afford technological surprise in an area as crucial as beam weapons.

The current pace of the U.S. program in PBW development is both logical and orderly. Funding levels remain relatively low, as DARPA and the three services continue to focus on the pacing technologies that must be understood if such a weapon is to be built. Since the potential payoff of such activity is tremendous, it seems imperative that the United States continue to pursue the development of PBWs at least at the present level of funding.

Department of Engineering Technology
Clemson University, South Carolina


1. The major technological problems of the endoatmospheric weapon are twofold: to understand and demonstrate the propagation of the particle beam through the air and to create an electrical pulsed-power source capable of generating billions of watts of power in extremely short, repetitive pulses.

2. For a different reason, all high-energy lasers (with the exception of the envisioned x-ray laser) require beam dwell time also. A laser needs such time to burn through the surface of the target.

3. The question of how a beam of neutral atoms might be accelerated in a conventional rf linac may arise in the mind of the perceptive reader. A present approach is to attach an extra electron to a hydrogen atom, accelerate the charged atom in conventional fashion, and then strip off the extra electron by passing the beam through a tenuous gas as it exits the accelerator. This stripping causes the beam to spread slightly and must be controlled if the divergence specifications of a space weapon are to be met.

4. B. M. Schwarzschild, "ATA: 10-kA Pulses of 50 MeV Electrons," Physics Today, February 1982, p. 20.

5. Private communication, Lieutenant Colonel James H. Head, High-Energy Physics Technology Program Manager, Air Force Weapons Laboratory, 6 February 1984.

6. Vannevar Bush, Modern Arms and Free Men: A Discussion of the Role of Science in Preserving Democracy (New York, 1949), pp. 84-87.


Richard M. Roberds (B.A., M.S., University of Kansas; Ph.D., Air Force Institute of Technology) is Associate Professor and Head of the Engineering Technology Department at Clemson University. He is a retired Air Force colonel and was the first technical program manager of the Air Force particle-beam technology program, serving in that capacity from September 1975 until July 1977 at the Air Force Weapons Laboratory, Kirtland AFB, New Mexico. Colonel Roberds is a Distinguished Graduate of Air Command and Staff College and a graduate of the Industrial College of the Armed Forces.


The conclusions and opinions expressed in this document are those of the author, cultivated in the freedom-of-expression academic environment of Air University. They do not reflect the official position of the U.S. Government, the Department of Defense, the United States Air Force, or Air University.
Title: Re: NASA Nazi scientists laying cover story for ionic cannon test today (?)
Post by: Dig on August 04, 2010, 02:38:26 am
Haarp Top Secret Plasma Weapon
Title: Re: Solar Tsunami is cover story for Haarp Top Secret Plasma Weapon
Post by: Dig on August 04, 2010, 02:46:04 am
These guys at DARPA do not seem to be psychotic insane control freaks hell bent on destroying humanity itself.

Naaaah, never...

DARPA Project List (Defense Advanced Research Projects Agency)

DARPA (Defense Advanced Research Projects Agency) was established in 1958 in response to the Soviet launch of Sputnik. DARPA reports directly to the Secretary of Defense; however, it operates independently of the rest of military research and development. Its basic principles are:

Small and flexible, with a flat organization structure

Autonomous organization

World-class scientists and engineers work with representatives from industry, universities and government labs

Project-based style; technical staff rotated every 3-5 years

Program managers are selected to be technically outstanding and entrepreneurial.

Here's a listing of the DARPA-related projects presented on the Technovelgy site:

Silent Talk 'Telepathy' For Soldiers
'...allow user-to-user communication on the battlefield without the use of vocalized speech through analysis of neural signals.'

TASC - DARPA's Psychohistory
The agency is seeking whitepapers to fuel the development of a scientific approach to predicting the actions of large masses of people.

Guided Bullets By Exacto From DARPA
How is it possible that a bullet could redirect its own course in mid-flight?

DARPA Seeks Self-Repairing Hunter-Killers?
Tests to date have seen small aerial robots lose large chunks of themselves to hostile fire, yet carry on with their mission.

DARPA Gandalf Project And Philip K. Dick
A new defense department project to locate enemies precisely, and target them, by phone.

EATR - DARPA's Energetically Autonomous Tactical Robot
Project to develop a robotic platform able to perform long missions while refueling itself by foraging.

You Can't Hide From DARPA
Harnessing Infrastructure for Building Reconnaissance (HIBR).

Squishy SquishBot ChemBots Desired By DARPA
ChemBots are soft, flexible robots that are able to alter their shape to squeeze through small openings and then regain their full size.

Fracture Putty For Compound Fractures - DARPA
An alternative to today's standard treatments, which often lead to further complications and are not fully load-bearing.

Submersible Aircraft - DARPA's Flying Sub?
The minimal required airborne tactical radius of the sub-plane is 1000 nautical miles (nm).

MAHEM Metal Jets Like Clarke's Stiletto Beam
Create compressed magnetic flux generator (CMFG)-driven magneto hydrodynamically formed metal jets and self-forging penetrators (SFP).

Precision Urban Hopper Robot Must 'Stick' Landings
Intended to give wheeled robots an additional edge; the ability to jump up onto or over obstacles up to nine meters high.

Katana Mono-Wing Rotorcraft Nano Air Vehicle
The Katana Mono-Wing Rotorcraft is a coin-sized one-bladed helicopter.

Micro Imagers For Sensing On Nano Air Vehicles
With the impetus toward micro-air and -ground vehicles for military applications, there is a compelling need for imaging micro-sensors compatible with these small platforms.

RESURRECT High-Fidelity Computer Battlefield Simulations
Create high-fidelity computer simulations of in-theatre events for tactical, operational and strategic review.

Aqua Sciences Water From Atmospheric Moisture
The program focused on creating water from the atmosphere using low-energy systems.

Shape-Shifting Bomber In Need Of Plowsharing
Shape-shifting supersonic bomber fans are feeling bereft this weekend.

Automated Mammalian Training Devices
The development of an automated mammalian training device would significantly reduce the need for human involvement.

RISE Robot: Six-Legged BIODYNOTICS Runaway
These Robots In Sensorial Environments are being developed by researchers from Carnegie Mellon.

DARPA Vulture Five Year Flying Wing
Vulture is intended to fly for periods of up to five years unattended at 65,000 feet.

LSTAT-lite Life Support For Trauma and Transport-lite Demoed
LSTAT has been around since 1999; however, the LSTAT-lite is considerably lighter and more affordable.

LANdroid WiFi Robots
DARPA is soliciting proposals for intelligent autonomous radio relay nodes.

HI-MEMS: Control Circuits Embedded In Pupal Stage Successfully
Researchers have succeeded in implanting electronic circuit probes into tobacco hornworms as early pupae.

HI-MEMS: Cyborg Beetle Microsystem
The University of Michigan team has successfully created a cyborg unicorn beetle microsystem.

DARPA Wants Exoskeletons
DARPA thinkers are saying that maybe humans themselves need an upgrade.

DARPA Urban Challenge For Autonomous Vehicles
Urban Challenge consists of four sets of vehicle behavior requirements.

IR Chemical Communication Graffiti Tags Wanted By DARPA
The Chemical Communications (ChemComm) program objective is to encode and transmit information in a rapid and covert manner.

Hybrid Insect MEMS Sought By DARPA For Bug Army
HI-MEM-based bug armies? Our friends at DARPA seem to have cyborgs on the brain.

DARPA's 'BigDog' Robot Now In Puppy Stage
Project seeks to create algorithms that help multi-legged platforms learn to walk in varied terrain.

DARPA Urban Challenge - KITT, Put Up Or Shut Up
Autonomous ground vehicles will take to a mocked-up urban area to negotiate a 60-mile course.

Star Wars Binoculars A Cognitive Technology Threat Warning
They've dubbed the device "Luke's Binoculars," after the device used by Luke Skywalker in the original Star Wars movie.

DARPA Radar Scope Can Sense Thru Walls
Handheld radar scope can provide troops with an ability that was formerly the province of science fictional superheroes alone.

BigDog Quadruped Robot Update
Good progress on Ray Bradbury's mechanical hound from Fahrenheit 451.

DARPA Wants Exoskeletons
In a briefing today, a variety of projects from DARPA (Defense Advanced Research Projects Agency) demonstrate that some science fiction thinking is good.

Shark Cyborgs On DARPA Remote control
In those Jaws movies, the shark seemed like it was out to get you. DARPA makes this dream come true.

Bradbury's Mechanical Hound and DARPA's BigDog Robot
In his chilling 1953 novel Fahrenheit 451, Ray Bradbury created the mechanical hound, a robot that accompanied the firemen and helped with their work... DARPA has made a multi-million dollar investment in the soldier of the future's best friend - BigDog.

DARPA Seeks Metabolic Dominance
DARPA has initiated a new program called "Metabolic Dominance" to assure that soldiers have superior physiological qualities. Frank Herbert had the answer sooner, though.

DARPA's Walrus and Griffith's War-Balloons
Not your great-grandfather's airship, the Walrus will be able to lift a fighting force.

DARPA's Radiation Decontamination (And 'Doc' Smith's Dekon)
DARPA and a host of scientists are working on decontamination techniques for dirty bombs.

Obtaining Unobtainium at DARPAtech 2004
DARPA searches for impossible materials - unobtainium - and is succeeding.

Springtail EFV-4B Personal Air Vehicle From Trek Aerospace
The Springtail EFV-4B Personal Air Vehicle (PAV) is a fourth-generation vertical take-off and landing (VTOL) craft powered by a single engine.

Trauma Pod Battlefield Medical Treatment System
DARPA has awarded a $12 million contract to develop an automated medical treatment system that can receive, assess and stabilize wounded soldiers immediately following injury. The trauma pod would be used to treat soldiers on the battlefield.

Cormorant Submarine/Sea Launched MPUAV
The Cormorant submarine and sea launched vehicle concept may remind you of science fiction glories past.

Terminator Tether - EDT Solution To Space Debris Update
Studies have shown that low Earth orbit is not a limitless resource and should be managed more carefully. Some sort of debris-mitigation measures are needed to solve the problem of old, unusable satellites and space junk.

HELLADS: Lightweight Laser Cannon
Ultra-light High Energy Liquid Lasers are coming.
Title: Re: Solar Tsunami is cover story for Haarp Top Secret Plasma Weapon
Post by: Dig on August 04, 2010, 02:49:14 am
10 weeks ago...

Star Wars' meets reality?

Military testing laser weapons
By Dan Vergano, USA TODAY

An engineer adjusts a mirror in the "wall of fire," a zigzag-shaped optical path used by the Airborne Laser Test Bed's missile-killing high-energy laser, during a test at the Lockheed Martin facility in Sunnyvale, Calif., in 2003. The "wall of fire" was part of the 6,100-pound beam transfer assembly installed in the Airborne Laser Test Bed aircraft at Edwards Air Force Base, Calif.

At left: an infrared image of the Missile Defense Agency's Airborne Laser Testbed destroying a threat-representative short-range ballistic missile, Feb. 11, 2010. A high-energy laser mounted on a U.S. military aircraft shot down a ballistic missile in the first successful test of the weapon, the agency said on Feb. 12. The experiment was carried out off the central California coast at Point Mugu Naval Air Warfare Center.

Are we finally witnessing the dawn of the "death ray"?

Five decades after the creation of the laser, the ubiquitous technology of the modern era may be ready to serve up that Star Wars science-fiction staple: the laser blaster. Advances in the technology have made it possible for military testers to shoot down incoming mortar rounds with land-based lasers, and military commanders are on the verge of being able to fire laser blasts from the air that could be aimed at tanks or mines.

"We literally are the invisible death ray, let me tell you," says Mike Rinn of Boeing's Airborne Laser Program in Seattle, a missile-defense effort, one among dozens of Defense Department-supported "directed energy" programs run by military contractors such as Boeing, Raytheon and Northrop Grumman.
Title: Re: Solar Tsunami is cover story for Haarp Top Secret Plasma Weapon Test
Post by: Dig on August 04, 2010, 03:09:32 am
Two researchers attribute Norway light to HAARP, anti-ET space-based weapon of mass destruction
December 12, 7:34 PM

Following a cross-volley of interpretations of the Dec. 9, 2009 blue-green light vortex over Norway on the eve of Barack Obama’s Nobel Prize acceptance speech (Russian missile; ETs destroying Russian missile), two researchers have independently posited that the cause was HAARP, a space-based weapon of mass destruction, one of whose antenna fields is close to the site of the Norwegian spiral light.

According to one of the researchers, David Wilcock (see below), one of his confidential sources stated the Norway spiral light was part of Project Blue Beam.  Mr. Wilcock's interview appears on the Project Camelot Radio show with Kerry Cassidy.

One of the alleged purposes of Project Blue Beam is the use of advanced electromagnetic imaging such as that produced by HAARP and exhibited in the Norway spiral light as a psychological mass conditioning device in aid of the implementation of a global corporate new world order.
Title: Re: Solar Tsunami is cover story for Haarp Top Secret Plasma Weapon Test
Post by: phosphene on August 04, 2010, 03:11:52 am

Power Failure in Canada During 1989

On March 13th, 1989, a huge solar-induced magnetic storm played havoc with the ionosphere and the earth's magnetic field. This storm, the second largest experienced in the past 50 years, totally shut down Hydro-Quebec, the power grid servicing Canada's Quebec province.

Montreal, March 15, 1989

Hydro-Quebec confirms that the March 13 blackout was caused by the strongest magnetic storm ever recorded since the 735-kV power system was commissioned. At 2:45 a.m., the storm, which resulted from a solar flare, tripped five lines from James Bay and caused a generation loss of 9,450 MW. With a load of some 21,350 MW at that moment, the system was unable to withstand this sudden loss and collapsed within seconds, thereby causing further loss of generation from Churchill Falls and Manic-Outardes.

Magnetic storms affect power system behaviour mainly in that they cause transformer saturation, which reduces or distorts voltage. Hydro-Quebec's long lines and static compensators make the system particularly sensitive to such natural phenomena. For example, in analysing the events that caused the March 13 blackout, the utility's experts noted a coincidence between the exceptional intensity of the magnetic storm and the tripping of several static compensators, especially at the Chibougamau and La Verendrye substations. Immediately after this loss, records show voltage oscillations and power swings increasing until the lines from James Bay were lost. Within seconds, the whole grid was out of service.

The system-wide blackout resulted in a loss of some 19,400 MW in Quebec and 1,325 MW of exports. An additional load of 625 MW was also being exported from generating stations isolated from the Hydro-Quebec system.

Service restoration took more than nine hours. This can be explained by the fact that some of the essential equipment, particularly on the James Bay transmission network, was made unavailable by the blackout. Generation from isolated stations normally intended for export was repatriated to meet Quebec's needs and the utility purchased electricity from Ontario, New Brunswick and the Alcan and McLaren Systems.

By noon, the entire generating and transmission system was back in service, although 17 percent of Quebec customers were still without electricity. In fact, several distribution-system failures occurred because of the high demand typical of Monday mornings, combined with the jump in heating load after several hours without power.

Material Prepared by Richard Thompson. © Copyright IPS - Radio and Space Services.
Comments or suggestions can be directed to [email protected]
Title: Re: Solar Tsunami is cover story for Haarp Top Secret Plasma Weapon Test
Post by: Dig on August 04, 2010, 03:12:38 am
HAARP, Haiti, Brzezinski and the NWO
By Jerry Mazza Online Journal Associate Editor


Yet a 1990 government document claims that the radio frequency (RF) power bolt can drive the ionosphere to “unnatural” activities. Quoting the authors: “. . . at the highest HF powers available in the West, the instabilities commonly studied are approaching their maximum RF energy dissipative capability, beyond which the plasma process will ‘runaway’ until the next limiting factor is reached.” The program operates out of the University of Alaska Fairbanks (in Sarah Palin-land), providing a ground-based “Star Wars” technology, offering a relatively inexpensive defense shield.

But the University also boasts that HAARP is capable of the most mind-boggling geophysical manipulations since nuclear bombs. It’s based on the work of electrical genius Nikola Tesla and the work and patents of Texas physicist Bernard Eastlund. The military has deliberately underestimated the deadly possibilities of this uber technology, most pointedly in this case its potential to create earthquakes with bolts of electrical power aimed at specific targets.

In fact, HAARP’s potential for havoc drew the attention of none other than Zbigniew Brzezinski, former national security adviser to Jimmy Carter, science advisor to President Johnson, and political advisor to President Obama.

More than 25 years ago, when Brzezinski was a professor at Columbia University, he wrote, “Political strategists are tempted to exploit research on the brain and human behavior [another strange purpose HAARP can be put to]. Geophysicist Gordon J.F. MacDonald, a specialist in problems of warfare, says accurately-timed, artificially-excited electronic strokes could lead to a pattern of oscillations that produce relatively high power levels over certain regions of the earth . . . in this way one could develop a system that would seriously impair the brain performance of very large populations in selected regions over an extended period.”

He capped this statement with “no matter how deeply disturbing the thought of using the environment to manipulate behavior for national advantages, to some, the technology permitting such use will very probably develop within the next few decades.” Let me tell you, dear readers, it’s here.

As of 1970, Brzezinski predicted HAARP could be used for “a more controlled and directed society” linked to technology. This society “would be dominated by an elite group which impresses voters by allegedly superior scientific know-how.” Furthermore, Dr. Strangelove states, “Unhindered by the restraints of traditional liberal values, this elite [the New World Order of today] would not hesitate to achieve its political ends by using the latest modern techniques for influencing public behavior and keeping society under close surveillance and control. Technical and scientific momentum would then feed on the situation it exploits.”

And thus spake Brzezinski, who also predicted that it would take an inciting incident like Pearl Harbor (i.e., 9/11) to engage the normally peaceful American population to go to war on a march for world hegemony (i.e., The War on Terror). And he was spot on.

Zbig is not afraid and, in fact, is lauded for thinking down avenues that would make most of us shiver with disgust. Regrettably, his forecasts tend to prove accurate, because they inspire the worst people to do the worst things. And so, these “tools for the elite” and the temptation to use them increase incredibly. The policies to use them are in place. As to the “stepping stones” that could be used to reach this highly controlled techno-society, Brzezinski expected them to be “persisting social crisis” and the use of mass media to gain the public’s confidence. Again, he’s spot on.

Way back in 1966, Professor Gordon J.F. MacDonald, then associate director of the Institute of Geophysics and Planetary Physics at the University of California, Los Angeles, was a member of President Johnson’s Science Advisory Committee and later a member of the President’s Council on Environmental Quality. He wrote a chapter called “How to Wreck the Environment” for the book Unless Peace Comes. Of course, this came at the height of the Vietnam brutality. Given an aura of violence similar to today’s, MacDonald described in his chapter, among other things, “polar ice cap melting or destabilization, ozone depletion techniques, earthquake engineering, ocean wave control and brain wave manipulation using the planet’s energy fields.”
Title: Re: Solar Tsunami is cover story for Haarp Top Secret Plasma Weapon Test
Post by: Dig on August 04, 2010, 03:16:56 am
Chapter from Unless Peace Comes
by Gordon J. F. MacDonald U.S.A.

Professor MacDonald is associate director of the Institute of Geophysics and Planetary Physics at the University of California, Los Angeles. His researches have embraced a remarkable diversity of natural phenomena, and his professional interests are further extended by his participation in national science policy-making. He is a member of President Johnson’s Science Advisory Committee.

Among future means of obtaining national objectives by force, one possibility hinges on man’s ability to control and manipulate the environment of his planet. When achieved, this power over his environment will provide man with a new force capable of doing great and indiscriminate damage. Our present primitive understanding of deliberate environmental change makes it difficult to imagine a world in which geophysical warfare is practised. Such a world might be one in which nuclear weapons were effectively banned and the weapons of mass destruction were those of environmental catastrophe. Alternatively, I can envisage a world of nuclear stability resulting from parity in such weapons, rendered unstable by the development by one nation of an advanced technology capable of modifying the Earth’s environment. Or geophysical weapons may be part of each nation’s armoury. As I will argue, these weapons are peculiarly suited for covert or secret wars.

Science fiction literature contains many suggestions of how wars would progress if man indeed possessed the ability to change weather, climate, or ocean currents. Many of these fictional suggestions, and other more serious discussions, fail to take into account the limitations of nature. Jules Verne gave a detailed discussion of displacing the Earth’s polar caps, thus making the world’s climatic zones more equable (Les Voyages Extraordinaires; Sans Dessus Dessous, Hetzel, 1889). Verne’s proposal was to eliminate the 23° tilt in the Earth’s axis, putting it at right angles to the Sun-Earth plane. However, as Verne correctly pointed out in a subsequent discussion, the Earth’s equatorial bulge stabilizes our planet, and even the launching of a 180,000-ton projectile would produce a displacement of only 1/10 micron. Senator Estes Kefauver, Vice-Presidential candidate in the 1956 American election, rediscovered Verne’s original proposal and was seriously concerned with the tipping of the Earth’s axis. He reported that the Earth’s axis could, as the result of an H-bomb explosion, be displaced by 10°. Either Senator Kefauver or his scientific advisers neglected the stabilizing influence of the Earth’s bulge. The maximum displacement that can be expected from the explosion of a 100-megaton H-weapon is less than one micron, as Walter Munk and I pointed out in our book, Rotation of the Earth (Cambridge, 1960).

Substantial progress within the environmental sciences is slowly overcoming the gap between fact and fiction regarding manipulations of the Earth’s physical environment. As these manipulations become possible, history shows that attempts may be made to use them in support of national ambitions. To consider the consequences of environmental modification in struggles among nations, we need to consider the present state of the subject and how postulated developments in the field could lead, ten to fifty years from now, to weapons systems that would use nature in new and perhaps unexpected ways.

The key to geophysical warfare is the identification of the environmental instabilities to which the addition of a small amount of energy would release vastly greater amounts of energy. Environmental instability is a situation in which nature has stored energy in some part of the Earth or its surroundings far in excess of that which is usual. To trigger this instability, the required energy might be introduced violently by explosions or gently by small bits of material able to induce rapid changes by acting as catalysts or nucleating agents. The mechanism for energy storage might be the accumulation of strain over hundreds of millions of years in the solid Earth, or the super-cooling of water vapour in the atmosphere by updraughts taking place over a few tens of minutes. Effects of releasing this energy could be world-wide, as in the case of altering climate, or regional, as in the case of locally excited earthquakes or enhanced precipitation.


The Earth’s atmosphere is an envelope of air which rotates, for the most part, at the same speed as the underlying continents and oceans. The relative motion between the atmosphere and the Earth arises from sources and sinks of energy which vary in location and strength but which have, as their ultimate source, the Sun’s radiation. The quantities of energy involved in weather systems exceed by a substantial margin the quantity of energy under man’s direct control.

For instance, the typical amount of energy expended in a single tornado funnel is equivalent to about fifty kilotons of explosives; a single thunderstorm tower exchanges about ten times this much energy during its lifetime; an Atlantic hurricane of moderate size may draw from the sea more than 1,000 megatons of energy. These vast quantities of energy make it unlikely that brute-force techniques will lead to sensible weather modification. Results could be achieved, however, by working on the instabilities in the atmosphere.
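The scale of these figures is easier to grasp in joules. A minimal sketch, using the standard conversion of 4.184 × 10¹² joules per kiloton of TNT (a value assumed here, not given in the text):

```python
# Order-of-magnitude comparison of the storm energies quoted above.
KILOTON_J = 4.184e12  # joules per kiloton of TNT equivalent

tornado_J = 50 * KILOTON_J                # ~50 kilotons per tornado funnel
thunderstorm_J = 10 * tornado_J           # a thunderstorm tower, ~10x a tornado
hurricane_J = 1_000 * 1_000 * KILOTON_J   # a moderate hurricane, ~1,000 megatons

for name, e in [("tornado", tornado_J),
                ("thunderstorm", thunderstorm_J),
                ("hurricane", hurricane_J)]:
    print(f"{name:12s} ~ {e:.1e} J  ({e / hurricane_J:.0e} of a hurricane)")
```

Even the hurricane, at roughly 4 × 10²¹ joules, dwarfs anything deliverable by brute force, which is the essay's point about exploiting instabilities instead.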

We are now beginning to understand several kinds of instabilities in the atmosphere. Supercooled water droplets in cold clouds are unstable, but they remain liquid for substantial periods of time unless supplied with nuclei on which they can freeze. Conversion of water droplets to ice through the introduction of artificial nuclei can provide a local source of energy. This released heat can cause rising air currents which in turn lead to further formation of supercooled water. This process may lead to rainfall at the ground greater than that which would have been produced without the artificial nucleation. A second instability may arise, in which water vapour condenses into water, again affecting the distribution of sensible energy. On a larger scale, there is the so-called baroclinic instability of atmospheric waves that girdle the planet. Through the imbalance of heat between equator and pole, energy in this instability is stored, to be released in the creation of large cyclonic storms in the temperate zones. There are other, less well understood instabilities capable of affecting climate; I shall return to them later.

What is the present situation with respect to weather modification and what might reasonably be expected in the future? Experiments over the past eighteen years have demonstrated unequivocally that clouds composed of supercooled water droplets can be transformed into ice-crystal clouds by seeding them with silver iodide, ‘dry ice’ (frozen carbon dioxide) and other suitable chemical agents. This discovery has been applied operationally in the clearance of airports covered by supercooled ground fog. No analogous technique has yet evolved for clearing warm fog, although several promising leads are now being investigated. In the case of warm fog, the atmospheric instability is that water distributed in small drops contains more surface energy than the same water distributed in large drops. The trick for clearance of this warm fog will be to discover some way of getting the small drops to organize themselves into larger ones and then fall to the ground.
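The surface-energy instability described here can be made quantitative: for a fixed volume of water, the total droplet surface area scales as 1/r, so finely dispersed fog stores far more surface energy than the same water gathered into raindrops. A sketch, with an illustrative surface tension and an assumed total volume:

```python
# Surface energy of a fixed volume V of water dispersed as drops of radius r.
# Total surface area = 3V/r, so halving the radius doubles the stored energy.
GAMMA = 0.0728      # J/m^2, water-air surface tension near 20 C
V = 1e-3            # m^3 of liquid water (one litre), an assumed total

def surface_energy(radius_m: float) -> float:
    """Total surface energy of volume V dispersed as drops of this radius."""
    return GAMMA * 3 * V / radius_m

fog = surface_energy(10e-6)    # 10-micron fog droplets
rain = surface_energy(1e-3)    # 1-mm raindrops
print(f"fog: {fog:.2f} J, rain: {rain:.4f} J, ratio {fog / rain:.0f}")
```

Coaxing the small drops into large ones releases that hundredfold excess, which is why warm-fog clearance is framed as finding a trigger rather than supplying energy.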

There is increasing, though inconclusive, evidence that rainfall from some types of clouds and storm systems in temperate regions can be increased by ten to fifteen per cent by seeding. Somewhat more controversial evidence indicates that precipitation can be increased from tropical cumulus by techniques similar to those employed in temperate regions. Preliminary experiments on hurricanes have the aim of dissipating the clouds surrounding the eye of the storm in order to spread the energy of the hurricane and reduce its force. The results are controversial but indicate that seeding can, in certain circumstances, lead to a marked growth in the seeded cloud. This possibility may have merit in hurricane modification, but experimentation has not yet resulted in a definitive statement.

Regarding the suppression of lightning, there is mixed but largely promising evidence that the frequency of cloud-to-ground strokes can be reduced by the introduction of ‘chaff’, strips of metallic foil of the kind used for creating spurious echoes in enemy radars.

In looking to the future, it is quite clear that substantial advances will be made in all of these areas of weather modification. Today, both military and civilian air transport benefit from progress in the clearance of ground fog. Further progress in the technology of introducing the seeding agent into the fog makes it likely that this type of fog dispersal will become routine. In a sense, fog clearing is the first military application of deliberate manipulation of weather, but it is, of course, very limited.

Large field programmes are being undertaken in the United States to explore further the possibility of enhancing precipitation, particularly in the western and north-eastern states. On the high ground of the western states, snow from winter storms provides much of the country’s moisture. Investigations are under way to see if seeding can lead to an increased snowpack and thus enhance the water resources. Intense interest in this form of weather modification, coupled with an increased investigation of the physics of clouds, is likely to lead to effective cloud modification within the next five to fifteen years. At present, the effects are measured only statistically and too little has been done in cloud observation before and after seeding in the way of precisely pinpointing which clouds are most likely to be affected.

As far as military applications are concerned, I conjecture that precipitation enhancement would have a limited value in classical tactical situations, and then only in the future when controls are more thoroughly understood. One could, for example, imagine field commanders calling for local enhancement of precipitation to cover or impede various ground operations. An alternative use of cloud seeding might be applied strategically. We are presently uncertain about the effect of seeding on precipitation down wind from the seeded clouds. Preliminary analysis suggests that there is no effect 200-300 miles down wind, but that continued seeding over a long stretch of dry land clearly could remove sufficient moisture to prevent rain 1,000 miles down wind. This extended effect leads to the possibility of covertly removing moisture from the atmosphere so that a nation dependent on water vapour crossing a competitor country could be subjected to years of drought. The operation could be concealed by the statistical irregularity of the atmosphere. A nation possessing superior technology in environmental manipulation could damage an adversary without revealing its intent.

Modification of storms, too, could have major strategic implications. As I have mentioned, preliminary experiments have been carried out on the seeding of hurricanes. The dynamics of hurricanes and the mechanism by which energy is transferred from the ocean into the atmosphere supporting the hurricane are poorly understood. Yet various schemes for both dissipation and steering can be imagined. Although hurricanes originate in tropical regions, they can travel into temperate latitudes, as the residents of New England know only too well. A controlled hurricane could be used as a weapon to terrorize opponents over substantial parts of the populated world.

It is generally supposed that a hurricane draws most of its energy from the sea over which it passes. The necessary process of heat transfer depends on wave action which permits the air to come in contact with a volume of water. This interaction between the air and water also stirs the upper layers of the ocean and permits the hurricane to draw on a substantially larger reservoir of heat than just the warm surface water. There may be ways, using monomolecular films of materials like those developed for covering reservoirs to reduce evaporation, for decreasing the local interaction between sea and air and thus preventing the ocean from providing energy to the hurricane in an accelerated fashion. Such a procedure, coupled with selective seeding, might provide hurricane guidance mechanisms. At present we are a long way from having the basic data and understanding necessary to carry out such experiments; nevertheless, the long-term possibility of developing and applying such techniques under the cover of nature’s irregularities presents a disquieting prospect.


In considering whether or not climate modification is possible, it is useful to examine climate variations under natural conditions. Firm geological evidence exists of a long sequence of Ice Ages in the relatively recent past, which shows that the world’s climate has been in a state of slow evolution. There is also good geological, archaeological and historical evidence for a pattern of smaller, more rapid fluctuations superimposed on the slow evolutionary change. For example, in Europe the climate of the early period following the last Ice Age was continental, with hot summers and cold winters. In the sixth millennium B.C., there was a change to a warm, humid climate with a mean temperature 5ºF higher than at present and a heavy rainfall that caused considerable growth of peat. This period, known as a climatic optimum, was accentuated in Scandinavia by a land subsidence which permitted a greater influx of warm Atlantic water into the Baltic Sea.

The climatic optimum was peculiar. While on the whole there was a very gradual decrease of rainfall, the decrease was interrupted by long droughts during which the surface peat dried. This fluctuation occurred several times, the main dry periods being from 2000 to 1900, 1200 to 1000 and 700 to 500 B.C. The last, a dry heat wave lasting approximately 200 years, was the best developed. The drought, though not sufficiently intense to interrupt the steady development of forests, did cause extensive migrations of peoples from drier to wetter regions.

A change to colder and wetter conditions occurred in Europe about 500 B.C. and was by far the greatest and most abrupt alteration in climate since the end of the last Ice Age. It had a catastrophic effect on the early civilization of Europe: large areas of forest were killed by the rapid growth of peat and the levels of the Alpine lakes rose suddenly, flooding many of the lake settlements. This climatic change did not last long; by the beginning of the Christian era, conditions did not differ greatly from current ones. Climatic variations have continued to occur since then and, although none has been as dramatic as that of 500 B.C., a perturbation known as the little ice age of the seventeenth century is a recent noteworthy example. The cause of these historical changes in climate remains shrouded in mystery. The rapid changes of climate in the past suggest to many that there exist instabilities affecting the balance of solar radiation.

Indeed, climate is primarily determined by the balance between incoming short-wave radiation from the Sun (principally light) and outgoing long-wave radiation from the Earth (principally heat).

Three factors dominate the balance: the energy of the Sun, the surface character of terrestrial regions (water, ice, vegetation, desert, etc.), and the transparency of the Earth’s atmosphere to different forms of radiated energy. In the last connection, the effect of clouds in making cool days and relatively warm nights is a matter of familiar experience. But clouds are a manifestation rather than an original determinant of weather and climate; of more fundamental significance is the effect of gases in the atmosphere, which absorb much of the radiation in transit from the Sun to the Earth or from the Earth into space. Intense X-rays and ultra-violet from the Sun, together with high-energy atomic particles, are arrested in the upper atmosphere. Only the narrow band of visible light and some short radio waves traverse the atmosphere without serious interruption.
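The balance among these three factors can be written as a single equilibrium: absorbed sunlight equals radiated heat, giving an effective temperature T = [S(1 − α)/4σ]^¼. A minimal check, using standard modern values for the solar constant and albedo (assumptions, not figures from the essay):

```python
# Radiative balance: absorbed short-wave = radiated long-wave.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0         # solar constant, W m^-2 (modern value)

def effective_temp(albedo: float) -> float:
    """Equilibrium temperature balancing absorbed sunlight against radiated heat."""
    return (S * (1 - albedo) / (4 * SIGMA)) ** 0.25

print(f"albedo 0.30 -> {effective_temp(0.30):.0f} K")
print(f"albedo 0.35 -> {effective_temp(0.35):.0f} K")  # more ice and cloud cover
```

A five-point rise in albedo cools the planet by several degrees, which is why the surface character of terrestrial regions (ice versus open water) figures so heavily in the climate schemes discussed below.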

There has been much controversy in recent years about conjectured overall effects on the world’s climate of emissions of carbon dioxide to the atmosphere from furnaces and engines burning fossil fuels, and some about possible influences of the exhaust from large rockets on the transparency of the upper atmosphere. Carbon dioxide placed in the atmosphere since the start of the industrial revolution has produced an increase in the average temperature of the lower atmosphere of a few tenths of a degree Fahrenheit. The water vapour that may be introduced into the stratosphere by the supersonic transport may also result in a similar temperature rise. In principle it would be feasible to introduce material into the upper atmosphere that would absorb either incoming light (thereby cooling the surface) or outgoing heat (thereby warming the surface). In practice, in the rarefied and windswept upper atmosphere, the material would disperse rather quickly, so that military use of such a technique would probably rely upon global rather than local effects. Moreover, molecular material will tend to decompose, and even elemental materials will eventually be lost by diffusion into space or precipitation to the surface. At intermediate levels, in the stratosphere, materials may tend to accumulate though the mixing time for this part of the atmosphere is certainly less than ten years and may be a few months. If a nation’s meteorologists calculated that a general warming or cooling of the Earth was in their national interest, improving their climate while worsening others, the temptation to release materials from high-altitude rockets might exist. At present we know too little about the paradoxical effects of warming and cooling, however, to tell what the outcome might be.

More sudden, perhaps much briefer but nevertheless disastrous, effects are predictable if chemical or physical means were developed for attacking one of the natural constituents of the atmosphere: ozone. A low concentration of ozone (O3, a rare molecular form of oxygen) in a layer between fifteen and fifty kilometres altitude has the utmost significance for life on land. It is responsible for absorbing the greater part of the ultra-violet from the Sun. In mild doses, this radiation causes sunburn; if the full force of it were experienced at the surface, it would be fatal to all life – including farm crops and herds – that could not take shelter. The ozone is replenished daily, but a temporary ‘hole’ in the ozone layer over a target area might be created by physical or chemical action. For example, ultra-violet at 250 millimicrons wavelength decomposes ozone molecules, and ozone reacts readily with a wide range of materials.

At present, we can only tentatively speculate about modifying the short-wave radiation at its source, the Sun. We have discovered major instabilities on the Sun’s surface which might be manipulated many years hence. In a solar flare, for example, 10¹⁰ megatons of energy are stored in distorted magnetic fields. With advanced techniques of launching rockets and setting off large explosions, we may sometime in the future learn to trigger these instabilities. For the near future, however, modification will lie not in the short-wave incoming radiation but in the long-wave outgoing radiation.

The usual schemes for modifying climate involve the manipulation of large ice fields. The persistence of these large ice fields is due to the cooling effects of the ice itself, both in reflecting (rather than absorbing) incoming short-wave radiation and in radiating heat at a higher rate than the usual ground cover. A commonly suggested means of climate modification involves thin layers of coloured material spread on an icy surface, thus inhibiting both the reflection and radiation processes, melting the ice, and thereby altering the climate. Such a procedure presents obvious technical and logistic difficulties. For example, if one wished to create a surface coating of as little as one micron thickness to cover a square 1,000 kilometres on a side, the total material for this extremely thin coating would weigh a million tons or more, depending upon its density. So the proposals to dust some of the globe’s extended ice sheets from the air are unrealistic; they reflect a brute-force technique, taking no advantage of instabilities within the environment.
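The arithmetic behind the million-ton estimate is a one-line volume calculation: a one-micron film over a square 1,000 km on a side. A sketch, with illustrative densities (the essay names no particular material):

```python
# Mass of a 1-micron coating over a 1,000 km x 1,000 km square.
thickness_m = 1e-6
side_m = 1_000e3
volume_m3 = thickness_m * side_m**2   # 1e6 cubic metres of coating

for name, density_kg_m3 in [("water-like material", 1000.0),
                            ("denser mineral dust", 2500.0)]:
    tonnes = volume_m3 * density_kg_m3 / 1000.0
    print(f"{name}: {tonnes:.1e} tonnes")
```

Even for the lightest plausible material the coating comes to a million tonnes, confirming the text's point that surface dusting is a logistic non-starter.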

While it may be technologically difficult to change an ice cap’s surface character, and thus its thermal properties, it may be possible to move the ice, taking into account the gravitational instability of ice caps. The gravitational potential energy of water as a thick, high ice cap is much greater than it would be at sea level. This fact makes it possible, at least in principle, to devise schemes for bringing about a redistribution in the ice. Indeed, A. T. Wilson has proposed a cyclical theory for the Ice Ages based on this instability.

The main points of Wilson’s theory are as follows:

-Antarctica is covered by an ice sheet several kilometres thick. Pressure at the bottom of the ice is great enough to keep the ice at or near its melting point; water is an unusual material in that a pressure increase lowers rather than raises its melting point. An increase in thickness of the ice sheet could result in melting at the bottom. The resulting ice-water mixture along the sole of the glacier would permit flow by a process of freezing and melting – a flow process much more effective than ordinary plastic flow.

-If such an instability occurs, the ice sheet will flow out on to the surrounding sea and a large ice shelf will be formed between Antarctica and the ocean around it. As a consequence, short-wave solar radiation will be reflected and there will be enhanced loss of heat by radiation at the long wave-lengths, causing cooling and the inducement of world-wide glaciation.

-Once the ice shelf is in the ocean, it will begin to melt and eventually will be removed. The ice remaining on land will be much thinner than before. As the reflectivity of the southern hemisphere decreases with the melting of the Antarctic ice cap, the global climate will grow warmer again, corresponding to the start of an interglacial period. The ice cap will slowly form again.

Commenting on Wilson’s theory, J. T. Hollin has noted the possibility of a catastrophic surge or advance of the ice sheet, such as has been recorded from small glaciers on numerous occasions. The largest surge yet reported is probably that of the ice cap in Spitsbergen which advanced up to twenty-one kilometres on a front of thirty kilometres sometime between 1935 and 1938. There are also reports of glacial advances at speeds up to 100 metres per day. Hollin speculates that, once the bottom-melting phase of a gravitationally unstable ice cap is reached, it will move quickly. In addition to trapped geothermal heat melting the ice at the bottom, there are additional contributions from frictional heat generated as the glacier scrapes along the solid ground.

If the speculative theory of Wilson is correct (and there are many attractive features to it) then a mechanism does exist for catastrophically altering the Earth’s climate. The release of thermal energy, perhaps through nuclear explosions along the base of an ice sheet, could initiate outward sliding of the ice sheet which would then be sustained by gravitational energy. One megaton of energy is sufficient to melt about 100 million tons of ice. One hundred megatons of energy would convert a layer of ice 0.1 cm thick, covering the entire Antarctic ice cap, into a thin layer of water. Lesser amounts of energy suitably placed could undoubtedly initiate the outward flow of the ice.
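Taking the essay's own conversion (one megaton melts about 100 million tons of ice) together with a modern figure of roughly 14 million km² for the Antarctic ice sheet (an assumption not stated in the text), the 0.1 cm layer can be checked:

```python
# Check of the melt-layer figure using the essay's own megaton-to-ice rate.
TONS_ICE_PER_MEGATON = 1e8        # the essay's conversion
ICE_DENSITY = 917.0               # kg/m^3
ANTARCTIC_AREA_M2 = 14e6 * 1e6    # ~14 million km^2, assumed modern value

melted_kg = 100 * TONS_ICE_PER_MEGATON * 1000.0   # 100 megatons' worth of ice
layer_m = melted_kg / ICE_DENSITY / ANTARCTIC_AREA_M2
print(f"melt layer over the whole ice cap: {layer_m * 100:.2f} cm")
```

The result, about 0.08 cm, is consistent with the roughly 0.1 cm the text quotes.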

What would be the consequences of such an operation? The immediate effect of this vast quantity of ice surging into the water, if velocities of 100 metres per day are appropriate, would be to create massive tsunamis (tidal waves) which would completely wreck coastal regions even in the northern hemisphere. There would then follow marked changes in climate brought about by the suddenly changed reflectivity of the Earth. At a rate of 100 metres per day, the centre of the ice sheet would reach the land’s edge in forty years.
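The forty-year figure follows directly from the assumed surge speed and the scale of the ice sheet:

```python
# Timescale check: at 100 metres per day, how far does the ice front travel
# in forty years? (The centre of the Antarctic sheet lies roughly 1,500 km
# from its edge.)
speed_m_per_day = 100.0
years = 40
distance_km = speed_m_per_day * 365 * years / 1000.0
print(f"{distance_km:.0f} km travelled in {years} years")
```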

Who would stand to benefit from such an application? The logical candidate would be a landlocked equatorial country. An extended glacial period would ensure near-Arctic conditions over much of the temperate zone, but temperate climate with abundant rainfall would be the rule in the present tropical regions.



The foregoing perhaps represents a more positive view of weather and climate modification than that held by many Earth scientists. I believe this view is justified as it is based on three scientific and technological advances. First, understanding of basic meteorology has advanced to such an extent that mathematical models of the atmosphere have been developed incorporating the most important elements. Physical processes in clouds, in turbulent exchanges at the surface, and in transmission of radiation through the atmosphere are no longer as mysterious as they once were. The volumes simulated by the models range from the size of a single cloud to the entire atmosphere; these models are no longer primitive representations.

Secondly, the advent of high-speed computers enables atmospheric models to be studied in greater detail. These computers have a peculiar importance to weather modification, since they will enable scientists to carry out extended experiments to test whether or not various schemes for manipulating the atmosphere are indeed possible and what the outcome should be.
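A small example of the kind of numerical experiment meant here is the Lorenz (1963) convection model, a three-variable caricature of atmospheric flow whose extreme sensitivity to initial conditions was discovered in exactly this sort of computer run. A sketch using simple Euler integration (the step size and perturbation are illustrative choices, not anything from the essay):

```python
def lorenz_step(x, y, z, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz (1963) convection equations."""
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 1.0)
b = (1.0, 1.0, 1.0 + 1e-6)   # identical state, perturbed in the sixth decimal
for _ in range(4000):        # 20 model time units
    a = lorenz_step(*a)
    b = lorenz_step(*b)
print("separation after 20 time units:", abs(a[0] - b[0]))
```

The two runs, initially indistinguishable, end up on entirely different parts of the attractor: precisely the property that makes computer experiments indispensable before any scheme for "manipulating the atmosphere" could be assessed.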

The third advance lending support to expectations for weather and climate modification is the new array of instruments developed to observe and detect changes in the atmosphere. The most dramatic and perhaps the most powerful is the meteorological satellite which provides a platform whence the atmosphere can be observed, not only in geographically inaccessible regions, but also with entirely new physical measurements. For example, meteorological satellites of the future will permit the determination of humidity, temperature and pressure as averaged over substantial volumes of the atmosphere, providing quantities which are needed to develop the mathematical models. Sophisticated surface instrumentation, for observing detailed processes within smaller parts of the atmosphere, provides us with far more powerful tools with which to look at clouds and at the interaction of the atmosphere with its boundaries than those which were available ten or twenty years ago.


What causes earthquakes? Over geological time, the irregular distribution of heat-producing radioactive elements in the rock layers gives rise to sub-surface temperature differences between various parts of the Earth. In the continents, granites and similar rocks have concentrated radioactive elements near the surface; no similar concentration has taken place in the sub-oceanic regions, which may as a result be more than 100ºC cooler than the corresponding sub-continental regions. Such variations in temperature along a horizontal line, due to the differences in the vertical distribution of heat-producing elements, give rise to large thermal stresses, causing strain analogous to that which cracks a glass tumbler filled with hot water. The strain tends to be greatest in regions of abrupt temperature change along a horizontal line through the Earth’s crust. The strain may be partially relieved by the slow convective flow of material in the deep Earth which is thought by some geophysicists to push continents about. But the strain can also be relieved by sharp fractures or by movements along previous faults in rocks near the surface. Movement along a fault radiates energy outward, which results in an earthquake. Each year approximately 200 megatons of strain energy is released in this fashion, the largest earthquakes corresponding to energy of the order of 100 megatons. The energy released depends on the volume of material affected. The largest earthquakes take place along faults having a linear dimension of 1,000 kilometres, whereas smaller ones take place along faults of one kilometre or less.
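The quoted energies can be translated into familiar earthquake magnitudes with the Gutenberg-Richter energy relation, log₁₀E = 1.5M + 4.8 with E in joules; this is a standard seismological relation, not one the essay itself invokes:

```python
import math

# Convert released strain energy (in megatons of TNT) to magnitude via the
# Gutenberg-Richter energy relation: log10(E) = 1.5*M + 4.8, E in joules.
MEGATON_J = 4.184e15

def magnitude(megatons: float) -> float:
    energy_j = megatons * MEGATON_J
    return (math.log10(energy_j) - 4.8) / 1.5

print(f"100-megaton event -> M {magnitude(100):.1f}")   # the largest earthquakes
print(f"  1-megaton event -> M {magnitude(1):.1f}")
```

The 100-megaton figure corresponds to a magnitude around 8.5, matching the great Pacific-belt earthquakes the next paragraph describes.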

Major earthquakes tend to be located along two main belts. One belt, along which about eighty-five per cent of the total energy is released, passes around the Pacific and affects countries whose coastlines border this ocean, for example Japan and the west coast of North America. The second belt passes through the Mediterranean regions eastwards through Asia and joins the first belt in Indonesia. Along these two belts, large earthquakes occur with varying frequencies.

In California, a large earthquake might be expected once every 50 to 100 years, while Chile might expect such a disturbance once every ten to twenty years. Sometimes major earthquakes have occurred in regions ordinarily thought of as being free from risk. For example, the New Madrid earthquake of 1811-12 devastated a large area of central North America but had only slight cultural effects because of the area’s sparse population.

Today, our detailed understanding of the mechanism that causes an earthquake and of how the related instabilities can be triggered is limited. Only within the last few years have serious discussions of earthquake prediction begun, whereas moderately reliable weather forecasts have been available for about the last thirty to fifty years.

Currently, substantial effort is being made, primarily by Japan and the United States, to develop techniques for forecasting earthquakes. These techniques are based to a large extent on the determination of changing strain conditions of materials in the rocks surrounding recognized fault zones. Of possible value is the observation that, before an earthquake, the accumulating strain accelerates.

Control of earthquakes is a prospect even more distant than that of forecasting, although two techniques have been suggested through recent experience.

In the course of the underground testing of nuclear weapons at the Nevada test site, it was observed that an explosion apparently released local strain in the Earth. The hypothesis is that the stress pulse accompanying the sudden release of energy in an explosion discharges accumulated strain energy over a large volume of the surrounding material.

Another method of releasing strain energy has emerged from the pumping of water underground in the vicinity of Denver, Colorado, which has led to a series of small earthquakes. The hypothesis here is that the underground water has provided local lubrication, permitting adjacent blocks to slip by one another.

The use as a weapon system of the strain-energy instability within the solid Earth requires an effective triggering mechanism. A scheme for pumping water seems clumsy and easily detectable. On the other hand, if the strain pattern in the crust can be accurately determined, the phased or timed release of energy from smaller faults, activated by small explosions, could be used to trigger a large fault at some distance. For example, the San Andreas fault zone, passing near Los Angeles and San Francisco, is part of the great earthquake belt surrounding the Pacific. Good knowledge of the strain within this belt might permit the setting off of the San Andreas zone by timed explosions in the China Sea and Philippine Sea. In contrast with certain meteorological operations, it would seem rather unlikely that such an attack could be carried out covertly under the guise of natural earthquakes.


We are still in the very early stages of developing the theory and techniques for predicting the state of the oceans. In the past two decades, methods have been devised for the prediction of surface waves and surface wind distribution. A warning system for the tsunamis (tidal waves) produced by earthquakes has also been developed.

Certain currents within the oceans have been identified, but we do not yet know what the variable components are; that is, what the weather within the ocean is. Thus we have not been able to identify any instabilities within the oceanic circulation that might be easily manipulated. As in the case of the solid Earth, we can only speculate tentatively about how oceanic processes might be controlled.

One instability offering potential as a future weapon system is that associated with tsunamis. These frequently originate from the slumping into the deep ocean of loosely consolidated sediments and rocks perched on the continental shelf. Movement of these sediments can trigger the release of vast quantities of gravitational energy, part of which is converted into the motion of the tsunami. For example, if, along a 1,000-kilometre edge of a continental shelf, a block 100 metres deep and 10 kilometres wide were dropped a distance of 100 metres, about 100 megatons of energy would be released. This release would be catastrophic to any coastal nation. How could it be achieved? A series of phased explosions, perhaps setting off natural earthquakes, would be a most effective way. I could even speculate on planning a guided tidal wave, where guidance is achieved by correctly shaping the source which releases the energy.
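The 100-megaton figure can be checked as gravitational potential energy, E = ρVgh. A sketch, assuming an effective buoyancy-corrected sediment density of 1,000 kg/m³ (the essay states no density); the result lands within a factor of two or three of the essay's round number, as expected for an order-of-magnitude estimate:

```python
# Potential energy of the slumping shelf block described above.
MEGATON_J = 4.184e15
G = 9.81
length_m, width_m, thick_m, drop_m = 1_000e3, 10e3, 100.0, 100.0
eff_density = 1000.0   # kg/m^3: sediment minus displaced seawater (assumption)

volume_m3 = length_m * width_m * thick_m       # 1e12 cubic metres
energy_j = eff_density * volume_m3 * G * drop_m
print(f"released energy ~ {energy_j / MEGATON_J:.0f} megatons")
```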


At heights of forty to fifty miles above the Earth’s surface, substantial numbers of charged particles are found which make this part of the atmosphere, the ionosphere, a good conductor of electricity. The rocks and oceans are also more conducting than the lower atmosphere. Thus, we live in an insulating atmosphere between two spherical conducting shells or, as the radio engineer would put it, in an Earth-ionosphere cavity, or waveguide. Radio waves striking either conducting shell tend to be reflected back into the cavity, and this phenomenon is what makes conventional long-distance radio communication possible. Only recently, however, has there been any interest in natural electrical resonances within the Earth-ionosphere waveguide. Like any such cavity, the Earth-ionosphere waveguide will tend to sustain radio oscillation at certain frequencies in preference to others. These resonant frequencies are primarily determined by the size of the Earth and the speed of light, but the properties of the ionosphere modify them to a certain extent. The lowest resonances begin at about eight cycles per second, far below the frequencies ordinarily used for radio communication. Because of their long wavelength and small field strength, they are difficult to detect. Moreover, they die down quickly, within 1/16 second or so; in engineering terms, the cavity has a short time constant.

The natural resonant oscillations are excited by lightning strokes, cloud-to-ground strokes being a much more efficient source than horizontal cloud-to-cloud discharges. On the average, about 100 lightning strokes occur each second (primarily concentrated in the equatorial regions) so that normally about six lightning flashes are available to introduce energy before a particular oscillation dies down. A typical oscillation’s field strength is of the order of 0.3 millivolts per metre.

The power of the oscillations varies geographically. For example, for a source located on the equator in Brazil the maximum intensity of the oscillation is near the source and at the opposite side of the Earth (around Indonesia). The intensity is lower in intermediate regions and towards the poles.

One can imagine several ways in which to increase the intensity of such electrical oscillations. The number of lightning strokes per second could be enhanced by artificially increasing their original number. Substantial progress has been made in the understanding of the physics of lightning and of how it might be controlled. The natural oscillations are excited by randomly occurring strokes. The excitation of timed strokes would enhance the efficiency with which energy is injected into an oscillation. Furthermore, the time constant of the oscillation would be doubled by a four-fold increase in the electrical conductivity of the ionosphere, so that any scheme for enhancing that conductivity (for example, by injecting readily ionized vapour) lowers the energy losses and lengthens the time constant, which would permit a greater number of phased lightning strokes before the decay of an oscillation.
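The last sentence implies a square-root scaling: the cavity's time constant grows as the square root of ionospheric conductivity, so a four-fold conductivity increase doubles it. A minimal sketch of that stated relationship (my own illustration, not from the original text):

```python
import math

def time_constant_ratio(conductivity_factor: float) -> float:
    """Ratio of new to old cavity time constant, assuming the time
    constant scales as the square root of ionospheric conductivity,
    as the text implies (tau2/tau1 = sqrt(sigma2/sigma1))."""
    return math.sqrt(conductivity_factor)

print(time_constant_ratio(4.0))  # a four-fold conductivity increase doubles tau
```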

The enhanced low-frequency electrical oscillations in the Earth-ionosphere cavity relate to possible weapons systems through a little understood aspect of brain physiology. Electrical activity in the brain is concentrated at certain frequencies, some of it extremely slow, with a little activity around five cycles per second, and very conspicuous activity (the so-called alpha rhythm) around ten cycles per second.

Some experiments have been done in the use of a flickering light to pull the brain’s alpha rhythm into unnatural synchrony with it; the visual stimulation leads to electrical stimulation. There has also been work on direct electrical driving of the brain. In experiments discussed by Norbert Wiener, a sheet of tin is suspended from the ceiling and connected to a generator working at ten cycles per second. With large field strengths of one or two volts per centimetre oscillating at the alpha-rhythm frequency, decidedly unpleasant sensations are noted by human subjects.

The Brain Research Institute of the University of California is investigating the effect of weak oscillating fields on human behaviour. The field strengths in these experiments are of the order of a few hundredths of a volt per centimetre. Subjects show small but measurable degradation in performance when exposed to oscillating fields for periods of up to fifteen minutes.

The field strengths in these experiments are still much stronger, by a factor of about 1,000, than the observed natural oscillations in the Earth-ionosphere cavity. However, as previously noted, the intensity of the natural fluctuations could be increased substantially and in principle could be maintained for a long time, as tropical thunderstorms are always available for manipulation. The proper geographical location of the source of lightning, coupled with accurately-timed, artificially-excited strokes, could lead to a pattern of oscillations that produced relatively high power levels over certain regions of the Earth and substantially lower levels over other regions. In this way, one could develop a system that would seriously impair brain performance in very large populations in selected regions over an extended period.

The scheme I have suggested is admittedly far-fetched, but I have used it to indicate the rather subtle connections between variations in man’s environmental conditions and his behaviour. Perturbation of the environment can produce changes in behaviour patterns. Since our understanding of both behavioural and environmental manipulation is rudimentary, schemes of behavioural alteration on the surface seem unrealistic. No matter how deeply disturbing the thought of using the environment to manipulate behaviour for national advantage is to some, the technology permitting such use will very probably develop within the next few decades.


Deficiencies both in the basic understanding of the physical processes in the environment and in the technology of environmental change make it highly unlikely that environmental modification will be an attractive weapon system in any direct military confrontation in the near future. Man already possesses highly effective tools for destruction. Eventually, however, means other than open warfare may be used to secure national advantage. As economic competition among many advanced nations heightens, it may be to a country’s advantage to ensure a peaceful natural environment for itself and a disturbed environment for its competitors. Operations producing such conditions might be carried out covertly, since nature’s great irregularity permits storms, floods, droughts, earthquakes and tidal waves to be viewed as unusual but not unexpected. Such a ‘secret war’ need never be declared or even known by the affected populations. It could go on for years with only the security forces involved being aware of it. The years of drought and storm would be attributed to unkindly nature and only after a nation were thoroughly drained would an armed take-over be attempted.

In addition to their covert nature, a feature common to several modification schemes is their ability to affect the Earth as a whole. The environment knows no political boundaries; it is independent of the institutions based on geography and the effects of modification can be projected from any one point to any other on the Earth.

Because environmental modification may be a dominant feature of coming decades, there is concern that this incipient technology is in total conflict with many of the traditional geographical and political units and concepts.

Political, legal, economic and sociological consequences of deliberate environmental modification, even for peaceful purposes, will be of such complexity that perhaps all our present involvements in nuclear affairs will seem simple. Our understanding of basic environmental science and technology is primitive, but still more primitive are our notions of the proper political forms and procedures to deal with the consequences of modification. All experience shows that less significant technological changes than environmental control finally transform political and social relationships. Experience also shows that these transformations are not necessarily predictable, and that guesses we might make now, based on precedent, are likely to be quite wrong. It would seem, however, that these non-scientific, non-technological problems are of such magnitude that they deserve consideration by serious students throughout the world if society is to live comfortably in a controlled environment.

Author’s note: In the section on weather modification I have drawn heavily on Weather and Climate Modification (National Academy of Sciences, National Research Council, Washington, 1966). A. T. Wilson’s paper on ‘Origin of Ice Ages’ appeared in Nature, vol. 201, pp. 147-9 (1964), and J. T. Hollin’s comments in vol. 208, pp. 12-16 (1965). Release of tectonic strain by underground nuclear explosion was reported by F. Press and C. Archambeau in Journal of Geophysical Research, vol. 67, pp. 337-43 (1962), and man-made earthquakes in Denver by D. Evans in Geotimes, vol. 10, pp. 11-18. I am grateful to J. Homer and W. Ross Adey of the Brain Research Institute of the University of California at Los Angeles, for information on the experimental investigation of the influence of magnetic fields on human behaviour.
Title: Re: Solar Tsunami is cover story for Haarp Top Secret Plasma Weapon Test
Post by: Dig on August 04, 2010, 04:55:37 am
Remember this test of the Star Wars defense system, which they said was a BS missile launch for a BS rogue satellite. Yeah, right!

CASE STUDY: US Nuclear Missile Defense Space Weapon Test

Part of this board:

Space Based Weapons Technology

Title: Re: Solar Tsunami is cover story for Haarp Top Secret Plasma Weapon Test
Post by: Dig on August 07, 2010, 01:36:48 am
From over a month ago...

Attack sequence 1...

AJ caller: "HAARP will initiate untested plasma shields in July"

AJ said: "how do you even know about this stuff, that would be highly classified material"

the issue is that there is a possibility of not being able to stop the experiment once it starts, similar to the scientists who knew there was a risk the atom bomb would destroy the world. But you know our fine MIC... gotta be on the cutting edge. We need plasma shields to protect us from Al-CIA-duh!

anyone know about this?

Problem 1...

Pakistan floods displace millions, aid welcome from US or from militants
As Pakistan struggles with its worst floods in 80 years, much has been made in the West of the influence of hard-line Islamist charities providing relief in a region where the US is trying to win hearts and minds. But in Pakistan, the hope appears simply to be to get as much help to the region as possible, and quickly. On Friday, the United States announced an additional $25 million for flood relief in Pakistan, taking its total contribution to $35 million. At the same time, Jamaat-ud-Dawa, a group banned by the United Nations Security Council for its links with the militant outfit Lashkar-e-Taiba, believed to be behind the Mumbai (Bombay) attacks in India, has some 3,000 volunteers working around the country and is operating nine medical camps. The floods have claimed at least 1,600 lives, displaced more than 4 million people, and affected as many as 12 million, as waters spread from the country’s militancy-affected northwest through the bread-basket of Punjab. As far south as Sindh, home to Karachi, the country’s financial hub, river banks are bursting.


Problem 2...

Moscow chokes under a blanket of smog as wildfires rage out of control in Russian heatwave
These eerie pictures show the city of Moscow shrouded in dense smog today as wildfires that have killed 50 people nationwide continue to burn. Flights at international airports have been grounded and visibility in the capital is down to a few dozen yards as the fires continue to tear through forests and villages. The massive blaze has caused airborne pollutants, including carbon monoxide, to be four times higher than average - the worst seen to date in Moscow. Russians wore protective face masks as dozens of forest and peat bog fires around the city continued to burn, fanned by south-easterly winds and the country's most intense heat wave in 130 years. More than 500 separate blazes are active today, mainly across Russia's European territory, according to the Emergencies Ministry.


Problem 3...

Argentina Has Colder Winter Than Antarctica, Spurring Record Power Imports
Argentina is importing record amounts of energy as the coldest winter in 40 years drives up demand and causes natural-gas shortages, prompting Dow Chemical Co. and steelmaker Siderar SAIC to scale back production.  Electricity supplied from Brazil and Paraguay rose to a daily combined record of about 1,000 megawatts on July 12, while consumption peaked at 20,396 megawatts three days later, according to Buenos Aires-based energy broker Cammesa. Shipments of liquefied natural gas are set to double this year.  Dow, Siderar and aluminum maker Aluar Aluminio Argentino SAIC are among companies closing plants, cutting output or seeking alternative energy sources after temperatures in parts of Argentina fell below those of Antarctica on July 15. Rising demand is exacerbating a shortage that began six years ago as economic growth accelerated and energy investment fell. The shortage is boosting costs as companies spend more to guarantee supplies.



UN panel: New taxes needed for a climate fund
BONN, Germany – Carbon taxes, add-ons to international air fares and a levy on cross-border money movements are among ways being considered by a panel of the world's leading economists to raise a staggering $100 billion a year to fight climate change. British economist Nicholas Stern told international climate negotiators Thursday that government regulation and public money also will be needed to create incentives for private investment in industries that emit fewer greenhouse gases. In short, a new industrial revolution is needed to move the world away from fossil fuels to low carbon growth, he said. "It will be extremely exciting, dynamic and productive," said Stern, one of 18 experts in public finance on an advisory panel appointed by U.N. Secretary-General Ban Ki-moon. A climate summit held in Copenhagen in December was determined to mobilize $100 billion a year by 2020 to help poor countries adapt to climate change and reduce emissions of carbon dioxide trapping the sun's heat. But the 120 world leaders who met in the Danish capital offered no ideas on how to raise that sum — $1 trillion every decade — prompting Ban to appoint his high-level advisory group.




AJ's show with Dr. Blaylock was right on the money:

Today Alex welcomes back Dr. Russell Blaylock, board certified neurosurgeon, author and lecturer, to discuss lithium and fluoride being added to local water supplies.

Comment from MarkCentury on another thread:

In the 2008 movie "The Happening," mankind is assaulted by biochemical weapons that destroy people's natural survival instincts, leading victims to fearlessly commit suicide by walking into a lion's den or throwing themselves under a lawn mower.

In the final frames of this movie we see a school bus and on that bus a large number is visible:  


Now 2010 has arrived and we read bizarre stories of shrimp drugged on Prozac committing suicide as they fearlessly throw themselves into danger.

Today Alex Jones released an impassioned video "The Media pushes brain eating vaccines" where he warns us that the same attack is being carried out against mankind.

Vaccines are being used to deliver live-virus bioweapons that re-engineer our brains by destroying our natural survival instincts. This comes at the same time that a sudden chorus of academics is pushing to sedate the masses by adding toxic lithium to the water supply. As Alex explains, this combined biochemical assault will render the masses unable to resist the globalists as they carry out the next phases of their incremental plan of global enslavement and extermination.

Those seeking the destruction of mankind operate under a code requiring them to give advance notice.  Could the prominent but fleeting date of "2010" in the movie "The Happening" have been a warning that this is the year we would all be thrown under the biochemical bus?


Read this article:

HAARP, Haiti, Brzezinski and the NWO
By Jerry Mazza Online Journal Associate Editor

My comment:

What is important to note is Jerry Mazza's evaluation and comparison of the two moments in historical context:

He actually wrote a chapter called “How to Wreck the Environment” in his book, Unless Peace Comes. Of course, this came at the height of the Vietnam brutality. Given the aura of violence similar to today’s, Gordon described in his chapter, among other things, “polar ice cap melting or destabilization, ozone depletion techniques, earthquake engineering, ocean wave control and brain wave manipulation using the planet’s energy fields.”

Also important to note is Brzezinski's invisible hand in all of these narratives pervasive among the elite controllers of society:

As of 1970, Brzezinski predicted HAARP could be used for “a more controlled and directed society” linked to technology. This society would be dominated by an elite group which “impresses voters by allegedly superior scientific know-how.” Furthermore, Dr. Strangelove states, “Unhindered by the restraints of traditional liberal values, this elite [the New World Order of today] would not hesitate to achieve its political ends by using the latest modern techniques for influencing public behavior and keeping society under close surveillance and control. Technical and scientific momentum would then feed on the situation it exploits.”

And thus spake Brzezinski, who also predicted that it would take an inciting incident like Pearl Harbor (i.e., 9/11) to engage the normally peaceful American population to go to war on a march for world hegemony (i.e., The War on Terror). And he was spot on.

Zbig is not afraid, and in fact is lauded, for thinking down avenues that would make most of us shiver with disgust. Regrettably, his forecasts tend to prove accurate, because they inspire the worst people to do the worst things. And so these “tools for the elite” multiply, and the temptation to use them increases incredibly. The policies to use them are in place. As to the “stepping stones” that could be used to reach this highly controlled techno-society, Brzezinski expected them to be “persisting social crisis” and the use of mass media to gain the public’s confidence. Again, he’s spot on.


Rockefeller Foundation predictions for 2012:
pandemic, economic collapse, marauding gangs

So, I was reading this article (I think it has already been linked to by other people on here, since Alan Watt linked to it a couple of days ago, but as far as I know this is the original article and it contains some more material that the link Alan Watt put up did not have), and it made mention of a consortium called Global Business Network.

The English Ideology and WIRED Magazine

Part Three Of Three

Techno-Utopianism: The Final Imperial Solution
by Mark Stahlman November 22nd, 1996

This snippet in particular piqued my curiosity:
No less chilling is the scenarios planning exercise that WIRED's wizards-behind-the-curtain perform on their multi-national clients. From General Motors to AT&T, the Global Business Network (GBN) charges hefty sums to show the yellow-brick-road towards "ByteCity" to strategic planners and top corporate brass. In one recent and rare public discussion of the results, GM's top planning team defined the three "alternative futures" which emerged after years of GBN counseling. The first is just like our world and, so by definition, is not very interesting. The second is an eco-fascist regime in which car designs are completely "Green" and the companies can only follow orders. The third is the fun one, however. This is the world in which armed gangs roam the streets and surface travel is a series of car chases. This scenario has already been anticipated with a Cadillac that includes armored protection and a "panic" button installed in the middle of the dashboard. The car has a satellite tracking system built in and it can call the local authorities (presumably your multi-national's private swat-team) and get help when you get trapped by the natives.

So naturally, I looked into Global Business Network. This consortium is kind of like the corporate overlord board of Wells' 'The Samurai' - every bigwig company you can imagine that is into some type of big industry is among its members.

Global Business Network

Funding / Members - Corporations

I don't have time to list all of the companies, so let's just list for the sake of interest some of the key companies that consider themselves to be members - these companies also provide for the organization's funding:

  • AT&T
  • Bechtel
  • Booz Allen & Hamilton
  • Coca-Cola Company
  • Dow Chemical
  • DuPont
  • EDS (Electronic Data Systems)
  • Fannie Mae
  • Freddie Mac
  • ExxonMobil
  • General Electric
  • Heineken (My note: The Dutch beer-brewer)
  • Hewlett Packard/HP
  • IBM (My note: Big surprise, huh?)
  • Intel
  • L'Oreal
  • Lucent Corporation
  • Microsoft
  • Monsanto
  • Procter & Gamble
  • Reuters
  • Sandia National Laboratories
  • Shell Oil/Dutch Royal Shell
  • Toyota
  • Sun Microsystems (My note: See, I don't want to hear any more crap from techies claiming Sun Microsystems was 'once' good prior to being taken over by the evil 'Oracle' - to hell with that - they were ALWAYS part of the hive mind. Stop sucking up to controlled opposition is my message to these people - forget about the SPARC processors and all that and realize these guys didn't give a rip about you, the individual, or any semblance of human dignity - same as all the other corporate technocrats)
  • Xerox

The list is far longer than this, but you get the idea - any corporation worth its salt is a member and/or provides funding for it.

Stewart Brand, Whole Earth Catalog, Wired Magazine, Global Business Network

This ties back into Stewart Brand - the guy behind the Whole Earth Catalog, the founding of Wired Magazine, and the main progenitor of this entire techno-utopia 'scientific dictatorship' that the Unabomber railed against. Coincidentally, he was also a co-founder of Global Business Network.

Brand is a co-founder of the Global Business Network (GBN) (1988) [4]; The Long Now Foundation (1995) [5], "whose core project is the construction of a 10,000 year clock called The Clock of the Long Now"; the ALL Species Foundation (2001) [6], "to find and document every life form on Earth" [7]; and the Long Bets Foundation (2001), "an arena for competitive, accountable predictions" and "to foster better long-term thinking." [8][9]

Anyway, here comes the big one - this is a report produced by the Rockefeller Foundation in cooperation with the Global Business Network in May 2010, and it lays out four possible future scenarios that the world might be headed into. Let's discuss this, shall we?

Scenarios for the Future of Technology and International Development
This report was produced by

The Rockefeller Foundation
and Global Business Network.
May 2010


Zbigniew Brzezinski pushes the Rockefeller agenda to justify more weapons against the American people on LIVE TV!

Zbigniew Brzezinski: Obama The New Mr. 'Malaise'
By Mark Finkelstein, Fri, 07/16/2010 - 08:35 ET
Can you hear the wailing and gnashing of teeth emanating from 1600 Pennsylvania Avenue?  It's Pres. Obama & Co. reacting to Zbigniew Brzezinski pinning on Barack Obama the word that doomed Jimmy Carter: "malaise." On Morning Joe, Carter's former national security adviser said there "is a sense of pervasive malaise" in America. What's worse, suggested Zbig, Pres. Obama hasn't been able to figure out how to deal with the malaise. Ruh-roh!

ZBIGNIEW BRZEZINSKI: I think we're now going through a phase in which there is a sense of pervasive malaise, which affects different groups in society in different ways. So people are dissatisfied; they're slightly worried; they don't see a good certain future for themselves or for the country, but in their own narrow sphere. There's no grand mobilizing idea. And I have a sense that Obama, who started so well, and who really captivated people—he captivated me!—has not been able yet to generate some sort of organizing idea for an age which combines a malaise that's pervasive and percolating, and complexity . . .

Leaked UN Documents Reveal Plan
For “Green World Order”
By 2012

Paul Joseph Watson, Prison Planet.com, Friday, February 26, 2010


Leaked policy documents reveal that the United Nations plans to create a “green world order” by 2012, which will be enforced by a structure of global governance and funded by a gargantuan $45 trillion transfer of wealth from richer countries, as the globalists’ insidious plan to centralize power and crush sovereignty while devastating the economy is exposed once again.

As we warned at the time, the failure of Copenhagen in December did not spell the end of the global warming heist; it was merely a roadblock in the UN’s agenda to create a world government funded by taxes paid by you on the very substance you exhale – carbon dioxide.

Using the justification of the thoroughly debunked hoax that carbon dioxide is a deadly threat to the planet, the UN is already working to resurrect the failed Copenhagen agreement, with a series of new Copenhagen-process negotiations set to take place in April, May and June.


Leaked planning documents (PDF) obtained by Fox News lift the lid on the UN’s plan to impose global governance by the time of their 2012 World Summit on Sustainable Development in Rio, which will mark the 20th anniversary of the notorious “Earth Summit” held in the same city.

“The new Rio summit will end, according to U.N. documents obtained by Fox News, with a “focused political document” presumably laying out the framework and international commitments to a new Green World Order,” reports Fox News’ George Russell.

“Just exactly what that environmental order will look like, and the extent of the immense financial commitments needed to produce it, are under discussion this week at a special session in Bali, Indonesia, of the United Nations Environment Program’s 58-nation “Governing Council/Global Ministerial Environmental Forum,” which oversees UNEP’s operations.”

The document outlines the globalist’s mission to enact a “radical transformation of the world economic and social order” by putting “a new treaty in place as the capstone of the Green World Order”.

This system will be managed by “an additional governing structure composed of exactly those insiders,” writes Russell.

“Moving towards a green economy would also provide an opportunity to re-examine national and global governance structures and consider whether such structures allow the international community to respond to current and future environmental and development challenges and to capitalize on emerging opportunities,” states the white paper (emphasis mine).

The imposition of such “global governance structures” will be achieved with the help of “vast wealth transfers” from richer countries (in the form of carbon taxes levied on citizens) to poorer nations, amounting to no less than $45 trillion. The paper also outlines the need to change the “consumption patterns” of people living in richer countries, which is undoubtedly a euphemism for lowering living standards.

The policy proposes that the old economic model be discarded in pursuit of a new global green economy focused around “green jobs”.

As we have previously highlighted, the promise that the creation of “green jobs” will offset the inevitable damage to the economy that a 50 per cent reduction in carbon dioxide emissions will cause is a complete fallacy.

The implementation of so-called “green jobs” in other countries has devastated economies and cost millions of jobs. As the Seattle Times reported in June, Spain’s staggering unemployment rate of over 18 per cent was partly down to massive job losses resulting from attempts to replace existing industry with wind farms and other forms of alternative energy.

In a so-called “green economy,” “Each new job entails the loss of 2.2 other jobs that are either lost or not created in other industries because of the political allocation — sub-optimum in terms of economic efficiency — of capital,” states the report.

As we have documented, a reduction in carbon dioxide emissions of 50-80 per cent would inflict a new great depression on the United States, reducing GDP by 6.9 per cent – a figure comparable with the economic meltdown of 1929 and 1930.

The UN’s mission to create a legally binding treaty on the reduction of CO2 emissions is running parallel with measures already being enforced at state level in the U.S., which bypass stuttering federal efforts to impose the cap-and-trade fraud.

The very foundation of the global warming argument has been completely eviscerated by the Climategate scandal, which proved that United Nations IPCC scientists forged and exaggerated data to “hide the decline” in global temperatures while engaging in witch hunts to cull dissenting opinions from appearing in IPCC reports.

Despite this, control freaks intent on taxing the life-giving gas carbon dioxide have signaled that they no longer care about the truth behind man-made climate change and have resolved to slam through their totalitarian agenda anyway. EPA head Lisa Jackson told reporters this week that “The science regarding climate change is settled, and human activity is responsible for global warming,” even though she failed to refute the fact that there had been no global warming since 1995, as was admitted by CRU scientist Professor Phil Jones.



"The threat of
environmental crisis
 will be the
'international disaster key'
 that will unlock the
New World Order."

Mikhail Gorbachev

General Secretary of the Communist Party of the Soviet Union, 1985-1991. Quoted in "A Special Report: The Wildlands Project Unleashes Its War On Mankind," by Marilyn Brannan, Associate Editor, Monetary & Economic Review, 1996, p. 5.
Title: Re: Solar Tsunami is cover story for Haarp Top Secret Plasma Weapon Test
Post by: chris jones on August 07, 2010, 12:28:36 pm
Mikhail Gorbachev  WAR ON MANKIND

General Secretary of the Communist Party of the Soviet Union, 1985-1991. Quoted in "A Special Report: The Wildlands Project Unleashes Its War On Mankind," by Marilyn Brannan, Associate Editor, Monetary & Economic Review, 1996.
Title: Re: Solar Tsunami is cover story for Haarp Top Secret Plasma Weapon Test
Post by: Dig on August 07, 2010, 04:38:40 pm

Massive ice island breaks off Greenland

Chapter from Unless Peace Comes

If the speculative theory of Wilson is correct (and there are many attractive features to it) then a mechanism does exist for catastrophically altering the Earth’s climate. The release of thermal energy, perhaps through nuclear explosions along the base of an ice sheet, could initiate outward sliding of the ice sheet which would then be sustained by gravitational energy. One megaton of energy is sufficient to melt about 100 million tons of ice. One hundred megatons of energy would convert a layer of ice 0.1 cm thick, covering the entire Antarctic ice cap, into a thin layer of water. Lesser amounts of energy suitably placed could undoubtedly initiate the outward flow of the ice.

What would be the consequences of such an operation? The immediate effect of this vast quantity of ice surging into the water, if velocities of 100 metres per day are appropriate, would be to create massive tsunamis (tidal waves) which would completely wreck coastal regions even in the northern hemisphere. There would then follow marked changes in climate brought about by the suddenly changed reflectivity of the Earth. At a rate of 100 metres per day, the centre of the ice sheet would reach the land’s edge in forty years.
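The forty-year figure follows directly from the quoted sliding speed. As a check (my own arithmetic, not part of the original chapter; the roughly 1,500 km centre-to-edge distance for the Antarctic ice sheet is an assumption):

```python
# Time for the centre of the ice sheet to reach the land's edge
# at the chapter's quoted sliding velocity of 100 metres per day.
speed_m_per_day = 100.0
distance_km = 1_500.0  # assumed centre-to-edge distance of the Antarctic sheet

days = distance_km * 1_000 / speed_m_per_day
years = days / 365
print(f"{years:.0f} years")  # comes out near the chapter's forty-year figure
```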

Who would stand to benefit from such an application? The logical candidate would be a landlocked equatorial country. An extended glacial period would ensure near-Arctic conditions over much of the temperate zone, but a temperate climate with abundant rainfall would be the rule in the present tropical regions.

Title: Re: Solar Tsunami is cover story for Haarp Top Secret Plasma Weapon Test
Post by: Dig on August 07, 2010, 10:25:33 pm
Russian heatwave kills 5,000 as fires rage out of control
Title: Re: Solar Tsunami is cover story for Haarp Top Secret Plasma Weapon Test
Post by: GlobalThinker on August 09, 2010, 09:50:40 pm
Russian heatwave kills 5,000 as fires rage out of control

I find it highly suspicious that suddenly we have all these disasters: Russian smog, Pakistan floods, German floods, the ice break-off. But that might just be me.
Title: Re: Solar Tsunami is cover story for Haarp Top Secret Plasma Weapon Test
Post by: Dig on August 09, 2010, 11:15:02 pm
I find it highly suspicious that suddenly we have all these disasters: Russian smog, Pakistan floods, German floods, the ice break-off. But that might just be me.

it is because we did not listen to ALgore
Title: Re: Solar Tsunami is cover story for Haarp Top Secret Plasma Weapon Test
Post by: GlobalThinker on August 09, 2010, 11:45:26 pm
lol @ manbearpig. Plus if what he says is true it should be a slow process anyway.

And with the oil spill and the China landslide, it just gives the people who support manbearpig another reason to demand a carbon tax... Yay!!

I think the question is: could HAARP cause more than one of these recent problems? And if the answer is yes, then the next obvious question is: did it? To me it wouldn't be so suspicious if they weren't using global warming as a way to force a carbon tax on us. Taxes do not fix the issue anyway.

Too hot? Global warming. Too cold? Global warming. Floods, earthquakes, volcanoes, etc., etc. Jeez, it's global warming.

Source: Reuters:
"Devastating floods in Pakistan and Russia's heatwave match predictions of extremes caused by global warming, even though it is impossible to blame mankind for single severe weather events, scientists say."