"There is nothing new under the sun."http://www.zdnet.co.uk/news/it-strategy/2004/03/01/us-software-blew-up-russian-gas-pipeline-39147917/
US software 'blew up Russian gas pipeline'
By Matt Loney, ZDNet.co.uk, 1 March, 2004 15:10
Faulty US software was to blame for one of the biggest non-nuclear explosions the world has ever seen, which took place in a Siberian natural gas pipeline, according to a new book published on Monday.
At the Abyss: An Insider's History of the Cold War, written by Thomas C. Reed, a former Air Force secretary who served in the US National Security Council during the Reagan administration, documents how software and other technology was deliberately created with flaws as part of US attempts to undermine the Soviet economy.
In his book, Reed says the pipeline explosion was just one example of "cold-eyed economic warfare" against the Soviet Union at a time when the US was trying to block Western Europe from importing Soviet natural gas. The CIA slipped the flawed software to the Soviets in a way they would not detect, according to Reed.
The book is likely to add fuel to the debate over open-source software, which many governments are examining with increasing interest. The Chinese government is one example: Red Flag Linux is gaining traction in China, while proprietary software companies such as Microsoft scramble to reassure governments that the closed-source model does not pose risks.
"In order to disrupt the Soviet gas supply, its hard currency earnings from the West, and the internal Russian economy, the pipeline software that was to run the pumps, turbines, and valves was programmed to go haywire, after a decent interval, to reset pump speeds and valve settings to produce pressures far beyond those acceptable to pipeline joints and welds," Reed wrote. "The result was the most monumental non-nuclear explosion and fire ever seen from space."
"While there were no physical casualties from the pipeline explosion, there was significant damage to the Soviet economy. Its ultimate bankruptcy, not a bloody battle or nuclear exchange, is what brought the Cold War to an end. In time the Soviets came to understand that they had been stealing bogus technology, but now what were they to do? By implication, every cell of the Soviet leviathan might be infected. They had no way of knowing which equipment was sound, which was bogus. All was suspect, which was the intended endgame for the operation."
The faulty software was slipped to the Russians after an agent recruited by the French and dubbed "Farewell" provided a shopping list of Soviet priorities, which focused on stealing Western technology.
Exactly one year ago, Chinese officials announced that the country had signed a pact with Microsoft that would give them access to the highly protected Windows operating system source code. Microsoft chairman Bill Gates hinted at the time that China would be privy to all, not just part, of the source code its government wished to inspect.
The Chinese government and military have previously stated their preference for the rival Linux operating system because its source code is made publicly available.
Source code makes it easier to understand the inner workings of an operating system, and without access to the code, governments like China fear that back doors may be installed to leak out sensitive information.
China is also said to be readying its own 64-bit server chip, as part of an effort to control more of the intellectual property that the country uses.
CIA plot led to huge blast in Siberian gas pipeline
By Alec Russell in Washington 12:00AM GMT 28 Feb 2004
A CIA operation to sabotage Soviet industry by duping Moscow into stealing booby-trapped software was spectacularly successful when it triggered a huge explosion in a Siberian gas pipeline, it emerged yesterday.
Thomas Reed, a former US Air Force secretary who was in Ronald Reagan's National Security Council, discloses what he called just one example of the CIA's "cold-eyed economic warfare" against Moscow in a memoir to be published next month.
Leaked extracts in yesterday's Washington Post describe how the operation caused "the most monumental non-nuclear explosion and fire ever seen from space" in the summer of 1982.
Mr Reed writes that the software "was programmed to reset pump speeds and valve settings to produce pressures far beyond those acceptable to pipeline joints and welds".
The CIA learned of Soviet ambitions to steal the software via a French KGB source, Col Vladimir Vetrov, codenamed Farewell. His job was to evaluate the intelligence collected by a shadowy arm of the KGB set up to run a network of industrial spies stealing technology from the West.
The breakthrough came when Vetrov told the CIA of a specific "shopping list" of software technology that Moscow was seeking to update its pipeline as it sought to export natural gas to Western Europe.
Washington was keen to block the deal and, after securing President Reagan's approval in January 1982, the CIA tricked the Soviet Union into acquiring software with built-in flaws.
"In order to disrupt the Soviet gas supply, its hard currency earnings from the West, and the internal Russian economy, the pipeline software that was to run the pumps, turbines and valves was programmed to go haywire after a decent interval, to reset pump speeds and valve settings to produce pressures far beyond those acceptable to pipeline joints and welds," Mr Reed writes.
The project exceeded the CIA's wildest dreams. There were no casualties in the explosion, but it was so dramatic that the first reports are said to have stirred alarm in Washington.
The initial reports led to fears that the Soviets had launched a missile from a place where rockets were not known to be based, or even had detonated "a small nuclear device", Mr Reed writes in his book.
While some of the details of the CIA's counter-offensive have emerged before, the sabotage of the gas pipeline has remained a secret until now. Mr Reed told the Post he had CIA approval to make the disclosures.
Mr Vetrov's spying was discovered by the KGB and he was executed in 1983.
_____________________________________

From the CIA's own library:
https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/csi-studies/studies/96unclass/farewell.htm
A Deception Operation
As was later reported in Aviation Week and Space Technology, CIA and the Defense Department, in partnership with the FBI, set up a program to do just what we had discussed: modified products were devised and "made available" to Line X collection channels. The CIA project leader and his associates studied the Farewell material, examined export license applications and other intelligence, and contrived to introduce altered products into KGB collection. American industry helped in the preparation of items to be "marketed" to Line X. Contrived computer chips found their way into Soviet military equipment, flawed turbines were installed on a gas pipeline, and defective plans disrupted the output of chemical plants and a tractor factory.
How to Prevent Cyber Espionage
Security expert Gadi Evron has plenty of experience helping governments fight cyber attacks. In this column, he offers a roadmap companies can use to prevent computer espionage
By Gadi Evron
October 22, 2008 — CSO
This column is about computer-based espionage and how we can defend our organizations against it. But I'd like to start with a mood piece of sorts. There has been too much noise about information warfare lately. Distributed denial of service and defacement attacks like what happened in Estonia and Georgia come to mind. The following two stories give a better understanding of what it is really about, without resorting to more scary stories about what China is or isn't doing. We'll also touch on other interesting cases such as the Israeli Trojan horse case, when we talk about defensive measures against computer-based espionage and targeted attacks.
The first is a report (without much detail or proof) on North Korea being involved in operations against South Korea using Trojan horses for espionage. The second is a lesson from history called the Farewell Dossier - a collection of intelligence documents KGB defector Colonel Vladimir Vetrov (code-named Farewell) handed over to NATO during the Cold War.
This information led to a mass expulsion of Soviet technology spies. The CIA also mounted a counter-intelligence operation that transferred modified hardware and software designs over to the Soviets, resulting in the spectacular incident of 1982, in which a huge explosion ripped apart a trans-Siberian pipeline. The explosion was so big that it was supposedly mistaken for a nuclear detonation by American decision makers, until the CIA said, "Oh, that's one of our operations."
It wasn't a bomb that destroyed the natural gas pipeline and sent shock waves through the economy of what was then the Soviet Union. Instead, it was a software virus created by the CIA, according to a book by Thomas Reed, a former U.S. Air Force secretary and National Security Council member.
What does this mean?
Destructive attacks are certainly significant and important to defend against, as they affect us directly no matter who the attacked party is or where in the world they are (DDoS attacks harm the Internet and all its users). But smarter, quieter attacks are all around us. How do we defend against them?
I expect most information warfare acts to be targeted, quiet, and covert. Espionage, or spying if you like, is not relevant to us unless we are the target. The diplomats and the intelligence communities of different countries can figure it out for us. It is an old occupation, and well covered by international law. Computers are simply another tool, or capability, to be used by these same people. There is nothing new here as far as how the game is played. And yet, what if you are a target?
Recognizing there is a threat
You may have to defend against computer-based espionage for your own employer. Recent case studies, as well as research, have shown that industrial espionage is indeed a big deal. Here are two examples:
One famous case from a few years ago, which I had the unfortunate opportunity to study as the lead incident response guy for the government, is the Israeli Trojan horse case. Leading IT companies (most of which were local Israeli branches of Fortune 100 companies) were spied on using a Trojan horse built by an incompetent programmer, leaving traces of itself everywhere on the affected systems. This went on for a long period of time, undetected by any of these companies.
The issue was only detected by chance, when the creator of the Trojan horse used it for his own private purposes and was discovered during an investigation into a harassment incident. The stolen information was fed directly to the victims' competitors, which included much of the rest of the Israeli IT industry. The espionage services themselves were rendered by civilian intelligence and investigation firms. In another Israeli case, the attackers broke into a local branch of the post office (which also operates as a small bank in Israel) and planted a wireless gateway connected to an internal switch. Through it they stole a few tens of thousands of shekels in the few days they were in operation. This case, too, was broken by complete chance.
In other cases, intelligence agencies of various countries, such as France, have been spying to make sure their own local companies have an edge when competing with companies from other countries.
Here is an interesting quote from "The Industrious Spies, Industrial Espionage in the Digital Age":
"This transition fosters international tensions even among allies. 'Countries don't have friends, they have interests!' screamed a DOE poster in the mid-1990s. France has vigorously protested U.S. spying on French economic and technological developments - until it was revealed to be doing the same."
Defending against computer-based espionage
For the purpose of defense, while I'd certainly hope for more resources (read: a larger budget) and would shift where I apply them, there is no inherent difference between defending your organization from computer-based espionage and protecting it against any Joe Hacker. The difference is that in espionage, the attacker has more resources, both technical and operational.
Some of what I would do differently
I'd concentrate more of my resources on network behavior analysis (for which, unfortunately, few tools exist, so good network security analysts remain the main alternative), as well as on social-engineering training and procedures. Further, I'd prioritize cooperation with the physical security part of the organization, and with HR (for personnel screening). I'd also consider putting up a good deterrent as part of my cyber security policy, both to add to the attacker's risk and to increase their cost.
First, I'd make myself too difficult a target, and let people know about it. Second, I'd invest anything I could spare in monitoring my network for anomalies and security incidents, starting with mapping what my network actually looks like. This raises the risk for opponents who cannot afford to be caught, and may scare them off; covertness is the name of the game, or they would have come in through the front door.

Entering an "industrial espionage defense" clause into your budget, or creating a five-year plan to better protect your organization from organized industrial espionage, may just get you a larger budget to cope with your organization's security needs.

Do you have something you'd do differently from (or in addition to) regular security practices when facing espionage from organized hackers? Any experience or thoughts are welcome.
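The anomaly monitoring described above can start very simply: build a per-host baseline of normal activity and flag deviations from it. The sketch below is a minimal illustration of that idea using standard-library statistics; the host names, connection counts, and three-sigma threshold are all invented for the example, and a real deployment would derive its baseline from your own flow logs rather than hard-coded numbers:

```python
from statistics import mean, stdev

# Hypothetical hourly outbound-connection counts per host; in practice
# this baseline would be built from your own network's flow records.
baseline = {
    "db-server": [12, 15, 11, 14, 13, 12, 16, 14],
    "workstation-7": [40, 35, 42, 38, 41, 37, 39, 44],
}

def is_anomalous(host: str, observed: int, threshold: float = 3.0) -> bool:
    """Flag a host whose current connection count deviates from its
    historical baseline by more than `threshold` standard deviations."""
    history = baseline[host]
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        # A perfectly flat history: any change at all is notable.
        return observed != mu
    return abs(observed - mu) / sigma > threshold

# A quiet exfiltration channel often shows up as a sustained deviation
# from a host's normal pattern rather than as a flood of traffic.
print(is_anomalous("db-server", 13))   # near the baseline mean
print(is_anomalous("db-server", 90))   # far outside the baseline
```

The point of the sketch is the workflow, not the statistics: you cannot flag "abnormal" traffic until you have mapped what "normal" looks like for each host, which is exactly why the mapping step comes first.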