Author Topic: CIA used US software industry to 'blow up a Russian gas pipeline' [Actual Title]  (Read 20407 times)


Offline Effie Trinket

  • member
  • Member
  • *
  • Posts: 2,292
"There is nothing new under the sun."

ZDNet UK / News and Analysis / Business of IT / IT Strategy

US software 'blew up Russian gas pipeline'

By Matt Loney, 1 March 2004 15:10

Faulty US software was to blame for one of the biggest non-nuclear explosions the world has ever seen, which took place in a Siberian natural gas pipeline, according to a new book published on Monday.

At the Abyss: An Insider's History of the Cold War, written by Thomas C. Reed, a former Air Force secretary who served in the US National Security Council during the Reagan administration, documents how software and other technology was deliberately created with flaws as part of US attempts to undermine the Soviet economy.

In his book, Reed says the pipeline explosion was just one example of "cold-eyed economic warfare" against the Soviet Union at a time when the US was trying to block Western Europe from importing Soviet natural gas. The CIA slipped the flawed software to the Soviets in a way they would not detect it, according to Reed.

The book is likely to add fuel to the debate over open-source software, which many governments are examining with increasing interest. The Chinese government is one example: Red Flag Linux is gaining traction in China, and proprietary software companies such as Microsoft are scrambling to reassure Beijing that the closed-source model does not pose risks.

"In order to disrupt the Soviet gas supply, its hard currency earnings from the West, and the internal Russian economy, the pipeline software that was to run the pumps, turbines, and valves was programmed to go haywire, after a decent interval, to reset pump speeds and valve settings to produce pressures far beyond those acceptable to pipeline joints and welds," Reed wrote. "The result was the most monumental non-nuclear explosion and fire ever seen from space."

"While there were no physical casualties from the pipeline explosion, there was significant damage to the Soviet economy. Its ultimate bankruptcy, not a bloody battle or nuclear exchange, is what brought the Cold War to an end. In time the Soviets came to understand that they had been stealing bogus technology, but now what were they to do? By implication, every cell of the Soviet leviathan might be infected. They had no way of knowing which equipment was sound, which was bogus. All was suspect, which was the intended endgame for the operation."
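The sabotage mechanism Reed describes, control code that runs normally for a "decent interval" and then drives pump and valve setpoints past what the joints and welds can withstand, is what we would now call a logic bomb. A deliberately simplified sketch in Python (every name, threshold, and the toy pressure model below are invented for illustration; nothing here reflects the actual code):

```python
MAX_SAFE_PRESSURE = 100.0        # hypothetical joint/weld tolerance (bar)
TRIGGER_AFTER = 90 * 24 * 3600   # the "decent interval": ~90 days, in seconds

def next_setpoints(elapsed_seconds):
    """Return (pump_speed, valve_opening) for the next control cycle.

    Behaves normally until the trigger time passes, then commands
    settings that push line pressure far past tolerance: the essence
    of the logic bomb described in the text.
    """
    if elapsed_seconds < TRIGGER_AFTER:
        return 0.6, 0.5   # nominal operation
    return 1.0, 0.05      # full pump speed into a nearly closed valve

def line_pressure(pump_speed, valve_opening):
    """Toy pressure model: rises with pump speed, falls as the valve
    opens. Purely illustrative, not real fluid dynamics."""
    return 100.0 * pump_speed / (0.2 + valve_opening)

# Normal operation stays within tolerance...
p_normal = line_pressure(*next_setpoints(elapsed_seconds=3600))
# ...but once the dormant period elapses, the same code is catastrophic.
p_bomb = line_pressure(*next_setpoints(elapsed_seconds=TRIGGER_AFTER + 1))
```

The point of the sketch is the shape of the attack: nothing distinguishes the sabotaged code path during testing or early operation, which is why, as Reed notes, the Soviets had no way of knowing which equipment was sound and which was bogus.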

The faulty software was slipped to the Russians after an agent recruited by the French and dubbed "Farewell" provided a shopping list of Soviet priorities, which focused on stealing Western technology.

Exactly one year ago, Chinese officials announced that the country had signed a pact with Microsoft that would give them access to the highly protected Windows operating system source code. Microsoft chairman Bill Gates hinted at the time that China would be privy to all, not just part, of the source code its government wished to inspect.

The Chinese government and military have previously stated their preference for the rival Linux operating system because its source code is made publicly available.

Source code makes it easier to understand the inner workings of an operating system, and without access to the code, governments like China fear that back doors may be installed to leak out sensitive information.

China is also said to be readying its own 64-bit server chip, as part of an effort to control more of the intellectual property that the country uses.

CIA plot led to huge blast in Siberian gas pipeline
By Alec Russell in Washington 12:00AM GMT 28 Feb 2004

A CIA operation to sabotage Soviet industry by duping Moscow into stealing booby-trapped software was spectacularly successful when it triggered a huge explosion in a Siberian gas pipeline, it emerged yesterday.

Thomas Reed, a former US Air Force secretary who was in Ronald Reagan's National Security Council, discloses what he called just one example of the CIA's "cold-eyed economic warfare" against Moscow in a memoir to be published next month.

Leaked extracts in yesterday's Washington Post describe how the operation caused "the most monumental non-nuclear explosion and fire ever seen from space" in the summer of 1982.

Mr Reed writes that the software "was programmed to reset pump speeds and valve settings to produce pressures far beyond those acceptable to pipeline joints and welds".

The CIA learned of Soviet ambitions to steal the software via a French KGB source, Col Vladimir Vetrov, codenamed Farewell. His job was to evaluate the intelligence collected by a shadowy arm of the KGB that had set up a network of industrial spies to steal technology from the West.

The breakthrough came when Vetrov told the CIA of a specific "shopping list" of software technology that Moscow was seeking to update its pipeline as it sought to export natural gas to Western Europe.

Washington was keen to block the deal and, after securing President Reagan's approval in January 1982, the CIA tricked the Soviet Union into acquiring software with built-in flaws.

"In order to disrupt the Soviet gas supply, its hard currency earnings from the West, and the internal Russian economy, the pipeline software that was to run the pumps, turbines and valves was programmed to go haywire after a decent interval, to reset pump speeds and valve settings to produce pressures far beyond those acceptable to pipeline joints and welds," Mr Reed writes.

The project exceeded the CIA's wildest dreams. There were no casualties in the explosion, but it was so dramatic that the first reports are said to have stirred alarm in Washington.

The initial reports led to fears that the Soviets had launched a missile from a place where rockets were not known to be based, or even had detonated "a small nuclear device", Mr Reed writes in his book.

While some of the details of the CIA's counter-offensive have emerged before, the sabotage of the gas pipeline has remained a secret until now. Mr Reed told the Post he had CIA approval to make the disclosures.

Mr Vetrov's spying was discovered by the KGB and he was executed in 1983.
From the CIA's own library:


A Deception Operation

As was later reported in Aviation Week and Space Technology, CIA and the Defense Department, in partnership with the FBI, set up a program to do just what we had discussed: modified products were devised and "made available" to Line X collection channels. The CIA project leader and his associates studied the Farewell material, examined export license applications and other intelligence, and contrived to introduce altered products into KGB collection. American industry helped in the preparation of items to be "marketed" to Line X. Contrived computer chips found their way into Soviet military equipment, flawed turbines were installed on a gas pipeline, and defective plans disrupted the output of chemical plants and a tractor factory.


How to Prevent Cyber Espionage

Security expert Gadi Evron has plenty of experience helping governments fight cyber attacks. In this column, he offers a roadmap companies can use to prevent computer espionage

By Gadi Evron

October 22, 2008 — CSO —

This column is about computer-based espionage and how we can defend our organizations against it. But I'd like to start with a mood piece of sorts.  There has been too much noise about information warfare lately. Distributed denial of service and defacement attacks like what happened in Estonia and Georgia come to mind.  The following two stories give a better understanding of what it is really about, without resorting to more scary stories about what China is or isn't doing. We'll also touch on other interesting cases such as the Israeli Trojan horse case, when we talk about defensive measures against computer-based espionage and targeted attacks.

The first is a report (without much detail or proof) on North Korea being involved in operations against South Korea using Trojan horses for espionage. The second is a lesson from history called the Farewell Dossier - a collection of intelligence documents KGB defector Colonel Vladimir Vetrov (code-named Farewell) handed over to NATO during the Cold War.

This information led to a mass expulsion of Soviet technology spies. The CIA also mounted a counter-intelligence operation that transferred modified hardware and software designs to the Soviets, resulting in the spectacular trans-Siberian incident of 1982, in which a huge explosion ripped apart a trans-Siberian pipeline. The blast was so big it was supposedly mistaken for a nuclear explosion by American decision makers until the CIA said, "Oh, that's one of our operations."

It wasn't a bomb that destroyed the natural gas pipeline and sent shock waves through the economy of what was then the Soviet Union. Instead, it was a software virus created by the CIA, according to a book by Thomas Reed, a former U.S. Air Force secretary and National Security Council member.

What does this mean?
While destructive attacks are certainly of significance and important to defend against as they impact us directly, regardless of who the attacked party is or where in the world they are (DDoS attacks harm the Internet and its users), smarter, quieter attacks are all around us. How do we defend against them?

I expect most information warfare acts to be targeted, quiet, and covert. Espionage, or spying if you like, is not relevant to us unless we are the target. The diplomats and the intelligence communities of different countries can figure it out for us. It is an old occupation, and well covered by international law. Computers are simply another tool, or capability, to be used by these same people. There is nothing new here as far as how the game is played.  And yet, what if you are a target?

Recognizing there is a threat

You may have to defend against computer-based espionage for your own employer. Recent case studies, as well as research, have shown industrial espionage is indeed a big deal, and here are two examples:

One famous case from a few years ago, which I had the unfortunate opportunity to study as the lead incident response guy for the government, is the Israeli Trojan horse case.  Leading IT companies (most of which were local Israeli branches of Fortune 100 companies) were spied on using a Trojan horse built by an incompetent programmer, leaving traces of itself everywhere on the affected systems. This went on for a long period of time, undetected by any of these companies.

The issue was only detected by chance, when the creator of the Trojan horse used it for his own private purposes and it was discovered during an investigation into a harassment incident. The stolen information was fed directly to the victims' competitors, which made up most of the rest of the Israeli IT industry. The services themselves were rendered by civilian intelligence and investigation firms.  In another Israeli case, the attackers broke into a local branch of the post office (which is also a small bank in Israel) and placed a wireless gateway connected to a switch inside. Through it they stole a few tens of thousands of shekels in the few days they were in operation. This case was also broken by complete chance.

In other cases, intelligence agencies for various countries such as France have been spying on their own to make sure their own local companies have an edge competing with companies from other countries.

Here is an interesting quote from "The Industrious Spies, Industrial Espionage in the Digital Age":

"This transition fosters international tensions even among allies. 'Countries don't have friends, they have interests!' screamed a DOE poster in the mid-1990s. France has vigorously protested U.S. spying on French economic and technological developments - until it was revealed to be doing the same."

Defending against computer-based espionage

For the purpose of defense, while I'd certainly hope for more resources (read: a larger budget) and change where I focus them, there is no inherent difference between defending your organization from computer-based espionage and protecting it against any Joe Hacker.  In espionage, the attacker simply has more resources, both technical and operational.

Some of what I would do differently

I'd concentrate more of my resources on network behavior analysis (for which, unfortunately, not many tools exist, so good network security analysts are the main alternative), as well as on social engineering training and procedures.  Further, I'd prioritize cooperation with the physical security part of the organization, and with HR (for personnel screening).  I'd also consider putting up a good deterrent as a cyber security policy, both to add to the attacker's risk and to increase their cost.

First, I'd make myself too difficult a target and let people know about it. Second, I'd invest anything I could spare in monitoring my network for anomalies and security incidents, starting with mapping what my network actually looks like. This might add to the risk factor for opponents who can't afford to be caught, and scare them off. Covertness is the name of the game, or they would have come through the front door.

Entering an "industrial espionage defense" clause into your budget, or creating a five-year plan to better protect your organization from organized industrial espionage, may just get you a larger budget to cope with your organization's security needs.

Do you have something you'd do differently from (or in addition to) regular security practices when facing espionage from organized hackers? Any experience or thoughts are welcome.
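The monitoring Evron recommends, network behavior analysis, reduces in its simplest form to establishing a baseline and flagging deviations from it. A minimal sketch of that idea, assuming a single per-hour traffic metric and invented numbers (real tools model many metrics per host and protocol):

```python
from statistics import mean, stdev

def flag_anomalies(baseline, observed, z_threshold=3.0):
    """Flag observations that deviate from the baseline mean by more
    than z_threshold standard deviations: the core of behavior-based
    monitoring, reduced to one metric (e.g. outbound bytes per hour)."""
    mu, sigma = mean(baseline), stdev(baseline)
    return [x for x in observed if abs(x - mu) > z_threshold * sigma]

# A week of hourly outbound byte counts (hypothetical)...
baseline = [1000, 1100, 950, 1050, 980, 1020, 1010]
# ...then three new observations, one of which is a large, quiet burst.
suspicious = flag_anomalies(baseline, [990, 1030, 25000])
```

A covert exfiltration channel that stays inside the baseline envelope would evade this, which is why Evron pairs the tooling with good human analysts.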


Network admins must beware of Stuxnet: A SCADA System worm

By Mark Underwood
July 20, 2010, 12:42 PM PDT

Takeaway: Learn why Mark Underwood looks at Stuxnet as a new kind of threat that network admins should not simply classify with the regular barrage of security advisories. Find out more about this worm and its target.

Sometimes with mind-numbing frequency, patches and security advisories from Microsoft, Adobe, and Apple compete for an ever-increasing amount of attention from administrators. Little wonder then, that most will have greeted with a mild yawn the latest announcement of another zero day attack — this one named the “Stuxnet Attack.” Just as I was about to file this latest message under “Priority - To Be Reviewed,” the sender’s name jarred me to attention: Managing Automation.

Managing Automation is a periodical with a healthy web presence that tends to cover topics from the supply chain, manufacturing, process control, and product lifecycle management. Over the past five years or more, the editorial focus has branched out to cover additional topics more familiar to network administrators: e.g., security event management for industrial systems, defenses against industrial espionage, etc. Despite this new coverage area, Managing Automation topics are rarely vehicles for malware notification. It was noteworthy then, to see author Chris Chiappinelli’s story begin with:

    Manufacturers worldwide have been put on notice that an insidious virus targeting supervisory control and data acquisition (SCADA) systems is on the loose.

    The targets of the malware are Siemens’ SIMATIC WinCC and PCS7 software, integral components of the distributed control and SCADA systems that facilitate production operations in many process manufacturing companies…

Those not in the manufacturing and process engineering fields may be unaware of Siemens SIMATIC and PCS7 software. How important was this emerging threat, in a field rife with worries that are sometimes alarmist and self-serving? Important. This time there is legitimate cause for concern.

Wired’s Kim Zetter wrote in a post the same day as the Managing Automation announcement that “the emergence of malware targeting a SCADA system is a new and potentially ominous development for critical infrastructure protection.” Network World’s Ms. Smith quotes F-Secure’s warning that the vulnerability poses “a risk of virus epidemic at the current moment.” Finally, it may be standard lingo for such announcements, but Microsoft’s July 16th announcement of Security Advisory 2286198 advised customers to visit Microsoft’s general support portal and to “contact the national law enforcement agency in their country.”

All of this was more than enough to get my attention.

While SCADA systems are often not regularly connected to the Internet, they are networked and are subject to the usual array of vulnerabilities. (Promotional web copy for the Siemens product that is the target of this attack explicitly mentions Ethernet switches and wireless LANs.) Public officials such as Richard Clarke have warned about risks to SCADA systems, but there have been few examples to rally the troops. While the particular vulnerability — a hard-coded password allowing access to the Siemens software's back-end database — is not especially remarkable (though it does both date the software and call into question software quality review processes at Siemens), the malware packs a punch.
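The hard-coded password is worth dwelling on, because credentials baked into shipped software are trivially recoverable. A hypothetical sketch of why (the binary contents and credential string below are invented for illustration; they are not the actual Siemens credentials):

```python
# Hypothetical: a vendor ships a binary with an embedded DB connection
# string. The bytes below are invented to stand in for such a binary.
shipped_binary = b"\x7fELF\x01\x00code\x00server=histdb;uid=admin;pwd=s3cret!\x00more"

def extract_printable_strings(blob, min_len=6):
    """Crude 'strings'-style scan: pull runs of printable ASCII out of
    a binary blob. This is exactly how a hard-coded password leaks to
    anyone who can download or copy the software."""
    out, run = [], bytearray()
    for byte in blob:
        if 32 <= byte < 127:
            run.append(byte)
        else:
            if len(run) >= min_len:
                out.append(run.decode("ascii"))
            run = bytearray()
    if len(run) >= min_len:
        out.append(run.decode("ascii"))
    return out

leaked = [s for s in extract_printable_strings(shipped_binary) if "pwd=" in s]
```

Once such a credential is published, it cannot be rotated without breaking every deployed installation, which is what makes this class of flaw so durable.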

Thought to spread mainly by USB stick, or possibly by network shares, it cannot be defeated by simply turning off Windows autorun; merely viewing an infected file system will install the malware. A security specialist at Tofino believes that this zero-day attack, which affects all versions of Windows, may have been in the wild for a month or more. Preliminary assessments indicate that the malware does not appear designed to cripple infrastructure, but rather to steal information from SIMATIC WinCC / PCS7 implementations — i.e., some form of industrial espionage. Of course, that espionage could later be used to wreak havoc on these same or similarly configured systems.

Recent press and analyst coverage has addressed both the threats to SCADA networks and the broader Windows vulnerability that the worm uses to spread (it exploits the code that interprets Windows shortcuts, i.e., .lnk files). As Microsoft noted in its analysis of the exploit, which has been named the “Stuxnet” threat, this is a new method of propagation that leverages a flaw in the way the Windows Shell “parses shortcuts.” The underlying shortcut vulnerability has been cataloged as CVE-2010-2568 in Mitre’s CVE list. For its part, Microsoft has proposed a workaround of sorts, and updated its own detection engines.
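Shortcut files can be identified by content rather than by extension, which is useful when sweeping removable media after an advisory like this one. The header constants below follow the published Shell Link binary format; the triage function itself is just an illustrative sketch, not a detector for Stuxnet specifically:

```python
# A Shell Link (.lnk) file begins with a fixed 4-byte HeaderSize (0x4C)
# followed by the 16-byte Shell Link CLSID. Checking these magic bytes,
# not just the file extension, catches shortcuts hiding under other names.
LNK_MAGIC = bytes([0x4C, 0x00, 0x00, 0x00,
                   0x01, 0x14, 0x02, 0x00, 0x00, 0x00, 0x00, 0x00,
                   0xC0, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x46])

def is_shell_link(data: bytes) -> bool:
    """True if the byte string starts with the Shell Link header."""
    return data[:20] == LNK_MAGIC

# A disguised shortcut: wrong extension on disk, real .lnk header inside.
looks_real = is_shell_link(LNK_MAGIC + b"\x00" * 56)
looks_pe = is_shell_link(b"MZ\x90\x00")  # a PE executable, not a shortcut
```

This is triage only: the actual exploit fired when the Shell rendered a malicious shortcut's icon, so content-based inventory of .lnk files on removable media is a reasonable first sweep while the vendor workaround is applied.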

There’s more
As if that wasn’t enough, the attack also involved theft of a signed Verisign digital certificate owned by Realtek Semiconductor. This certificate was used to authenticate drivers needed by Stuxnet when it self-installs, though Microsoft has since persuaded Verisign and Realtek to revoke the certificate. This was the icing on the trojan’s cake.

The Dependency Syndrome
What does all this mean? One lesson — not new, but that is borne out by this incident — is that the Internet-centric orientation of most malware models could miss certain types of threats. SCADA vulnerabilities are just that sort of threat. And while infections might not spread directly from them to general purpose networks, those general purpose networks depend upon SCADA systems for connectivity, power — and even human habitability. The “Dependency Syndrome” asserts that connections between traditional networks such as those managed every day by network administrators, and nontraditional networks such as those hosting SIMATIC WinCC / PCS7, will sooner or later be impossible to detect — and defend against.
Some reader comments from the above:

Alert Code Red
While this will not apply to the majority of IT personnel, it serves as good awareness of what is happening in other sectors, especially since the "brain" behind a SCADA system is a computer.

It might not affect us as IT jockeys per se; however, its use in controlling water treatment plants, sewerage systems, electrical power transmission and large communication systems makes it important for us to at least know something about it.

Other Systems that I see as attack vectors
Besides the SCADA system, I see problems coming on the horizon with BACnet, ZigBee, and all these SMART meters the power companies are installing. Imagine: someone could shut down your business's HVAC, power, and even other SMART devices.

You and your entire family
I work for a public water supply (PSD). We have Siemens equipment in the plant, and from one end of the system to the other. Lots of it uses the Internet as a comm link, to control chemical feed pumps, monitor water quality at remote system sites, etc. I won't go into any more detail. I'm sure you can see the potential for a large number of people. As a SCADA field tech, let me invite you to go to the kitchen, run out a glass of water, and really think about this while you drink it.
Did it taste a little different this time?

Fire is - potentially - everywhere...
Our company, which produces paper, is controlled by SCADA systems from the electrical energy supply (utility and our own generators), through the wood-processing machines (chippers) and the whole production line, to the waste and water treatment plants. With very little effort (in software) you can destroy the whole mill: exceed some parameters (pressure or something else), let it explode and rupture equipment carrying strong chemicals (for example HCl, hydrochloric acid). Carried by wind and/or water, that would kill the local population...

In case of emergency, ALL personnel, including contractors, have gas masks. Mine is in the drawer below the computer here. It is only for escape. Many windsocks around indicate the direction to choose. In the town, people don't have any of that. They don't need computers to be affected. The OS is also irrelevant...

Everyday IT
It's not about the system or sector that Stuxnet is attacking. It's about the concept that an undetectable piece of malware is attacking a network or system that probably no one worried about. I doubt there is a Norton A/V you can run on this system. So think about your own network. What non-Microsoft, non-mainstream systems do you deploy? How about that new car you bought with built-in Bluetooth technology? Your kid downloads a file on their iPod and links it to the car stereo. Then when the file is accessed, code is sent to the car's computer via the link between them, ostensibly for speed-based volume control. By the way, that code was to disable the brakes and increase the throttle. Is this likely? No, but it is possible. We forget that although some devices may not be directly connected to the Internet, they are connected to a network, or become connected at some point. The bad guys understand this and are finding ways to infect these systems that we thought were "secure". That is Everyday IT, as you put it.

Some articles about SCADA attacks

Note the 3K explosion, "the most monumental non-nuclear explosion"; below is a detailed link for it: ,339024596,320283135,00.htm

I am an automation engineer. I never consider being a web developer or a DB admin because I like controlling hardware to see something physically happens by your code.

Several years ago, some robots in our plant started working slowly. We found it was caused by a worm spreading over the company's network and consuming huge bandwidth. It took several hours to fix.

Probably, the hackers of the future will be involved in SCADA attacks more than in deleting data or defacing web pages.

Someone could get injured or even die in the case of a SCADA attack, so I believe that securing SCADA is more critical than securing ordinary IT systems.

RE: Network admins must beware of Stuxnet: A SCADA System worm
Interesting article. A few days ago I was shocked to find out that a lot of power plants in my country still keep their 1970s dinosaurs in working order. I understand them now. A properly set-up obsolete mainframe is way better than running a plant by hand if SCADA viruses start spreading and doing real damage.

We have industrial espionage now. How far away are viruses that will actually attempt to sabotage industrial complexes, especially at the most critical moments?

Preparing for cyber war:  Bernd Debusmann

Wed Mar 19, 2008 11:07am EDT

(Bernd Debusmann is a Reuters columnist. The opinions expressed are his own)

By Bernd Debusmann

WASHINGTON (Reuters) - At the height of the Cold War, a Soviet oil pipeline blew up in an explosion so huge that the American military suspected a nuclear blast. A quarter of a century later, the incident serves as an object lesson in successful cyber warfare.

The pipeline blew up, with disastrous consequences for the Soviet economy, because its pumps, valves and turbines were run by software deliberately designed to malfunction. Made in the U.S. and doctored by the CIA, it passed into Soviet hands in an elaborate game of deception that left them unaware they had acquired "bugged" software.

"The pipeline software...was programmed to go haywire, after a decent interval, to reset pump speeds and valve settings to produce pressures far beyond those acceptable to pipeline joints and welds. The result was the most monumental non-nuclear explosion ever seen from space," Thomas C. Reed, a former air force secretary, wrote in his 2004 memoir.

The pipeline explosion was probably the first major salvo in what has since become known as cyber warfare. The incident has been cropping up in increasingly urgent discussions in the U.S. on how to cope with attacks on military and civilian computer networks and control systems - and how and when to strike back.

Air traffic control, power plants, Wall Street trading systems, banks, traffic lights and emergency responder communications could all be targets of attacks that could bring the U.S. to its knees. As Michael McConnell, the Director of National Intelligence, put it in recent testimony to a Senate committee:

"Our information infrastructure - including the Internet, telecommunications networks, computer systems and embedded processors and controllers in critical industries - increasingly is being targeted by a growing array of state and non-state adversaries." Cyber attacks, he said, had grown more sophisticated and more serious.

The Pentagon says it detects three million attempts to infiltrate its computer networks every day. There are no estimates of how many probes are successful, but last year the Pentagon had to take 1,500 computers offline because of a concerted attack by unknown hackers.


How tight are the U.S. government's defenses? Not very, according to the Government Accountability Office, the audit and investigative arm of the U.S. Congress. In a report last week, it said an audit of 24 government agencies - including Defense and Homeland Security - had shown that "poor information security is a widespread problem with potentially devastating consequences."

Striking back at cyber attackers poses a raft of tricky questions, chiefly because cyber war cannot be waged without involving civilians. Private companies own more than 80 percent of the infrastructure McConnell talked about and without close public-private coordination, effective counter-strikes are next to impossible.

"Unlike traditional defense categories (i.e. land, sea and air), the military capabilities required to respond to an attack on U.S. infrastructure will necessarily involve infrastructure owned and operated by the private sector," according to Jody R. Westby, CEO of the Washington consulting firm Global Cyber Risk and a champion of better public-private coordination to cope with cyber attacks.

Coordination between the military and civilians has yet to be tested. The military stayed away from an exercise this month that brought together experts from the U.S., Canada, Britain, New Zealand and Australia, 18 U.S. federal agencies and around 40 companies, including Microsoft and Cisco Systems. The game featured mock attacks against computer networks, pipelines and railroads.

(The exercise was described as the biggest of its kind. But "big" is relative. To get the scale into perspective: There are 233 countries connected to the Internet today, with an estimated 1.2 billion users. More than 120 countries are estimated to be developing cyber warfare capabilities).

As things stand, could the U.S. or its allies become the victim of an attack similar to the Soviet pipeline blast? Probably yes. The threat comes from China, which has been placing heavy emphasis on what it calls "informationized war," and from a motley array of hackers and terrorists.

Among the most potent weapons in their arsenal: "bots," malicious software robots that are the digital equivalent of terrorist sleeper cells, lying dormant for months or years before springing into destructive action. In testimony to Congress, Homeland Security's top scientist on cyber security, W. Douglas Maughan, said that there is currently no effective antidote to bots.
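One practical tell of a dormant bot is its "phone home" rhythm: machine-generated check-ins tend to be far more regular than human traffic. A minimal beaconing heuristic, with invented timestamps and an arbitrary jitter threshold (real detection correlates many more signals):

```python
from statistics import mean, pstdev

def looks_like_beaconing(timestamps, max_jitter_ratio=0.05):
    """Heuristic: a bot phoning home tends to connect to its controller
    at fixed intervals. If the inter-connection intervals are nearly
    constant (low jitter relative to the mean interval), flag the host."""
    intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
    if len(intervals) < 3:
        return False  # too few connections to judge
    return pstdev(intervals) / mean(intervals) < max_jitter_ratio

bot_times = [0, 300, 600, 900, 1200]        # every 5 minutes, like clockwork
human_times = [0, 40, 700, 705, 2000, 2100] # bursty, irregular browsing
```

A well-written bot can randomize its check-in schedule to defeat exactly this check, which is part of why, as Maughan testified, there is no clean antidote.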


How much damage could they do? Here is a scenario drawn from an interview with Westby, who is a member of the World Federation of Scientists' Permanent Monitoring Panel on Information Security. Her outline is based on the assumption that China has already implanted bots in millions of public and private computer systems.

"Bot herders" around the world unleash their malicious software bots to attack U.S. government, financial, oil and gas systems. One early victim: the U.S. Department of Commerce, which loses all communications because its internet and telephone communications use Voice over Internet Protocol networks. That means if the Internet goes down, all communications go down.

As Commerce is cut off, the U.S. collection point for inter-bank financial transactions discovers that bogus data are being inserted from both the sending and confirming side of the SWIFT (Society for Worldwide Interbank Financial Telecommunication) system. Chaos ensues in financial markets.
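The scenario's bogus data, inserted from both the sending and confirming side, is precisely what per-message authentication is designed to catch. A minimal sketch using HMAC-SHA256 with a pre-shared key (the key and messages are invented; real interbank systems use far more elaborate key management and message formats):

```python
import hashlib
import hmac

SHARED_KEY = b"demo-key-not-for-production"  # hypothetical pre-shared key

def sign(message: bytes) -> bytes:
    """Compute an HMAC-SHA256 tag so the receiver can detect bogus or
    altered messages inserted into the channel by a third party."""
    return hmac.new(SHARED_KEY, message, hashlib.sha256).digest()

def verify(message: bytes, tag: bytes) -> bool:
    """Constant-time comparison to avoid leaking tag bytes via timing."""
    return hmac.compare_digest(sign(message), tag)

genuine = b"PAY 1000 FROM A TO B"
tag = sign(genuine)
forged = b"PAY 9999 FROM A TO C"  # an attacker's insertion, with no valid tag
```

An attacker without the key cannot produce a valid tag, so inserted messages are rejected rather than silently processed; the remaining problem, as with all shared-secret schemes, is keeping the key out of the attacker's hands.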

The New York Stock Exchange shuts down after massive "denial of service" attacks similar to those that last year forced Estonia to close down websites run by government ministries, banks and telecommunications companies.

At the same time, systems controlling the valves of oil and gas pipelines come under attack as bogus instructions override system controls and false data is sent to control room screens. The pipelines are shut. Some explode. There are casualties.

The government decides it must block the malicious traffic and come to the assistance of the financial, gas and oil companies under cyber attack. This involves deploying classified solutions and counter attacks through the networks of various U.S. communication providers.

The problem: There is no agreement between the Pentagon and the private sector on transferring private networks to military control. Owners are reluctant to turn over their systems to the military for fear their networks and their reputation might be damaged as a result of cyber war actions not under their control. The problem could be solved by the government declaring martial law, a step it is hesitant to take.

And what about the foreign-owned networks that would have to be used to launch an effective counter attack? Does the U.S. have to ask permission before sending cyber war actions across foreign networks? Would NATO have to be involved? (The 50-year-old treaty does not cover cyber warfare). Should the U.N. charter be amended to apply to cyber war rather than only "armed attacks?"

These are all questions that require urgent answers if the U.S., more dependent on computers and the Internet than most countries, wants to protect what a writer in the latest issue of the Armed Forces Journal aptly describes as "America's digital Achilles' heel."


U.N. warns of nuclear cyber attack risk

Kevin Poulsen, SecurityFocus 2004-09-27

The United Nations' nuclear watchdog agency warned Friday of growing concern about cyber attacks against nuclear facilities.

The International Atomic Energy Agency (IAEA) announced in a statement that it was developing new guidelines aimed at combating the danger of computerized attacks by outside intruders or corrupt insiders. "For example, software operated control systems in a nuclear facility could be hacked or the software corrupted by staff with insider access," the group said.

The IAEA's new guidelines on "Security of Information Technology Related Equipment and Software Based Controls Against Malevolent Acts" are being finalized now, said the agency. The announcement came out of the agency's 48th annual general conference attended by 137 nations.

Last year the Slammer worm penetrated a private computer network at Ohio's idled Davis-Besse nuclear plant and disabled a safety monitoring system for nearly five hours. The worm entered the plant network through an interconnected contractor's network, bypassing Davis-Besse's firewall.

News of the Davis-Besse incident prompted Rep. Edward Markey (D-MA) last fall to call for U.S. regulators to establish cyber security requirements for the 103 nuclear reactors operating in the U.S., specifically requiring firewalls and up-to-date patching of security vulnerabilities. By that time the U.S. Nuclear Regulatory Commission (NRC) had already begun working on an official manual to guide plant operators in evaluating their cybersecurity posture.

But that document, finalized this month, "is not directive in nature," says Jim Davis, director of operations at the Nuclear Energy Institute, an industry association. "It does not establish a minimum level of security or anything like that. That isn't the purpose of the manual."

A related industry effort will establish management-level cyber security guidelines for plant operators, says Davis, who believes industry efforts are sufficient. "I think we are taking it seriously... and I think if the industry doesn't go far enough in this area we'll see more attention from regulators."

Neither the NRC manual nor the industry guidelines will be made public.

Separately, the NRC is working on a substantial revision of its regulatory guide, "Criteria for Use of Computers in Safety Systems of Nuclear Power Plants," which sets security and reliability criteria for installing new computerized safety systems in plants. It would replace the current guide, written in 1996, which is three pages long.

A working draft of the NRC guide reviewed by SecurityFocus would encourage plant operators to consider the effect of each new safety system on the plant's cyber security, and to develop response plans to deal with computer incidents. Additionally, it would urge vendors to maintain a secure development environment, and to probe their products for backdoors and logic bombs before shipping.
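Probing a product for backdoors and logic bombs before shipping starts with crude static checks that flag suspicious code for human review. A toy sketch of that idea (the patterns and sample code below are invented and far from exhaustive; real vendor review combines static analysis, code audit, and testing):

```python
import re

# Crude patterns that merit human review: date-gated behavior and
# embedded credential checks are common tells of logic bombs and
# backdoors. (Patterns invented for illustration only.)
SUSPICIOUS = [
    re.compile(r"if\s+.*date\(\)\s*[><=]"),      # time-triggered branch
    re.compile(r"(password|pwd)\s*==\s*[\"']"),  # hard-coded credential check
]

def review_flags(source_lines):
    """Return (line_number, line) pairs worth a human look."""
    return [(i, line) for i, line in enumerate(source_lines, 1)
            if any(p.search(line) for p in SUSPICIOUS)]

code = [
    "limit = compute_limit(sensor)",
    "if current_date() > ship_date + 90: open_all_valves()",
    'if password == "letmein": grant_admin()',
]
flags = review_flags(code)
```

Such pattern matching only surfaces candidates; a deliberately hidden logic bomb, like the one in the pipeline story that opened this thread, is written to look indistinguishable from normal control logic, which is why the NRC draft pushes review back onto the vendor's development environment itself.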

Offline Dig

  • All eyes are opened, or opening, to the rights of man.
  • Member
  • *****
  • Posts: 63,090
    • Git Ureself Edumacated
This is the missing link information which demands that Fukushima be investigated for STUXNET or other remote control operations.

This is the missing link information which exposes most of the compromised software used for state sponsored cyber terror against the American people.
All eyes are opened, or opening, to the rights of man. The general spread of the light of science has already laid open to every view the palpable truth, that the mass of mankind has not been born with saddles on their backs, nor a favored few booted and spurred, ready to ride them legitimately