Posts Tagged ‘Oak Ridge National Laboratory’

Friday, February 21, 2014 @ 09:02 AM gHale

A new suite of computer codes that closely models the behavior of neutrons in a reactor core, a discipline called neutronics, could provide a more accurate way to analyze nuclear power reactors.

Technical staff at Westinghouse Electric Company, LLC, supported by the research team at the Consortium for Advanced Simulation of Light Water Reactors (CASL), used the Virtual Environment for Reactor Applications core simulator (VERA-CS) to analyze its AP1000 advanced pressurized water reactor (PWR). The testing focused on modeling the startup conditions of the AP1000 plant design.

“In our experience with VERA-CS, we have been impressed by its accuracy in reproducing past reactor startup measurements. These results give us confidence that VERA-CS can be used to anticipate the conditions that will occur during the AP1000 reactor startup operations,” said Bob Oelrich, manager of PWR Core Methods at Westinghouse. “This new modeling capability will allow designers to obtain higher-fidelity power distribution predictions in a reactor core and ultimately further improve reactor performance.”

The AP1000 reactor is an advanced design with enhanced passive safety features that builds on decades of Westinghouse’s experience with PWRs. The first eight units are currently under construction in China and the United States, and the AP1000 is the first Generation III+ reactor to receive Design Certification from the U.S. Nuclear Regulatory Commission.

CASL is a U.S. Department of Energy (DoE) Innovation Hub established at Oak Ridge National Laboratory (ORNL), a part of DoE’s National Laboratory System. The consortium core partners are a strategic alliance of leaders in nuclear science and engineering from government, industry and academia.

“At CASL, we set out to improve reactor performance with predictive, science-based, simulation technology that harnesses world-class computational power,” said CASL Director Doug Kothe. “Our challenge is to advance research that will allow power uprates and increase fuel burn-up for U.S. nuclear plants. In order to do this, CASL is meeting the need for higher-fidelity, integrated tools.”

During the first generation of nuclear energy, performance and safety margins ended up at conservative levels as industry and researchers gained experience with the operation and maintenance of what was then a new and complex technology. Over the past 50 years, nuclear scientists and engineers gained a deeper understanding of the reactor processes, further characterizing nuclear reactor fuel and structure materials.

By making use of newly available computing resources, CASL’s research aims for a step change beyond the incremental improvements in reactor operations achieved over the last several decades.

“CASL has been using modern high-performance computing platforms such as ORNL’s Titan, working in concert with the INL Fission computer system, for modeling and simulation at significantly increased levels of detail,” said CASL Chief Computational Scientist John Turner. “However, we also recognized the need to deliver a product that is suitable for industry-sized computing platforms.”

With that understanding, CASL designed the Test Stand project to try out tools such as VERA-CS in industrial applications. CASL partner Westinghouse was selected to host the first trial run of the new simulator. Westinghouse chose a real-world application for VERA-CS: the reactor physics analysis of the AP1000 PWR, whose core design includes several advanced features. Studying the AP1000 with VERA-CS provides information to further improve the characterization of advanced cores compared with traditional modeling approaches.

Westinghouse’s test run on VERA-CS focused on modeling one aspect of reactor physics called “neutronics,” which describes the behavior of neutrons in a reactor core. While neutronics is only one of VERA’s capabilities, the results provided by VERA-CS for the AP1000 PWR enhance Westinghouse’s confidence in their startup predictions and expand the validation of VERA by incorporating the latest trends in PWR core design and operational features.
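VERA-CS itself is a proprietary, large-scale code suite, but the class of calculation a neutronics solver performs can be illustrated in miniature. The sketch below, a minimal illustration rather than anything resembling VERA-CS, solves a one-group, one-dimensional neutron diffusion criticality problem by power iteration, the textbook formulation of a core eigenvalue solve; all cross sections and dimensions are illustrative placeholders, not AP1000 data.

```python
import numpy as np

# Toy neutronics eigenvalue solve: one-group, 1-D neutron diffusion,
#   -D*phi'' + Sigma_a*phi = (1/k) * nu*Sigma_f * phi,
# solved by power iteration for k-effective and the flux shape.
# All numbers below are illustrative placeholders, not AP1000 data.

D, sigma_a, nu_sigma_f = 1.0, 0.02, 0.022   # cm, 1/cm, 1/cm (toy values)
width, n = 200.0, 400                       # slab width (cm), mesh cells
h = width / n

# Tridiagonal operator for diffusion + absorption, zero-flux boundaries.
A = (np.diag(np.full(n, 2 * D / h**2 + sigma_a))
     + np.diag(np.full(n - 1, -D / h**2), 1)
     + np.diag(np.full(n - 1, -D / h**2), -1))

phi, k = np.ones(n), 1.0
for _ in range(500):
    phi_new = np.linalg.solve(A, nu_sigma_f * phi / k)  # fixed-source solve
    k_new = k * phi_new.sum() / phi.sum()               # eigenvalue update
    done = abs(k_new - k) < 1e-8
    phi, k = phi_new / phi_new.max(), k_new             # renormalize flux
    if done:
        break

print(f"k-effective ~ {k:.5f}")   # ~1.087 for these toy numbers
```

A full core simulator replaces this toy diffusion model with detailed 3-D neutron transport coupled to other physics, which is why the high-performance computing platforms mentioned above are needed.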

The CASL team now is working on extending the suite of simulation capabilities to the entire range of operating conditions for commercial reactors, including full-power operation with fuel depletion and fuel cycle reload.

Wednesday, November 20, 2013 @ 04:11 PM gHale

Identifying the fundamental forces that change plant structures during the pretreatment processes used in bioenergy production could point to more effective ways to convert woody plant matter into biofuels.

Pretreatment subjects plant material to extremely high temperature and pressure to break apart the protective gel of lignin and hemicellulose that surrounds sugary cellulose fibers.

“While pretreatments are used to make biomass more convertible, no pretreatment is perfect or complete,” said Brian Davison of the Department of Energy’s Oak Ridge National Laboratory (ORNL), a coauthor of the research study and paper on the subject.

“Whereas the pretreatment can improve biomass digestion, it can also make a portion of the biomass more difficult to convert,” he said. “Our research provides insight into the mechanisms behind this ‘two steps forward, one step back’ process.” Also, pretreatment is the most expensive stage of biofuel production.

The team’s integration of experimental techniques including neutron scattering and X-ray analysis with supercomputer simulations revealed unexpected findings about what happens to water molecules trapped between cellulose fibers.

“As the biomass heats up, the bundle of fibers actually dehydrates — the water that’s in between the fibers gets pushed out,” said ORNL’s Paul Langan. “This is very counterintuitive because you are boiling something in water but simultaneously dehydrating it. It’s a really simple result, but it’s something no one expected.”

This process of dehydration causes the cellulose fibers to move closer together and become more crystalline, which makes them harder to break down.

In a second part of the study, the researchers analyzed lignin and hemicellulose, the two polymers that bond to form a tangled mesh around the cellulose bundles. According to the team’s experimental observations and simulations, the two polymers separate into different phases when heated during pretreatment.

“Lignin is hydrophobic so it repels water, and hemicellulose is hydrophilic, meaning it likes water,” Langan said. “Whenever you have a mixture of two polymers in water, one of which is hydrophilic and one hydrophobic, and you heat it up, they separate out into different phases.”

Understanding the role of these underlying physical factors — dehydration and phase separation — could enable scientists to engineer improved plants and pretreatment processes and ultimately bring down the costs of biofuel production.

“Our insight is that we have to find a balance which avoids cellulose dehydration but allows phase separation,” Langan said. “We know now what we have to achieve — we don’t yet know how that could be done, but we’ve provided clear and specific information to help us get there.”

Wednesday, October 30, 2013 @ 07:10 PM gHale

There could soon be new types of nuclear fuel pellets that would be safer in the event of a nuclear disaster.

New materials could end up encasing uranium-bearing fuel as an alternative to zirconium alloys, which have seen use as the outer layer of nuclear fuel pellets for the last 50 years, said a team of scientists from the University of Tennessee (UT) and Oak Ridge National Laboratory (ORNL), who will present their work at the AVS 60th International Symposium and Exhibition in Long Beach, CA.

Using sophisticated computer analyses, the UT and ORNL team identified several candidate materials that resist high-temperature oxidation and failure, and showed how they would slow the evolution of a reactor core during an accident, buying more time to cope in the event of a nuclear disaster.

“At this stage there are several very intriguing options that are being explored,” said Steven J. Zinkle, the Governor’s Chair in the Department of Nuclear Engineering at the University of Tennessee and Oak Ridge National Laboratory. There is evidence that some of the new materials would reduce the oxidation by at least two orders of magnitude.

“That would be a game-changer,” he said. The materials examined include advanced steels, coated molybdenum and nuclear-grade silicon carbide composites (SiC fibers embedded in a SiC matrix).

The next step, he added, involves building actual fuel pins from these laboratory-tested materials and exposing them to irradiation inside a fission reactor. Once they perform as desired, the new fuel concepts would likely be tested in a limited capacity in commercial reactors before any larger deployment.

Though it would take years before any new fuel concepts end up used commercially, given the rigorous and conservative qualification steps required, Zinkle said, these new materials may eventually replace the existing zirconium alloy cladding if they prove to be safer.

The typical core of a nuclear power plant uses the heat generated by fission of uranium and plutonium in fuel rods to heat and pressurize water. The resulting steam drives turbines to produce electricity. Water continuously circulates as a coolant to harness the thermal energy from the fuel and to keep the core from overheating.

The cooling pumps are a critical part of the reactor design because even when a nuclear reactor shuts down, the power generated by radioactive decay of fission products remains at about 1 percent of its peak for hours after shutdown. Given that a nuclear power plant generates a staggering amount of power under nominal operating conditions (roughly 4 GW of thermal energy), even 1 percent after shutdown is substantial. That’s why it’s essential to keep cool water circulating continuously even after the shutdown occurs. Otherwise you risk overheating and ultimately melting the core, like leaving a pot boiling on the burner.
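A back-of-the-envelope calculation shows the scale of the problem; the sketch below takes the roughly 4 GW thermal figure from the article, and the boil-off estimate at the end is an illustration using standard water properties, not a figure from the article.

```python
# Rough check of the decay-heat numbers quoted above: 1 percent of a
# ~4 GW(thermal) core is still 40 MW that must be carried away.
P_thermal_W = 4e9           # nominal thermal power cited in the article
decay_fraction = 0.01       # ~1 percent of peak, hours after shutdown

decay_heat_W = P_thermal_W * decay_fraction
print(f"Decay heat: {decay_heat_W / 1e6:.0f} MW")      # -> 40 MW

# For scale (an illustration, not from the article): how much water
# that power could boil off, heating from 25 C and then vaporizing.
c_p, L_vap = 4186.0, 2.26e6            # J/(kg*K), J/kg for water
E_per_kg = c_p * 75 + L_vap            # heat 25 C -> 100 C, then vaporize
print(f"Boil-off: ~{decay_heat_W / E_per_kg:.0f} kg of water per second")
```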

That’s basically what happened at Fukushima. On March 11, 2011, engineers at the plant initially managed to shut down the plant safely following a massive earthquake, but a large tsunami knocked out the backup generators running the water pumps an hour later. What followed were explosions of hydrogen, generated by the reduction of steam during high-temperature oxidation of core materials, and releases of radioactive fission products. The accident displaced the local population and will take years and significant cost to clean up.

Fukushima has had a profound impact on the safety culture of the industry, said Zinkle. Although not a single U.S. nuclear power plant was found unsafe or shut down in the wake of the accident, the Nuclear Regulatory Commission has issued new requirements to enhance plant safety, including stronger requirements for backup power generation on site.

Monday, August 27, 2012 @ 05:08 PM gHale

Knowing the position of missing oxygen atoms could be the key to cheaper solid oxide fuel cells with longer lifetimes.

New microscopy research could now enable scientists to map these vacancies at the atomic scale.

Although fuel cells hold promise as an efficient energy conversion technology, they have yet to reach mainstream markets because of their high price tag and limited lifespans.

Overcoming these barriers requires a fundamental understanding of fuel cells, which produce electricity through a chemical reaction between oxygen and a fuel. As conducting oxygen ions move through the fuel cell, they travel through vacancies where oxygen atoms used to be. The distribution, arrangement and geometry of such oxygen vacancies in fuel cell materials should affect the efficiency of the overall device, researchers said.

“A big part of making a better fuel cell is to understand what the oxygen vacancies do inside the material: How fast they move, how they order, how they interact with interfaces and defects,” said Department of Energy’s Oak Ridge National Laboratory’s (ORNL) Albina Borisevich. “The question is how to study them. It’s one thing to see an atom of one type on the background of atoms of a different type. But in this case, you want to see if there are a few atoms missing. Seeing a void is much more difficult.”

ORNL scientists used scanning transmission electron microscopy to determine the distribution of oxygen vacancies in a fuel cell cathode material below the level of a single unit cell. The team verified its findings with theoretical calculations and neutron experiments at the lab’s Spallation Neutron Source.

“Even though the vacancy doesn’t generate any signal in the electron micrograph, it’s still a big disturbance in the structure,” Borisevich said. “You can see that the lattice expands where vacancies are present. So we tracked the lattice expansion around vacancies and compared it with theoretical models, and we were able to develop a calibration for this type of material.”
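The calibration itself is material-specific and not spelled out in the article, but the underlying idea can be sketched: assume a roughly linear (Vegard-like) relation between vacancy content and local lattice parameter, fit that relation on reference measurements, then invert it to read vacancy concentrations off new lattice measurements. The numbers below are synthetic stand-ins.

```python
import numpy as np

# Sketch of a lattice-expansion calibration for oxygen vacancies.
# Assumes a linear (Vegard-like) relation between vacancy fraction and
# lattice parameter -- the real ORNL calibration is material-specific.
# All data below are synthetic stand-ins.

delta = np.array([0.00, 0.05, 0.10, 0.15, 0.20])        # vacancy fraction
a_meas = np.array([3.905, 3.912, 3.918, 3.926, 3.931])  # lattice param (A)

slope, a0 = np.polyfit(delta, a_meas, 1)   # fit a(delta) = a0 + slope*delta
print(f"calibration: a(delta) ~ {a0:.3f} + {slope:.3f}*delta")

# Invert: map a local lattice measurement back to a vacancy estimate.
a_local = 3.921                            # e.g. from a STEM image region
delta_est = (a_local - a0) / slope
print(f"estimated local vacancy fraction: {delta_est:.3f}")
```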

By providing a means to study vacancies at an atomic scale, the ORNL technique will help inform the development of improved fuel cell technologies in a systematic and deliberate fashion, in contrast to trial and error approaches.

Beyond its relevance to applications in fuel cells and information storage and logic devices, ORNL coauthor Sergei Kalinin said the team’s research is also building a bridge between two scientific communities that traditionally have had little in common.

“From my perspective, it is physics marrying electrochemistry,” Kalinin said. “The idea is that vacancies are important for energy, and vacancies are important for physics. The materials that physicists like to study are exactly the same as the materials used for fuel cells, and unless we understand how vacancies behave at interfaces, ferroic domain walls, and in thin films, we will not be able to fully appreciate the physics of these systems.”

Thursday, August 23, 2012 @ 07:08 PM gHale

A new material in development may make fueling nuclear reactors with uranium harvested from the ocean more feasible.

Combining high-capacity reusable adsorbents from the Department of Energy’s Oak Ridge National Laboratory (ORNL) with a Florida company’s high-surface-area polyethylene fibers creates a material that can rapidly, selectively and economically extract valuable dissolved metals from water.

The material, HiCap, outperforms today’s best adsorbents, materials that retain solid or gas molecules, atoms or ions on their surfaces. HiCap also effectively removes toxic metals from water, according to results verified by researchers at Pacific Northwest National Laboratory.

“We have shown that our adsorbents can extract five to seven times more uranium at uptake rates seven times faster than the world’s best adsorbents,” said Chris Janke, one of the inventors and a member of ORNL’s Materials Science and Technology Division.

HiCap effectively narrows the fiscal gap between what exists today and what is needed to economically extract some of the ocean’s estimated 4.5 billion tons of uranium. Although dissolved uranium exists at a concentration of just 3.2 parts per billion, the sheer volume of the oceans means there would be enough to fuel the world’s nuclear reactors for centuries.
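Those two figures are mutually consistent, as a quick estimate shows; the ocean-mass value below is a standard reference number, not from the article.

```python
# Consistency check: 3.2 parts per billion of uranium by mass, applied
# to the total mass of the oceans (~1.4e21 kg, a standard figure).
ocean_mass_kg = 1.4e21
u_fraction = 3.2e-9                      # 3.2 ppb by mass

u_total_tonnes = ocean_mass_kg * u_fraction / 1e3
print(f"Dissolved uranium: ~{u_total_tonnes / 1e9:.1f} billion tonnes")
# -> ~4.5 billion tonnes, matching the estimate quoted above
```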

The goal of extracting uranium from the oceans began with research and development projects in the 1960s, with Japan conducting the majority of the work. Other countries pursuing this goal include Russia, China, Germany, Great Britain, India, South Korea, Turkey and the United States. Researchers developed adsorbent materials, but none has emerged as being economically viable.

What sets the ORNL material apart is that its adsorbents are made from small-diameter round or non-round fibers with high surface areas and excellent mechanical properties. By tailoring the diameter and shape of the fibers, researchers can significantly increase surface area and adsorption capacity. This, together with ORNL’s patent-pending technology for manufacturing the adsorbent fibers, results in a material able to selectively recover metals more quickly and with increased adsorption capacity, dramatically increasing efficiency.

“Our HiCap adsorbents are made by subjecting high-surface area polyethylene fibers to ionizing radiation, then reacting these pre-irradiated fibers with chemical compounds that have a high affinity for selected metals,” Janke said.

After the processing, scientists can place HiCap adsorbents in water containing the targeted material, which ends up quickly and preferentially trapped. Scientists then remove the adsorbents from the water and the metals end up extracted using a simple acid elution method. They can then regenerate and reuse the adsorbent after conditioning it with potassium hydroxide.

In a direct comparison to the current state-of-the-art adsorbent, HiCap provides significantly higher uranium adsorption capacity, faster uptake and higher selectivity, according to test results. Specifically, HiCap’s adsorption capacity is seven times higher (146 vs. 22 grams of uranium per kilogram of adsorbent) in spiked solutions containing 6 parts per million of uranium at 20 degrees Celsius. In seawater, HiCap’s adsorption capacity of 3.94 grams of uranium per kilogram of adsorbent was more than five times higher than the world’s best at 0.74 grams of uranium per kilogram of adsorbent. The numbers for selectivity showed HiCap to be seven times higher.
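Reduced to ratios, the comparisons quoted above check out; the values below come straight from that paragraph.

```python
# Capacity comparisons from the test results (grams U per kg adsorbent).
spiked = (146.0, 22.0)    # HiCap vs. prior best, 6 ppm U solution at 20 C
seawater = (3.94, 0.74)   # HiCap vs. prior best, natural seawater

print(f"Spiked solution ratio: {spiked[0] / spiked[1]:.1f}x")   # ~6.6x
print(f"Seawater ratio: {seawater[0] / seawater[1]:.1f}x")      # ~5.3x
```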

“These results clearly demonstrate that higher surface area fibers translate to higher capacity,” Janke said.

ORNL researchers conducted field tests of the material at the Marine Sciences Laboratory of Pacific Northwest National Laboratory in Sequim, WA, and at the Rosenstiel School of Marine & Atmospheric Science and Broad Key Island in collaboration with the University of Miami.

Tuesday, May 15, 2012 @ 03:05 PM gHale

There is now a new way to accurately measure gas bubbles in pipelines.

In the end, it comes down to safety: The ability to measure gas bubbles in pipelines is vital to the manufacturing, power and petrochemical industries.

In the case of harvesting petrochemicals from the seabed, warning of bubbles in the harvested crude is crucial. As bubbles rise from the seabed (where pressure is very high) to the surface where the rig is, the reduction in pressure makes them expand, which can cause a “blowout,” the sudden release of oil and/or gas from a well. Issues with the blowout preventer were key in the Deepwater Horizon oil spill in the Gulf of Mexico in 2010.

Currently, the most popular technique for estimating the gas bubble size distribution (BSD) is to send sound waves through the bubble liquid and compare the measured attenuation of the sound wave (loss in amplitude as it propagates) with that predicted by theory.

The key problem is the theory assumes the bubbles exist in an infinite body of liquid. If in fact the bubbles are in a pipe, then the assumptions of the theory do not match the conditions of the experiment. That could lead to errors in the estimation of the bubble population.

There is now a new method, which takes into account that bubbles exist in a pipe, said Professor Tim Leighton from the Institute of Sound and Vibration Research at the University of Southampton, who is leading a project team.

Leighton and his team started the work as part of an ongoing program to devise ways of accurately estimating the BSD for the mercury-filled steel pipelines of the target test facility (TTF) of the $1.4 billion Spallation Neutron Source (SNS) at Oak Ridge National Laboratory (ORNL), one of the most powerful pulsed neutron sources in the world.

The research explores how measured phase speeds and attenuations in a bubbly liquid in a pipe can be inverted to estimate the BSD (which the team independently measured using an optical technique). This new technique, appropriate for pipelines such as TTF, gives good BSD estimates provided the frequency range is sufficiently broad.
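The team’s in-pipe theory is beyond a short example, but the general structure of an acoustic BSD inversion can be sketched: discretize bubble radii into bins, model the attenuation at each probe frequency as a linear combination of the bins’ contributions, and solve for the bin populations under a non-negativity constraint. The kernel below is a toy Lorentzian centered on each bubble size’s Minnaert resonance, a stand-in for the real free-field or in-pipe attenuation theory, and the “measurement” is synthetic.

```python
import numpy as np
from scipy.optimize import nnls

# Sketch of an acoustic bubble-size-distribution (BSD) inversion.
# Attenuation at each frequency is modeled as a linear combination of
# contributions from bubbles in discrete size bins; the bin populations
# are recovered by non-negative least squares. The kernel is a toy
# Lorentzian around each bubble's Minnaert resonance, standing in for
# the real free-field or in-pipe theory.

radii = np.linspace(0.1e-3, 2.0e-3, 20)    # bubble radius bins (m)
freqs = np.linspace(1e3, 40e3, 60)         # probe frequencies (Hz)

def kernel(f, R, rel_width=0.15):
    f0 = 3.26 / R                          # Minnaert resonance, air in water
    x = (f - f0) / (rel_width * f0)
    return (np.pi * R**2) / (1.0 + x**2)   # toy extinction cross-section

A = np.array([[kernel(f, R) for R in radii] for f in freqs])

# Synthetic "measurement": a Gaussian bubble population plus faint noise.
rng = np.random.default_rng(0)
n_true = np.exp(-((radii - 0.8e-3) / 0.3e-3) ** 2)
alpha_meas = A @ n_true + 1e-9 * rng.standard_normal(len(freqs))

n_est, _ = nnls(A, alpha_meas)             # invert for the BSD
print(f"estimated BSD peaks near R = {radii[np.argmax(n_est)]:.1e} m")
```

As the article notes, such an inversion is only well posed when the probe frequencies span the resonances of the bubble sizes present, which is why a sufficiently broad frequency range matters.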

“The SNS facility was built with the expectation that every so often it would need to be shut down and the now highly radioactive container of the mercury replaced by a new one, because its steel embrittles from radiation damage,” Leighton said. “However, because the proton beam impacts the mercury and generates shock waves, which cause cavitation bubbles to collapse in the mercury and erode the steel, the replacement may need to be more often than originally planned at full operating power. Indeed, achieving full design power is in jeopardy.

“With downtime associated with unplanned container replacement worth around $12 million, engineers at the facility are considering introducing helium bubbles, of the correct size and number, into the mercury to help absorb the shock waves before they hit the wall, so that the cavitation bubbles do not erode the steel. Oak Ridge National Laboratory (ORNL) and the Science and Technology Facilities Council (Rutherford Appleton Laboratory, RAL) commissioned us as part of their program to devise instruments to check that their bubble generators can deliver the correct number and size of bubbles to the location where they will protect the pipelines from erosion.

“This paper reports on the method we devised half-way through the research contract. It works, but just after we designed it the 2008 global financial crash occurred, and funds were no longer available to build the device into the mercury pipelines of ORNL. A more affordable solution had to be found, which is what we are now working on. The original design has been put on hold for when the world is in a healthier financial state. This has been a fantastic opportunity to work with nuclear scientists and engineers from ORNL and RAL.”

Monday, May 14, 2012 @ 02:05 PM gHale

A carbon nanotube sponge that can soak up oil in water with incredible efficiency is now in the works.

Carbon nanotubes, which consist of atom-thick sheets of carbon rolled into cylinders, have captured scientific attention because of their high strength, potential high conductivity and light weight. However, producing nanotubes in bulk for specialized applications has often been limited by difficulties in controlling the growth process and in dispersing and sorting the produced nanotubes. Not anymore.

That is because Bobby Sumpter at the Department of Energy’s (DOE’s) Oak Ridge National Laboratory (ORNL) was part of a multi-institutional research team that set out to grow large clumps of nanotubes by selectively substituting boron atoms into the otherwise pure carbon lattice.

Sumpter and Vincent Meunier, now of Rensselaer Polytechnic Institute (RPI), conducted simulations on supercomputers, including Jaguar at ORNL’s Leadership Computing Facility, to understand how the addition of boron would affect the carbon nanotube structure.

“Any time you put a different atom inside the hexagonal carbon lattice, which is a chicken wire-like network, you disrupt that network because those atoms don’t necessarily want to be part of the chicken wire structure,” Sumpter said. “Boron has a different number of valence electrons, which results in curvature changes that trigger a different type of growth.”

Simulations and lab experiments showed the addition of boron atoms encouraged the formation of “elbow” junctions that help the nanotubes grow into a 3-D network.

“Instead of a forest of straight tubes, you create an interconnected, woven sponge-like material,” Sumpter said. “Because it is interconnected, it becomes three-dimensionally strong, instead of only one-dimensionally strong along the tube axis.”

Further experiments showed the team’s material, which is visible to the human eye, is extremely efficient at absorbing oil in contaminated seawater because it attracts oil and repels water.

“It loves carbon because it is primarily carbon,” Sumpter said. “Depending on the density of oil to water content and the density of the sponge network, it will absorb up to 100 times its weight in oil.”

The material’s mechanical flexibility, magnetic properties, and strength lend it additional appeal as a potential technology to aid in oil spill cleanup, Sumpter said.

“You can reuse the material over and over again because it’s so robust,” he said. “Burning it does not substantially decrease its ability to absorb oil, and squeezing it like a sponge doesn’t damage it either.”

The material’s magnetic properties, a result of the team’s use of an iron catalyst during the nanotube growth process, means a magnet can easily control it or remove it in an oil cleanup scenario. This ability is an improvement over existing substances used in oil removal, which usually stay behind after cleanup and can degrade the environment.

The experimental team submitted a patent application on the technology through Rice University. A research paper entitled “Covalently bonded three-dimensional carbon nanotube solids via boron induced nanojunctions” is also available.

The research team included researchers from ORNL; Rice University; Universidade de Vigo, Spain; RPI; University of Illinois at Urbana-Champaign; Instituto de Microelectronica de Madrid, Spain; Air Force Office of Scientific Research Laboratory; Arizona State University; Universite Catholique de Louvain, Belgium; The Pennsylvania State University; and Shinshu University, Japan.

Wednesday, April 11, 2012 @ 12:04 PM gHale

By Nicholas Sheble
Social media like Facebook and LinkedIn may be a boon to marketing and human resource departments, but they raise major security fears in production departments and other strategic areas of companies.

The personal data and peer networking that have become important sales tools and product referral vehicles are also weapons in the hands of hackers seeking entrée to computer systems and databases, where the miscreants prospect for valuable assets.

Hackers use information they glean to learn details about the lives of employees of targeted companies so they can trick the victims into opening a malicious application on their work computers.

These ploys – social engineering techniques – exploit vulnerabilities in human nature and make the targeting more effective.

Francis deSouza, group president of enterprise at security company Symantec Corp., told The Wall Street Journal he saw one attack where a hacker learned that a systems administrator had five children. The hacker constructed an email with a malicious file attachment that appeared to come from the company’s human-resources department and contained information about a new benefit program for families with four or more kids.

Attackers often garner clues from social-networking sites like LinkedIn and Facebook where the criminal can identify an employee and his or her department within an organization, deSouza said.

Further, the criminal can troll sites like Facebook to learn the names of the employee’s friends and that person’s interests. The hacker can even visit Twitter to get a sense of how a person writes, how he or she constructs sentences.

Once the hacker identifies the employee and learns more about him or her, the attack is on. The hacker will send the victim an email that appears to be from a friend or colleague. The email will include an apparently legitimate attachment that actually contains code that will allow the intruder access to the target’s computer. The code is sophisticated and of such quality that antivirus software won’t detect it. Then, it’s off to the races.

In 2007, the Oak Ridge National Laboratory reported someone successfully targeted that facility using emails socially engineered to appear as though they were legitimate official communications. The escapade compromised computers and a database containing information about visitors to the facility. The hackers had the capability to steal data from that database.

In 2009, coordinated covert and targeted cyber attacks took place against global oil and petrochemical companies, according to McAfee Foundation Professional Services and McAfee Labs. These attacks, dubbed Night Dragon, used socially engineered emails along with Microsoft Windows operating system vulnerabilities to gain access to computers. Using the access obtained, the hackers stole information on operational oil-and-gas-field production systems and financial documents relating to field exploration and contract bidding.

In 2011, RSA told its customers it had suffered attack via socially engineered emails containing malicious attachments that exploited a zero-day Adobe Flash vulnerability. Hackers successfully gained access to the network and exfiltrated information including that related to RSA’s SecurID two-factor authentication products. Subsequently, the stolen information helped in the targeting of defense contractors.

All this tells us humans remain the weakest link in the security chain. Given the success of social engineering via email in hacking systems, security is moving away from perimeter defense and protecting the infrastructure, toward securing the valued information, the valued asset itself.
Nicholas Sheble (nsheble@isssource.com) is an engineering writer and technical editor in Raleigh, NC.

Thursday, March 22, 2012 @ 03:03 PM gHale

Computer systems of the agency in charge of America’s nuclear weapons stockpile are “under constant attack” and face millions of hacking attempts daily, said officials at the National Nuclear Security Administration (NNSA).

The agency faces cyber attacks from a “full spectrum” of hackers, said Thomas D’Agostino, head of the agency.

“They’re from other countries’ [governments], but we also get fairly sophisticated non-state actors as well,” he said. “The [nuclear] labs are under constant attack, the Department of Energy is under constant attack.”

A spokesman for the agency said the Nuclear Security Enterprise experiences up to 10 million “security significant cyber security events” each day.

“Of the security significant events, less than one hundredth of a percent can be categorized as successful attacks against the Nuclear Security Enterprise computing infrastructure,” the spokesman said, which puts the maximum number at 1,000 daily.
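The arithmetic behind that ceiling is straightforward:

```python
# "Less than one hundredth of a percent" of up to 10 million daily
# events gives the 1,000-per-day maximum quoted above.
events_per_day = 10_000_000
max_success_rate = 0.01 / 100           # one hundredth of a percent
print(int(events_per_day * max_success_rate))   # -> 1000
```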

The agency wants to beef up its cyber security budget from $126 million in 2012 to $155 million in 2013 and has developed an “incident response center” responsible for identifying and mitigating cyber security attacks.

In April of last year, the Department of Energy’s Oak Ridge National Laboratory was successfully hacked and several megabytes of data stolen, D’Agostino said. Internet access for lab workers ended up disconnected following the breach.

Adam Segal, a cyber security expert with the Council on Foreign Relations, said it’s likely that a majority of those 10 million daily attacks are automated bots “constantly scanning the Internet looking for vulnerabilities.”

“The numbers are kind of inflated on that front,” Segal said, adding that it’s extremely unlikely that hackers would be able to remotely launch a nuclear warhead, because those systems are “airgapped,” or disconnected from standard Internet systems. But the Stuxnet computer worm, discovered in 2010, spread widely to computers in Iran, Indonesia and India, and disrupted supposedly secure uranium enrichment systems in Iran.

The NNSA said they are not aware of any viruses or malware that could remotely launch a nuclear warhead, but the “Stuxnet worm is a very real example of how sophisticated malware can cause physical damage to industrial systems.”

Segal said Stuxnet was a lesson — no matter how secure a computer system is, it can always suffer a breach.

“Stuxnet showed that airgapping is not a perfect defense,” Segal said. “Even in secure systems, people stick in their thumb drives, they go back and forth between computers. They can find vulnerabilities that way. If people put enough attention to it, they can possibly be penetrated.”

D’Agostino said with the agency facing so many hacking attempts, its employees have to remain vigilant.

“All it takes is one person to let their guard down,” he said. “This is going to be, in my view, an ever-growing area of concern.”

Segal said any successful hackers would likely have to have an intimate knowledge of the programming languages used by the Department of Energy.

“There’d probably have to be a state-based actor behind it. You have to understand a lot about the systems,” he said. “Hacking into the Department of Energy and looking for nuclear secrets — how to build a bomb — is probably much easier than trying to take over a bomb or a launch code, and probably of more interest to the Russians or the Chinese or the Iranians.”

Thursday, October 27, 2011 @ 06:10 PM gHale

Old radioactive facilities at Oak Ridge National Laboratory (ORNL) known to be pretty hot have proved hotter than expected, and that has caused delays in the cleanup, which means higher costs as well.

The contract workers encountered higher levels of radiological contamination than originally expected in project planning, said Dept. of Energy spokesman Mike Koentop. Safety and Ecology Corp. (SEC) is doing the work under a prime contract with the U.S. Department of Energy.

“This will almost certainly adversely impact the project’s cost and schedule,” Koentop said.

“We are currently in the process of modifying the contract to allow the contractor to do additional engineering and planning to determine the extent of the impacts on project’s cost and schedule,” he said. “At this time, it is premature to speculate what the ultimate impact will be to the cost and schedule for the project.”

The work is “moving along” despite the disruptions, said Dirk Van Hoesen, environmental manager at ORNL.

Asked about the peak dose rates discovered in the old hot cells, he said, “Oh, I don’t know, something in the 5r to 10r per hour range.”

Regarding the evolving situation, he said, “It’s like anything. Until you get in and look around in great detail, it’s hard to know what all the specific conditions are.”

He said there probably hadn’t been any rad readings in the buildings since around 2005-2006 to indicate what the situation was like in there.

DOE’s Koentop said the C/D hot cell area was initially classified as a “radiological facility,” but the latest data showed the internal dose levels in one cell are closer to those of a Category 3 nuclear facility.

“This anticipated change in facility conditions has caused the contractor to re-evaluate the technical approach and the method by which they will address the cleanup of the facility,” the DOE spokesman said. “Only limited work can continue on the D-side until the technical approach is approved and that is impacting the schedule.”

Koentop said much of the prep work had already wrapped up for some of the hot cells work on the “C” area.

The DOE spokesman said the federal agency expected to have “an approved technical approach, updated schedule and cost estimates from the contractor, and an independent government estimate by the end of the calendar year.”

Van Hoesen said if the characterization results prove that it’s a Cat-3 nuclear facility, that will definitely “increase the effort required from a safety document perspective and that’ll certainly have some influence on the cost.”

Christopher Leichtweis, SEC chief executive, whose company has $80 million in Recovery Act cleanup projects in Oak Ridge, said he thought the hot cells project at ORNL was “going well, not great.”

“(Building) 3026 has some cost growth due to inventory . . . that none of us really knew was there until recently. So, we’re characterizing it to further understand the conditions,” he said.

The radiation levels are “about a magnitude higher” than previously thought, he said, and the source could be fragments of nuclear fuel or irradiated metal.

As for the source, Leichtweis said measurements indicate the rad is coming from 83 percent strontium, 13 percent cesium and then “additional miscellaneous fission product material.”

Work has now slowed. “Well, when you get a change of condition, the safety basis then starts to slow down,” he said. “Because it requires more paper, more readiness, to get the project back on track. So you have to kind of go into a holding pattern, rewrite your documents, make them a little more robust, do more training and then you execute.”

 
 