Posts Tagged ‘Oak Ridge National Laboratory’

Monday, July 13, 2015 @ 07:07 PM gHale

Turning trees, grass, and other biomass into fuel for automobiles and airplanes is a costly and complex process, but it is not impossible.

The goal is to have cellulosic ethanol, an alcohol derived from plant sugars, as common and affordable at the gas station as gasoline.

That goal will be within reach once researchers unravel the tightly wound network of molecules — cellulose, hemicellulose, and lignin — that makes up the cell wall of plants, enabling easier biofuel processing.

RELATED STORIES
Modified Switchgrass Means Better Biofuel
Converting Algal Oil to Fuels
Power from Evaporating Water
Fuel Cell Boost with Nano-Raspberries

Using high-performance computing, a group of researchers at Oak Ridge National Laboratory (ORNL) provided insight into how this might occur by simulating a well-established genetic modification to the lignin of an aspen tree in atomic-level detail.

The team’s conclusion: Hydrophobic, or water repelling, lignin binds less with hydrophilic, or water attracting, hemicelluloses. That knowledge points researchers toward a promising way to engineer better plants for biofuel.

The study is important because lignin, which is critical to the survival of plants in the wild, poses a problem for ethanol production, preventing enzymes from breaking down cellulose into simple sugars for fermentation.

Jeremy Smith, the director of ORNL’s Center for Molecular Biophysics and a Governor’s Chair at the University of Tennessee, led the project. His team’s simulation of a genetically modified lignin molecule linked to a hemicellulose molecule adds context to work conducted by researchers at the Department of Energy’s (DoE) BioEnergy Science Center (BESC), who demonstrated that genetic modification of lignin can boost the amount of biofuel derived from plant material without compromising the structural integrity of the plant.

“BESC scientists created lots of different lignins randomly through genetic modification,” Smith said. “They found one that worked for them, but they wanted to know why it worked.”

To find the answer, Smith’s team turned to Titan, a 27-petaflop supercomputer at the Oak Ridge Leadership Computing Facility (OLCF).

Altering a Tree
Aspens are among the most widespread trees in North America, with a habitable zone that extends across the northern United States and Canada. As part of the genus Populus, which includes poplars and cottonwoods, they are fast growing and have the ability to adapt to diverse environments. Those are two qualities that make them prime candidates for cellulosic ethanol. Compared to traditional biofuel crops like corn and sugarcane, aspens require minimal care; they also can grow in areas where food crops cannot grow.

But the hardiness that allows aspens to thrive in nature makes them resistant to enzymatic breakdown during fermentation, an important step for converting biomass into ethanol. This problem traces back to the molecular makeup of the plant cell wall, where lignin and hemicellulose bond to form a tangled mesh around cellulose.

Cellulose, a complex carbohydrate made up of glucose strands, comprises nearly half of all plant matter. It gives plants their structure, and it’s the critical substance needed to make cellulosic ethanol. To break down cellulose, one must get past lignin, a waste product of biofuel production that requires expensive treatments to isolate and remove. By throwing a wrench in the plant cell’s lignin assembly line, BESC scientists found they could boost biofuel production by 38 percent.

In nature, lignin adds strength to cellulosic fibers and protects the plant from predators and disease. Lignin molecules consist of multiple chemical groups made up of carbon, oxygen, and hydrogen assembled within the cell during a process called biosynthesis. During assembly, enzymes catalyze molecules into more complex units. By suppressing a key enzyme, cinnamyl alcohol dehydrogenase, BESC scientists created an “incomplete” lignin molecule. Instead of a hydrophilic alcohol group (a hydroxyl, or oxygen–hydrogen, group bound to a hydrogen-saturated carbon atom), the final lignin polymer contained a hydrophobic aldehyde group (a carbon atom double-bonded to an oxygen atom).

“We wanted to see if there was a difference in the lignin–hemicellulose network if you substituted water-resisting aldehydes in the lignin for water-attracting alcohols,” said Loukas Petridis, an ORNL staff scientist. “Geneticists knew the modified plant could be more easily broken down, but they didn’t have an atomic-level explanation that a supercomputer like Titan can provide.”

Finding a Shortcut
Using a molecular dynamics code called NAMD, the team ran simulations of the wild lignin and the genetically modified lignin in a water cube, modeling the presence of the aldehydes by altering the partial charges of the oxygen and hydrogen atoms on the modified lignin’s allylic site.

The team simulated multiple runs of each 100,000-atom system for a few hundred nanoseconds, tracking the position of the atoms in time increments of a femtosecond, or one quadrillionth (10^-15) of a second. A comparison of the simulations showed weaker interaction between hemicellulose and the modified lignin than with the wild lignin, suggesting that hydrophobic lignin interacts less with hydrophilic hemicellulose.
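
To appreciate the computational scale, here is a back-of-the-envelope calculation (our own arithmetic, assuming a 300-nanosecond run; the article says only “a few hundred nanoseconds”):

```python
# Rough scale of one ORNL lignin simulation -- illustrative arithmetic only.
timestep_s = 1e-15        # positions tracked every femtosecond
sim_length_s = 300e-9     # assume 300 ns for "a few hundred nanoseconds"
atoms = 100_000           # size of each simulated system

steps = sim_length_s / timestep_s
print(f"{steps:.1e} timesteps")              # 3.0e+08 steps per run
print(f"{steps * atoms:.1e} atom-updates")   # 3.0e+13 position updates
```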

“From this you could make the testable assumption that making lignin more hydrophobic may lead to plants that are easier to deconstruct for biofuel,” Petridis said. “That’s the kind of rational insight we can provide using computer simulation.

“It took a decade of work to determine all the steps of lignin biosynthesis and find ways to manipulate genes. In the future, we hope to circumvent some of the work by continuing to test our models against experiment and making good suggestions about genes using supercomputers. That’s where the predictive power of molecular dynamic codes like NAMD comes in.”

“This modification is a bit more subtle and more complex to simulate,” Petridis said. “Finding out how good a predictive tool NAMD can be is the next step.”

Tuesday, January 20, 2015 @ 12:01 PM gHale

One of the challenges in detecting malware is discovering the bad software when no one yet knows it is bad.

That issue may soon go away: A cyber security technology created at the Department of Energy’s (DoE) Oak Ridge National Laboratory (ORNL) can recognize malicious software even if the specific program has not been identified as a threat.

RELATED STORIES
Breach: When Minutes Count
Data Breach Awareness on Rise
Malware Creation Skyrockets in Q3
ICS Targeted in Malware Campaign

By computing and analyzing program behaviors associated with harmful intent, ORNL’s Hyperion technology can look inside an executable program to determine the software’s behavior without using its source code or running the program, said one of its inventors, Stacy Prowell of ORNL’s Cyber Warfare Research team.

“These behaviors can be automatically checked for known malicious operations as well as domain-specific problems,” Prowell said. “This technology helps detect vulnerabilities and can uncover malicious content before it has a chance to execute.”

Hyperion, which has been under development for a decade, offers more comprehensive scanning capabilities than existing cyber security methods.

“This approach is better than signature detection, which only searches for patterns of bytes,” Prowell said. “It’s easy for somebody to hide that — they can break it up and scatter it about the program so it won’t match any signature.”
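
A toy illustration of the weakness Prowell describes (our own sketch, not Hyperion’s analysis): a naive byte-signature scanner misses a payload once its bytes are split and scattered, even though the program’s behavior is unchanged.

```python
# Toy byte-signature scanner -- shows why scattering bytes defeats
# pattern matching. Not how Hyperion works.
SIGNATURE = b"\xde\xad\xbe\xef"   # made-up "known malicious" byte pattern

def signature_scan(binary: bytes) -> bool:
    """Flag the file only if the contiguous signature appears."""
    return SIGNATURE in binary

intact    = b"\x00" * 16 + SIGNATURE + b"\x00" * 16
scattered = b"\x00" * 16 + SIGNATURE[:2] + b"\x90\x90" + SIGNATURE[2:]

print(signature_scan(intact))     # True  -- caught
print(signature_scan(scattered))  # False -- same payload, split up, missed
```

A behavior-based analyzer such as Hyperion sidesteps this by computing what the program does rather than matching what its bytes look like.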

Washington, D.C.-based R&K Cyber Solutions LLC (R&K) licensed Hyperion and is looking to go to market with the program this month.

“Software behavior computation is an emerging science and technology that will have a profound effect on malware analysis and software assurance,” said R&K Cyber Solutions Chief Executive Joseph Carter. “Computed behavior based on deep functional semantics is a much-needed cyber security approach that has not been previously available. Unlike current methods, behavior computation does not look at surface structure. Rather, it looks at deeper behavioral patterns.”

Carter said the technology’s malware analysis capabilities can apply to multiple related cyber security problems, including software assurance in the absence of source code, hardware and software data exploitation and forensics, supply chain security analysis, anti-tamper analysis and potential intrusion detection systems based on behavior semantics.

The licensed intellectual property includes two patent-pending technologies invented by Kirk Sayre of the Computational Sciences and Engineering Division and Richard Willems and former ORNL employee Stephen Lindberg of the Electrical and Electronics Systems Research Division. Others contributing to the technology were David Heise, Kelly Huffer, Logan Lamb, Mark Pleszkoch and Joel Reed of the Computational Sciences and Engineering Division.

Hyperion strengthens the cyber security of critical energy infrastructure by providing evidence of the secure functioning of energy delivery control system devices without requiring disclosure of the source code. This can advance resilient energy delivery systems designed, installed, operated and maintained to survive a cyber incident while sustaining critical functions.

Wednesday, November 12, 2014 @ 04:11 PM gHale

There are benefits for microgrids, small systems powered by renewables and energy storage devices, in breaking away from the main grid and operating as their own islands.

The benefit is that microgrids can disconnect from larger utility grids and continue to provide power locally.

RELATED STORIES
Fuel Cell Works at Room Temperature
Biodiesel Fuels Cross Country Trip
Enhancing Rechargeable Batteries
Converting Crop Waste to Chemicals

“If the microgrid is always connected to the main grid, what’s the point?” Department of Energy and Oak Ridge National Laboratory researcher Yan Xu asked. “If something goes wrong with the main grid, like a dramatic drop in voltage, for example, you may want to disconnect.”

The idea behind microgrids is not only to keep power flowing to local units such as neighborhoods, hospitals or industrial parks, but also to improve energy efficiency and reduce cost when connected to the main grid.

Researchers predict an energy future more like a marketplace in which utility customers with access to solar panels, battery packs, plug-in vehicles and other sources of distributed energy can compare energy prices, switch on the best deals and even sell back unused power to utility companies.

However, before interested consumers can plug into their own energy islands, researchers at facilities such as ORNL’s Distributed Energy Control and Communication (DECC) lab need to develop tools for controlling a reliable, safe and efficient microgrid.

To simulate real scenarios of how energy would be used on a microgrid, DECC houses a functional microgrid with a generation capacity of 250 kilowatts (kW) that seamlessly switches on and off the main grid.

This grid includes an energy storage system, built from second-use electric vehicle batteries, that delivers 25 kW of power and stores 50 kilowatt-hours of energy; a 50-kW and a 13.5-kW solar array; and two smart inverters that serve as the grid interfaces for the distributed energy emulators. Programmable load banks that mimic equipment consuming energy on the grid can provide sudden large load changes and second-by-second energy profiles.

“A microgrid should run an automated optimization frequently, about every five to 10 minutes,” Xu said.
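
A minimal sketch of such a periodic optimization (the storage ratings match the DECC test bed, but the greedy dispatch rule is our assumption for illustration, not the CSEISMIC algorithm):

```python
# Hypothetical 5-minute dispatch rule for a small microgrid.
# Ratings match the DECC test bed; the logic is illustrative only.
BATTERY_KW = 25.0     # storage power rating
INTERVAL_H = 5 / 60   # 5-minute optimization interval, in hours

def dispatch(load_kw: float, solar_kw: float, soc_kwh: float):
    """Greedy rule: serve load with solar first, then battery, then grid."""
    net = max(load_kw - solar_kw, 0.0)                   # demand left after solar
    batt_limit = min(BATTERY_KW, soc_kwh / INTERVAL_H)   # power and energy limits
    from_batt = min(net, batt_limit)
    from_grid = net - from_batt
    return from_batt, from_grid

# One interval: 60 kW of load, 40 kW of solar, half-charged 50-kWh pack.
print(dispatch(load_kw=60.0, solar_kw=40.0, soc_kwh=25.0))  # (20.0, 0.0)
```

A real energy management system would replace the greedy rule with an optimization over price, forecasted demand and battery wear, rerun every few minutes as Xu describes.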

To optimize grid operations, microgrid generators, power flow controllers, switches and loads must be outfitted with sensors and communication links that can provide real-time information to a central controller.

“Microgrids are not widely deployed yet. Today, functional microgrids are in the R&D phase, and their communications are not standardized,” Xu said. “We want to standardize microgrid communications and systems so they are compatible with the main grid and each other.”

Now two years into ORNL’s microgrid project — “Complete System-Level Efficient and Interoperable Solution for Microgrid Integrated Controls,” or CSEISMIC — the microgrid test bed at DECC is functional and employs an algorithm developed at ORNL that directs automatic transitions on and off ORNL’s main grid.

Xu said the next year will focus on getting the energy management system (EMS) running. The EMS will drive optimization by allowing microgrid components to adjust their operation based on parameters such as demand and cost.

“The EMS may, for instance, tell the PVs [solar cells] how much power to generate for the next five to 10 minutes based on the time of day and energy demand,” Xu said.

The CSEISMIC team has long-term goals of partnering with industries to conduct field demonstrations of standardized grid prototypes.

“As soon as microgrids are standardized and easy to integrate into the main grid,” Xu said, “we’ll start seeing them in areas with a high penetration of renewables and high energy prices.”

Thursday, August 28, 2014 @ 06:08 PM gHale

By modifying the microstructural characteristics of carbon black recovered from recycled tires, researchers may give those tires new life in lithium-ion batteries that power plug-in electric vehicles and store energy produced by wind and solar.

Carbon black, a substance recoverable from discarded tires, may yield a better anode for lithium-ion batteries, said a Department of Energy Oak Ridge National Laboratory (ORNL) research team led by Parans Paranthaman and Amit Naskar. An anode is a negatively charged electrode used as a host for storing lithium during charging.

RELATED STORIES
Recycling Car Batteries into Solar Cells
Chip Thinks Like a Brain
Bionic Liquids Close Biofuel Refinery Loop
Cigarette Filters Go Green

The method has advantages over conventional approaches to making anodes for lithium-ion batteries.

“Using waste tires for products such as energy storage is very attractive not only from the carbon materials recovery perspective but also for controlling environmental hazards caused by waste tire stockpiles,” Paranthaman said.

The ORNL technique uses a proprietary pretreatment to recover pyrolytic carbon black, a man-made material similar to graphite. Using it in anodes, the researchers produced a small, laboratory-scale lithium-ion battery with a reversible capacity higher than that of commercial graphite materials.

After 100 cycles, the capacity measures nearly 390 milliamp hours per gram of carbon anode, which exceeds the best properties of commercial graphite. Researchers attribute this to the unique microstructure of the tire-derived carbon.
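
For context, graphite’s theoretical anode capacity (fully lithiated LiC6) is about 372 milliamp hours per gram, a textbook figure not quoted in the article; the comparison is then simple arithmetic:

```python
# Tire-derived carbon capacity (from the article) vs. graphite's
# theoretical maximum (~372 mAh/g for LiC6, a textbook value).
tire_carbon = 390.0   # mAh per gram after 100 cycles
graphite_max = 372.0  # mAh per gram, theoretical

gain = (tire_carbon / graphite_max - 1) * 100
print(f"{gain:.1f}% above graphite's theoretical capacity")  # 4.8%
```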

“This kind of performance is highly encouraging, especially in light of the fact that the global battery market for vehicles and military applications is approaching $78 billion and the materials market is expected to hit $11 billion in 2018,” Paranthaman said.

Anodes are one of the leading battery components, with 11 to 15 percent of the materials market share, said Naskar, who noted the new method could eliminate a number of hurdles.

“This technology addresses the need to develop an inexpensive, environmentally benign carbon composite anode material with high-surface area, higher-rate capability and long-term stability,” Naskar said.

ORNL plans to work with U.S. industry to license this technology and produce lithium-ion cells for automobile, stationary storage, medical and military applications.

Paranthaman and Naskar wrote a paper entitled, “Tailored Recovery of Carbons from Waste Tires for Enhanced Performance as Anodes in Lithium-Ion Batteries.”


Wednesday, May 7, 2014 @ 12:05 PM gHale

There is a new way to build nanowires just three atoms wide that should help scientists eventually create paper-thin, flexible tablets and smartphones.

To create these new nanowires, you have to use a finely focused beam of electrons to build some of the smallest wires ever made, said Junhao Lin, a Vanderbilt University doctoral student and visiting scientist at Oak Ridge National Laboratory in Tennessee who made the discovery. The tiny metallic wires are one-thousandth the width of the microscopic wires used today to connect the transistors in integrated computer circuits.

RELATED STORIES
Solving Thin Film Solar Cell Mystery
From Copper to Solar Cells
Fuel Cell View: Splitting Hydrogen
Hybrid Fuel Cell Energy from Biomass

“It’s at the cutting edge of everything,” said Sokrates Pantelides, Lin’s adviser and the university’s Distinguished Professor of Physics and Engineering. “People have obviously made nanowires, but they often might be 50 or 100 nanometers across. We have nanotubes one nanometer across. These are 0.4 nanometers. I would expect them to be fragile but they’re not at all. They are extremely robust.”

Lin made the nanowires using semiconducting materials that naturally form monolayers, which are layers one molecule thick, Pantelides said.

The materials, called transition-metal dichalcogenides, are the combination of the metals molybdenum or tungsten with either sulfur or selenium, university researchers said. The best-known member of the family is molybdenum disulfide, a common mineral used as a solid lubricant.

Scientists have used transition-metal dichalcogenides to build atomic-scale honeycomb lattices that exhibit important properties such as electrical conductivity, strength and heat conduction, Pantelides said.

Researchers have already created functioning transistors and flash memory gates out of this material. By creating tiny nanowires from the same material, the transistors and flash memory gates can be wired together.

The new nanowires are not stand-alone wires. They are built into the honeycomb lattice alongside the transistors and gates. It’s all one thin, flexible material, which could be used to build thin electronics such as smartphones and tablets.

“Looking to the future, we can create a flexible two-dimensional material,” said Pantelides. “You could potentially have screens or pages that are flexible like a sheet of paper. You might be able to fold them and then open them up to see the screen. The material is flexible already because it’s just one layer of atoms.”

Scientists around the world are working on the thin, flexible material, Pantelides said. The tiny nanowires are a key piece that has been missing in this scenario.

“This will likely stimulate a huge research interest in monolayer circuit design,” Lin said.

Friday, April 25, 2014 @ 11:04 AM gHale

Treating cadmium-telluride (CdTe) solar cell materials with cadmium-chloride improves their efficiency, but how that happens remained a mystery – until now.

After an atomic-scale examination of the thin-film solar cells, the decades-long debate about why the materials’ photovoltaic efficiency increases after treatment appears solved.

RELATED STORIES
From Copper to Solar Cells
Fuel Cell View: Splitting Hydrogen
Hybrid Fuel Cell Energy from Biomass
Solar Can Work in Cloudy Cities

A research team led by the Department of Energy’s (DoE) Oak Ridge National Laboratory (ORNL), working with the University of Toledo and DoE’s National Renewable Energy Laboratory, used electron microscopy and computational simulations to explore the physical origins of the unexplained treatment process.

Thin-film CdTe solar cells are a potential rival to silicon-based photovoltaic systems because of their theoretically low cost per power output and ease of fabrication. Their comparatively low historical efficiency in converting sunlight into energy, however, has limited the technology’s widespread use.

Research in the 1980s showed that treating CdTe thin films with cadmium-chloride significantly raises the cell’s efficiency, but scientists have been unable to determine the underlying causes. ORNL’s Chen Li, first author of a study on the subject, said the answer lay in investigating the material at an atomic level.

“We knew that chlorine was responsible for this magical effect, but we needed to find out where it went in the material’s structure,” Li said. “Only by understanding the structure can we understand what’s wrong in this solar cell — why the efficiency is not high enough, and how can we push it further.”

By comparing the solar cells before and after chlorine treatment, the researchers realized atom-scale grain boundaries were implicated in the enhanced performance. Grain boundaries are tiny defects that normally act as roadblocks to efficiency because they inhibit carrier collection, which greatly reduces solar cell power.

Using state-of-the-art electron microscopy techniques to study the thin films’ structure and chemical composition after treatment, the researchers found chlorine atoms replaced tellurium atoms within the grain boundaries. This atomic substitution creates local electric fields at the grain boundaries that boost the material’s photovoltaic performance instead of damaging it.

The research team’s finding, in addition to providing a long-awaited explanation, could help guide engineering of higher-efficiency CdTe solar cells. Controlling the grain boundary structure is a new direction that could help raise the cell efficiencies closer to the theoretical maximum of 32 percent light-to-energy conversion, Li said. Currently, the record CdTe cell efficiency is only 20.4 percent.
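
Putting the quoted figures side by side makes the remaining headroom explicit:

```python
# Record CdTe cell efficiency vs. the theoretical limit, per the article.
record, theoretical = 20.4, 32.0
print(f"{record / theoretical:.0%} of the theoretical maximum")        # 64%
print(f"{theoretical - record:.1f} percentage points still untapped")  # 11.6
```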

“We think that if all the grain boundaries in a thin-film material could be aligned in the same direction, it could improve cell efficiency even further,” Li said.

Friday, April 25, 2014 @ 10:04 AM gHale

A fuel cell catalyst that converts hydrogen into electricity needs to rip open a hydrogen molecule.

Capturing a view of how the catalyst does this gives insight into how to make the catalyst work better for alternative energy uses.

RELATED STORIES
Hybrid Fuel Cell Energy from Biomass
Solar Can Work in Cloudy Cities
Zinc Oxide Brings Solar Cell Efficiency
New Material Boosts Solar Cells

This study is the first time scientists have shown precisely where the hydrogen halves end up in the structure of a molecular catalyst that breaks down hydrogen. The design of this catalyst was inspired by the innards of a natural protein called a hydrogenase enzyme.

“The catalyst shows us what likely happens in the natural hydrogenase system,” said Morris Bullock of the Department of Energy’s (DoE) Pacific Northwest National Laboratory. “The catalyst is where the action is, but the natural enzyme has a huge protein surrounding the catalytic site. It would be hard to see what we have seen with our catalyst because of the complexity of the protein.”

Hydrogen-powered fuel cells offer an alternative to burning fossil fuels, which generates greenhouse gases. Molecular hydrogen — two hydrogen atoms linked by an energy-rich chemical bond — feeds a fuel cell. Generating electricity through chemical reactions, the fuel cell spits out water and power.

If renewable power is used to store energy in molecular hydrogen, these fuel cells can be carbon-neutral. But fuel cells aren’t cheap enough for everyday use.

To make fuel cells less expensive, researchers turned to natural hydrogenase enzymes for inspiration. These enzymes break hydrogen for energy in the same way a fuel cell would. But while conventional fuel cell catalysts require expensive platinum, natural enzymes use cheap iron or nickel at their core.

Researchers have been designing catalysts inspired by hydrogenase cores and testing them. In this work, an important step in breaking a hydrogen molecule, so the bond’s energy can be captured as electricity, is to break the bond unevenly. Instead of producing two equal hydrogen atoms, the catalyst must produce a positively charged proton and a negatively charged hydride.

The physical shape of a catalyst, along with electrochemical information, can reveal how it does that. So far, scientists have found the overall structure of catalysts with cheap metals using X-ray crystallography, but for hydrogen atoms X-rays won’t cut it. Based on chemistry and X-ray methods, researchers have a best guess for the position of hydrogen atoms, but imagination is no substitute for reality.

Bullock, Tianbiao “Leo” Liu and their colleagues at the Center for Molecular Electrocatalysis at PNNL, one of DoE’s Energy Frontier Research Centers, collaborated with scientists at the Spallation Neutron Source at Oak Ridge National Laboratory in Tennessee to find the lurking proton and hydride. Using a beam of neutrons like a flashlight allows researchers to pinpoint the nuclei of the atoms that form the backbone architecture of their iron-based catalyst.

To use their iron-based catalyst in neutron crystallography, the team had to modify it chemically so it would react with the hydrogen molecule in just the right way. Neutron crystallography also requires larger crystals as starting material compared to X-ray crystallography.

“We were designing a molecule that represented an intermediate in the chemical reaction, and it required special experimental techniques,” Liu said. “It took more than six months to find the right conditions to grow large single crystals suitable for neutron diffraction. And another six months to pinpoint the position of the split H2 molecule.”

Crystallizing their catalyst of interest into a nugget almost 40 times the size needed for X-rays, the team succeeded in determining the structure of the iron-based catalyst.

The structure, they found, confirmed theories based on chemical analyses. For example, the barbell-shaped hydrogen molecule snuggles into the catalyst core. Once split, the negatively charged hydride attaches to the iron at the center of the catalyst; meanwhile, the positively charged proton attaches to a nitrogen atom across the catalytic core. The researchers expected this set-up, but no one had accurately characterized it in an actual structure before.

In this form, the hydride and proton form a type of bond uncommonly seen by scientists — a dihydrogen bond. The energy-rich chemical bond between two hydrogen atoms in a molecule is a covalent bond and is very strong. Another bond called a “hydrogen bond” is a weak one formed between a slightly positive hydrogen and another, slightly negative atom.

Hydrogen bonds stabilize the structure of molecules by tacking down chains as they fold over within a molecule or between two independent molecules.

The dihydrogen bond in the structure is much stronger than a single hydrogen bond. Measuring the distance between atoms reveals how tight the bond is. The team found the dihydrogen bond was much shorter than typical hydrogen bonds but longer than typical covalent bonds. In fact, the dihydrogen bond is the shortest of its type so far identified, the researchers report.
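
A sketch of how such a distance comparison works once atomic positions are known (the coordinates below are invented for illustration, and the reference ranges are textbook values, not the study’s measurements):

```python
import math

# Illustrative H...H distance from hypothetical crystallographic coordinates.
hydride = (0.00, 0.00, 0.00)   # H- on the iron center (made-up position, angstroms)
proton  = (0.85, 0.95, 0.70)   # H+ on the nitrogen (made-up position, angstroms)

print(f"H...H distance: {math.dist(hydride, proton):.2f} angstroms")  # ~1.45

# Rough textbook reference ranges, for comparison:
#   H-H covalent bond        ~0.74 angstroms
#   typical dihydrogen bond  ~1.7-2.2 angstroms
#   typical hydrogen bond    ~1.8-2.5 angstroms
```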

This unusually strong dihydrogen bond likely plays into how well the catalyst balances tearing the hydrogen molecule apart and putting it back together. This balance allows the catalyst to work efficiently.

“We’re not too far from acceptable with its efficiency,” said Bullock. “Now we just want to make it a little more efficient and faster.”

Friday, February 21, 2014 @ 09:02 AM gHale

A new suite of computer codes that closely model the behavior of neutrons in a reactor core, called neutronics, could give a more accurate way to analyze nuclear power reactors.

Technical staff at Westinghouse Electric Company, LLC, supported by the research team at the Consortium for Advanced Simulation of Light Water Reactors (CASL), used the Virtual Environment for Reactor Applications core simulator (VERA-CS) to analyze its AP1000 advanced pressurized water reactor (PWR). The testing focused on modeling the startup conditions of the AP1000 plant design.

RELATED STORIES
Duckweed Great Biofuel Potential
Hybrid Fuel Cell Energy from Biomass
Graphene for Wireless Communications
New Wave Organic Solar Cells
More Efficient Solar Cells – In Theory

“In our experience with VERA-CS, we have been impressed by its accuracy in reproducing past reactor startup measurements. These results give us confidence that VERA-CS can be used to anticipate the conditions that will occur during the AP1000 reactor startup operations,” said Bob Oelrich, manager of PWR Core Methods at Westinghouse. “This new modeling capability will allow designers to obtain higher-fidelity power distribution predictions in a reactor core and ultimately further improve reactor performance.”

The AP1000 reactor is an advanced design with enhanced passive safety features that builds on decades of Westinghouse’s experience with PWR design. The first eight units are currently under construction in China and the United States, and the AP1000 is the first Generation III+ reactor to receive Design Certification from the U.S. Nuclear Regulatory Commission.

CASL is a U.S. Department of Energy (DoE) Innovation Hub established at Oak Ridge National Laboratory (ORNL), a part of DoE’s National Laboratory System. The consortium core partners are a strategic alliance of leaders in nuclear science and engineering from government, industry and academia.

“At CASL, we set out to improve reactor performance with predictive, science-based, simulation technology that harnesses world-class computational power,” said CASL Director Doug Kothe. “Our challenge is to advance research that will allow power uprates and increase fuel burn-up for U.S. nuclear plants. In order to do this, CASL is meeting the need for higher-fidelity, integrated tools.”

During the first generation of nuclear energy, performance and safety margins were set at conservative levels as industry and researchers gained experience with the operation and maintenance of what was then a new and complex technology. Over the past 50 years, nuclear scientists and engineers have gained a deeper understanding of reactor processes, further characterizing nuclear reactor fuel and structural materials.

By making use of newly available computing resources, CASL’s research aims for a step change beyond the incremental improvements in reactor operations achieved over the last several decades.

“CASL has been using modern high-performance computing platforms such as ORNL’s Titan, working in concert with the INL Fission computer system, for modeling and simulation at significantly increased levels of detail,” said CASL Chief Computational Scientist John Turner. “However, we also recognized the need to deliver a product that is suitable for industry-sized computing platforms.”

With that understanding, CASL designed the Test Stand project to try out tools such as VERA-CS in industrial applications. CASL partner Westinghouse was selected as the host for the first trial run of the new VERA nuclear reactor core simulator (VERA-CS). Westinghouse chose a real-world application for VERA-CS: the reactor physics analysis of the AP1000 PWR, which features a core design with several advanced features. Using VERA-CS to study the AP1000 provides information to further improve the characterization of advanced cores compared to traditional modeling approaches.

Westinghouse’s test run on VERA-CS focused on modeling one aspect of reactor physics called “neutronics,” which describes the behavior of neutrons in a reactor core. While neutronics is only one of VERA’s capabilities, the results provided by VERA-CS for the AP1000 PWR enhance Westinghouse’s confidence in their startup predictions and expand the validation of VERA by incorporating the latest trends in PWR core design and operational features.
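
To give a flavor of what a neutronics calculation computes at its very simplest, here is a textbook one-group diffusion eigenvalue solve for a 1-D slab, estimating the multiplication factor k-effective by power iteration. This is a classroom toy with made-up constants, nowhere near VERA-CS’s fidelity:

```python
import numpy as np

# Toy one-group, 1-D slab neutron diffusion solve by power iteration.
N, width = 50, 100.0                    # mesh cells, slab width (cm)
dx = width / N
D, sig_a, nu_sig_f = 1.0, 0.07, 0.075   # made-up diffusion/absorption/fission data

# Finite-difference loss operator: -D * d2/dx2 + sig_a, zero flux at the edges.
A = np.zeros((N, N))
for i in range(N):
    A[i, i] = 2 * D / dx**2 + sig_a
    if i > 0:
        A[i, i - 1] = -D / dx**2
    if i < N - 1:
        A[i, i + 1] = -D / dx**2

phi, k = np.ones(N), 1.0
for _ in range(200):                    # power iteration on the fission source
    phi_new = np.linalg.solve(A, nu_sig_f * phi / k)
    k *= phi_new.sum() / phi.sum()      # update the eigenvalue estimate
    phi = phi_new

print(f"k-effective ~ {k:.4f}")         # ~1.06 for these made-up constants
```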

The CASL team now is working on extending the suite of simulation capabilities to the entire range of operating conditions for commercial reactors, including full-power operation with fuel depletion and fuel cycle reload.

Wednesday, November 20, 2013 @ 04:11 PM gHale

Identifying the fundamental forces that change plant structures during the pretreatment processes used in bioenergy production could lead to more effective ways to convert woody plant matter into biofuels.

Pretreatment subjects plant material to extremely high temperature and pressure to break apart the protective gel of lignin and hemicellulose that surrounds sugary cellulose fibers.

RELATED STORIES
Converting Natural Gas to Biofuel
Test to Turn Plants to Fuel Faster
Aluminum Could Boost Fuel Cell Storage
Dolphin Radar Detects Explosives

“While pretreatments are used to make biomass more convertible, no pretreatment is perfect or complete,” said Brian Davison of the Department of Energy’s Oak Ridge National Laboratory (ORNL), a coauthor of the research study and paper on the subject.

“Whereas the pretreatment can improve biomass digestion, it can also make a portion of the biomass more difficult to convert,” he said. “Our research provides insight into the mechanisms behind this ‘two steps forward, one step back’ process.” Also, pretreatment is the most expensive stage of biofuel production.

The team’s integration of experimental techniques including neutron scattering and X-ray analysis with supercomputer simulations revealed unexpected findings about what happens to water molecules trapped between cellulose fibers.

“As the biomass heats up, the bundle of fibers actually dehydrates — the water that’s in between the fibers gets pushed out,” said ORNL’s Paul Langan. “This is very counterintuitive because you are boiling something in water but simultaneously dehydrating it. It’s a really simple result, but it’s something no one expected.”

This process of dehydration causes the cellulose fibers to move closer together and become more crystalline, which makes them harder to break down.

In a second part of the study, the researchers analyzed the two polymers called lignin and hemicellulose that bond to form a tangled mesh around the cellulose bundles. According to the team’s experimental observations and simulations, the two polymers separate into different phases when heated during pretreatment.

“Lignin is hydrophobic so it repels water, and hemicellulose is hydrophilic, meaning it likes water,” Langan said. “Whenever you have a mixture of two polymers in water, one of which is hydrophilic and one hydrophobic, and you heat it up, they separate out into different phases.”

Understanding the role of these underlying physical factors — dehydration and phase separation — could enable scientists to engineer improved plants and pretreatment processes and ultimately bring down the costs of biofuel production.

“Our insight is that we have to find a balance which avoids cellulose dehydration but allows phase separation,” Langan said. “We know now what we have to achieve — we don’t yet know how that could be done, but we’ve provided clear and specific information to help us get there.”

Wednesday, October 30, 2013 @ 07:10 PM gHale

There could soon be new types of nuclear fuel pellets that would be safer in the event of a nuclear disaster.

New materials could encase uranium-bearing fuel as an alternative to zirconium alloys, which have served as the outer layer of nuclear fuel pellets for the last 50 years, said a team of scientists from the University of Tennessee (UT) and Oak Ridge National Laboratory (ORNL), who will present their work at the AVS 60th International Symposium and Exhibition in Long Beach, CA.

RELATED STORIES
Hiking Oil Content in Plant Leaves
Engineering Yeast for Biofuels
Sun, Sewage Make Hydrogen Fuel
Bacteria Unite to Produce Electricity

Using sophisticated computer analyses, the UT and ORNL team identified several candidate materials that resist high-temperature oxidation and failure and assessed their positive impact on reactor core evolution, which would buy more time to cope in the event of a nuclear accident.

“At this stage there are several very intriguing options that are being explored,” said Steven J. Zinkle, the Governor’s Chair in the Department of Nuclear Engineering at the University of Tennessee and Oak Ridge National Laboratory. There is evidence that some of the new materials would reduce the oxidation by at least two orders of magnitude.

“That would be a game-changer,” he said. The materials examined include advanced steels, coated molybdenum and nuclear-grade silicon carbide composites (SiC fibers embedded in a SiC matrix).

The next step, he added, involves building actual fuel pins from these laboratory-tested materials and exposing them to irradiation inside a fission reactor. If they perform as desired, the new fuel concepts would likely be tested in a limited capacity in commercial reactors, paving the way for larger deployment.

Though it would take years before any new fuel concepts are used commercially, given the rigorous and conservative qualification steps required, Zinkle said, these new materials may eventually replace the existing zirconium alloy cladding if they prove to be safer.

The typical core of a nuclear power plant uses the heat generated by fission of uranium and plutonium in fuel rods to heat and pressurize water. The resulting steam drives turbines for electricity production. Water circulates continuously as a coolant to harness the thermal energy from the fuel and to keep the core from overheating.

The cooling pumps are a critical part of the reactor design because even after a nuclear reactor shuts down, the power generated by radioactive decay of fission products remains at 1 percent of its peak for hours. Given that nuclear power plants generate a staggering amount of energy under nominal operating conditions (~4 GW of thermal energy), even 1 percent power levels after shutdown are substantial. That’s why it’s essential to keep cool water circulating continuously even after shutdown. Otherwise you risk overheating and ultimately melting the core, like a pot left to boil dry on the burner.
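
A quick worked example of why that 1 percent matters. The 4-GW figure is the article’s; the decay curve is the classic Way-Wigner approximation, which we assume here for illustration:

```python
# Decay heat after shutdown. 4 GW thermal is the article's figure; the
# Way-Wigner correlation is a textbook approximation assumed for this sketch.
P0 = 4e9  # nominal thermal power, watts

print(f"1% of peak: {0.01 * P0 / 1e6:.0f} MW")  # 40 MW of heat to remove

def decay_heat_fraction(t_s: float, T_s: float = 3.15e7) -> float:
    """Way-Wigner: fraction of full power at t seconds after shutdown,
    following T seconds of operation (default ~1 year)."""
    return 0.066 * (t_s ** -0.2 - (t_s + T_s) ** -0.2)

for hours in (1, 10, 100):
    t = hours * 3600
    print(f"{hours:3d} h after shutdown: {decay_heat_fraction(t) * P0 / 1e6:.0f} MW")
```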

That’s basically what happened at Fukushima. On March 11, 2011, engineers at the plant initially managed to shut down the plant safely following a massive earthquake, but an hour later a large tsunami knocked out the backup generators running the water pumps. What followed were explosions of hydrogen generated from the reduction of steam during high-temperature oxidation of core materials, and releases of radioactive fission products. The accident displaced the local population and will take years and significant cost to clean up.

Fukushima has had a profound impact on the safety culture of the industry, Zinkle said. Although not a single U.S. nuclear power plant was deemed unsafe or shut down in the wake of the accident, the Nuclear Regulatory Commission has issued new requirements to enhance plant safety, including increased requirements for backup power generation on site.

 
 