Sunday, August 28, 2011

Researchers Create Functioning Synapse Using Carbon Nanotubes

Engineering researchers at the University of Southern California have made a significant breakthrough in the use of nanotechnologies for the construction of a synthetic brain. They have built a carbon nanotube synapse circuit whose behavior in tests reproduces the function of a neuron, the building block of the brain. The team, led by Professor Alice Parker and Professor Chongwu Zhou in the USC Viterbi School of Engineering's Ming Hsieh Department of Electrical Engineering, used an interdisciplinary approach combining circuit design with nanotechnology to address the complex problem of capturing brain function.

In a paper published in the proceedings of the IEEE/NIH 2011 Life Science Systems and Applications Workshop in April 2011, the Viterbi team detailed how they were able to use carbon nanotubes to create a synapse.

Carbon nanotubes are molecular carbon structures that are extremely small, with a diameter a million times smaller than a pencil point. These nanotubes can be used in electronic circuits, acting as metallic conductors or semiconductors.

"This is a necessary first step in the process," said Parker, who began the looking at the possibility of developing a synthetic brain in 2006. "We wanted to answer the question: Can you build a circuit that would act like a neuron? The next step is even more complex. How can we build structures out of these circuits that mimic the function of the brain, which has 100 billion neurons and 10,000 synapses per neuron?"

Parker emphasized that the actual development of a synthetic brain, or even of a functional brain area, is decades away, and she said the next hurdle for the research centers on reproducing brain plasticity in the circuits.

The human brain continually produces new neurons, makes new connections and adapts throughout life, and creating this process through analog circuits will be a monumental task, according to Parker.

She believes this ongoing research into the workings of human intelligence could have long-term implications for everything from prosthetic nanotechnology that would heal traumatic brain injuries to intelligent, safe cars that would protect drivers in bold new ways.

Read more...

Columbia Engineering Study Links Ozone Hole To Climate Change All The Way To The Equator

In a study to be published in the April 21st issue of Science magazine, researchers at Columbia University's School of Engineering and Applied Science report their findings that the ozone hole, which is located over the South Pole, has affected the entire circulation of the Southern Hemisphere all the way to the equator. While previous work has shown that the ozone hole is changing the atmospheric flow in the high latitudes, the Columbia Engineering paper, "Impact of Polar Ozone Depletion on Subtropical Precipitation," demonstrates that the ozone hole is able to influence the tropical circulation and increase rainfall at low latitudes in the Southern Hemisphere. This is the first time that ozone depletion, an upper atmospheric phenomenon confined to the polar regions, has been linked to climate change from the Pole to the equator. "The ozone hole is not even mentioned in the summary for policymakers issued with the last IPCC report," noted Lorenzo M. Polvani, Professor of Applied Mathematics and of Earth & Environmental Sciences, Senior Research Scientist at the Lamont-Doherty Earth Observatory, and co-author of the paper. "We show in this study that it has large and far-reaching impacts. The ozone hole is a big player in the climate system!"

"It's really amazing that the ozone hole, located so high up in the atmosphere over Antarctica, can have an impact all the way to the tropics and affect rainfall there — it's just like a domino effect," said Sarah Kang, Postdoctoral Research Scientist in Columbia Engineering's Department of Applied Physics and Applied Mathematics and lead author of the paper.

The ozone hole is now widely believed to have been the dominant agent of atmospheric circulation changes in the Southern Hemisphere in the last half century. This means, according to Polvani and Kang, that international agreements about mitigating climate change cannot be confined to dealing with carbon alone; ozone needs to be considered, too. "This could be a real game-changer," Polvani added.

Located in the Earth's stratosphere, just above the troposphere (which begins at Earth's surface), the ozone layer absorbs most of the Sun's harmful ultraviolet rays. Over the last half-century, widespread use of manmade compounds, especially household and commercial aerosols containing chlorofluorocarbons (CFCs), significantly and rapidly broke down the ozone layer, to the point that a hole in the Antarctic ozone layer was discovered in the mid-1980s. Thanks to the 1989 Montreal Protocol, now signed by 196 countries, global CFC production has been phased out. As a result, scientists have observed over the past decade that ozone depletion has largely halted; they now expect it to reverse fully, with the ozone hole closing by midcentury.

But, as Polvani has said, "While the ozone hole has been considered as a solved problem, we're now finding it has caused a great deal of the climate change that's been observed." So, even though CFCs are no longer being added to the atmosphere, and the ozone layer will recover in the coming decades, the closing of the ozone hole will have a considerable impact on climate. This shows that through international treaties such as the Montreal Protocol, which has been called the single most successful international agreement to date, human beings are able to make changes to the climate system.

Read more...

Researchers Find Fat Turns Into Soap In Sewers, Contributes To Overflows

Researchers from North Carolina State University have discovered how fat, oil and grease (FOG) can create hardened deposits in sewer lines: it turns into soap! The hardened deposits, which can look like stalactites, contribute to sewer overflows. "We found that FOG deposits in sewage collection systems are created by chemical reactions that turn the fatty acids from FOG into, basically, a huge lump of soap," says Dr. Joel Ducoste, a professor of civil, construction and environmental engineering at NC State and co-author of a paper describing the research. Collection systems are the pipes and pumping stations that carry wastewater from homes and businesses to sewage-treatment facilities.

These hardened FOG deposits reduce the flow of wastewater in the pipes, contributing to sewer overflows – which can cause environmental and public-health problems and lead to costly fines and repairs.

The research team used a technique called Fourier Transform Infrared (FTIR) spectroscopy to determine what the FOG deposits were made of at the molecular level. FTIR spectroscopy illuminates a sample with infrared light at various wavelengths. Different molecular bonds vibrate in response to different wavelengths. By measuring which infrared wavelengths created vibrations in their FOG samples, the researchers were able to determine each sample's molecular composition.
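In outline, that interpretation step is a matter of matching measured peaks against known band assignments. A rough sketch follows; the wavenumber table is an approximate textbook subset, not the NC State team's calibration data:

```python
# Rough, illustrative sketch of FTIR interpretation: match measured
# absorption peaks to molecular-bond band assignments. The reference
# wavenumbers are approximate textbook values, not the authors' data.
REFERENCE_BANDS = {   # wavenumber (cm^-1) -> bond assignment
    2920: "C-H stretch (alkyl chains in fats)",
    1700: "C=O stretch (free fatty acids)",
    1540: "COO- stretch (carboxylate salts, i.e. soaps)",
}

def assign_peaks(measured_peaks, tolerance=30):
    """Map each measured peak to a known band within the tolerance."""
    return {
        peak: bond
        for peak in measured_peaks
        for ref, bond in REFERENCE_BANDS.items()
        if abs(peak - ref) <= tolerance
    }

# A deposit spectrum dominated by carboxylate bands points to calcium
# fatty-acid salts -- in other words, soap.
print(assign_peaks([2918, 1545]))
```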

Using this technique, researchers confirmed that the hardened deposits were made of calcium-based fatty acid salts – or soap.

"FOG itself cannot create these deposits," Ducoste says. "The FOG must first be broken down into its constituent parts: glycerol and free fatty acids. These free fatty acids – specifically, saturated fatty acids – can react with calcium in the sewage collection system to form the hardened deposits.

Read more...

Stanford Research Moves Nanomedicine One Step Closer To Reality

A class of engineered nanoparticles — gold-centered spheres smaller than viruses — has been shown to be safe when administered by two alternative routes in a mouse study led by investigators at the Stanford University School of Medicine. This marks the first step up the ladder of toxicology studies that, within a year and a half, could lead to human trials of the tiny agents for detection of colorectal and possibly other cancers. "These nanoparticles' lack of toxicity in mice is a good sign that they'll behave well in humans," said Sanjiv Sam Gambhir, MD, PhD, professor of radiology and senior author of the study, which will be published April 20 as the featured paper in Science Translational Medicine.

"Early detection of any cancer, including colorectal cancer, markedly improves survival," said Gambhir. For example, the widespread use of colonoscopy has significantly lowered colon-cancer mortality rates, he said. "But colonoscopy relies on the human eye. So this screening tool, while extremely useful, still misses many cancer lesions such as those that are too tiny, obscure or flat to be noticed."

A promising way to catch cancer lesions early is to employ molecular reporters that are attracted to cancer-lesion sites. One method in use involves fluorescent dyes coupled with antibodies that recognize and bind to surface features of cancer cells.

But that approach has its drawbacks, said Gambhir, who is the director of the Molecular Imaging Program at Stanford. The body's own tissues also fluoresce slightly, complicating attempts to pinpoint tumor sites. Plus, the restricted range of colors at which antibody-affixed dyes fluoresce limits the number of different tumor-associated features that can be simultaneously identified. Some versions of this approach have also proved toxic to cells.

The new study is the first-ever successful demonstration of the safety of a new class of agents: tiny gold balls that have been coated with materials designed to be detected with very high sensitivity, then encased in see-through silica shells and bound to polyethylene glycol molecules to make them more biologically friendly. Molecules that home in on cancer cells can be affixed to them. The resulting nanoparticles measure a mere 100 nanometers in diameter.

The materials surrounding the nanoparticles' gold centers have special, if subtle, optical properties. Typically, light bounces off a material's surface at the same wavelength it had when it hit the surface. But in each of the specialized materials, about one ten-millionth of the incoming light bounces back in a pattern of discrete wavelengths characteristic of that material. The underlying gold cores have been roughened in a manner that greatly amplifies this so-called "Raman effect," allowing the simultaneous detection of many different imaging materials by a sensitive instrument called a Raman microscope.

Read more...

Laser Sparks Revolution In Internal Combustion Engines

For more than 150 years, spark plugs have powered internal combustion engines. Automakers are now one step closer to being able to replace this long-standing technology with laser igniters, which will enable cleaner, more efficient, and more economical vehicles. In the past, lasers strong enough to ignite an engine's air-fuel mixtures were too large to fit under an automobile's hood. At this year's Conference on Lasers and Electro-Optics (CLEO: 2011), to be held in Baltimore May 1-6, researchers from Japan will describe the first multibeam laser system small enough to screw into an engine's cylinder head.

Equally significant, the new laser system is made from ceramics, and could be produced inexpensively in large volumes, according to one of the presentation's authors, Takunori Taira of Japan's National Institutes of Natural Sciences.

According to Taira, conventional spark plugs pose a barrier to improving fuel economy and reducing emissions of nitrogen oxides (NOx), a key component of smog.

Spark plugs work by sending small, high-voltage electrical sparks across a gap between two metal electrodes. The spark ignites the air-fuel mixture in the engine's cylinder—producing a controlled explosion that forces the piston down to the bottom of the cylinder, generating the horsepower needed to move the vehicle.

Engines produce NOx as a byproduct of combustion. If engines ran leaner – burned more air and less fuel – they would produce significantly lower NOx emissions.
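What "leaner" means can be made precise with a single number, the excess-air ratio; a quick illustrative calculation, assuming the commonly cited 14.7:1 stoichiometric air-fuel mass ratio for gasoline (exact values vary by fuel blend):

```python
# "Leaner" quantified: lambda is the actual air-fuel ratio divided by
# the stoichiometric one. The ~14.7:1 mass ratio is the commonly cited
# figure for gasoline; exact values depend on the fuel blend.
AFR_STOICH = 14.7

def excess_air_ratio(air_mass, fuel_mass):
    """Return lambda; 1.0 is stoichiometric, above 1.0 is lean."""
    return (air_mass / fuel_mass) / AFR_STOICH

print(excess_air_ratio(14.7, 1.0))  # 1.0  -> stoichiometric
print(excess_air_ratio(22.0, 1.0))  # ~1.5 -> lean: less NOx, but hard
                                    # for a spark plug to ignite reliably
```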

Spark plugs can ignite leaner fuel mixtures, but only by increasing spark energy. Unfortunately, these high voltages erode spark plug electrodes so quickly that the approach is not economical. By contrast, lasers, which ignite the air-fuel mixture with concentrated optical energy, have no electrodes to erode.

Lasers also improve efficiency. Conventional spark plugs sit on top of the cylinder and only ignite the air-fuel mixture close to them. The relatively cold metal of nearby electrodes and cylinder walls absorbs heat from the explosion, quenching the flame front just as it starts to expand.

Read more...

Singapore's First Locally Made Satellite Launched Into Space

X-SAT lifted off on board India's Polar Satellite Launch Vehicle PSLV-C16 at 10.12am Indian Standard Time (12.42pm, Singapore time) on 20 April 2011. The X-SAT, developed and built by Singapore's Nanyang Technological University (NTU) in collaboration with DSO National Laboratories, was launched from the Satish Dhawan Space Centre at Sriharikota in Andhra Pradesh, India.

The wholly made-in-Singapore satellite was one of the two "piggyback" mission satellites loaded on the PSLV-C16 rocket owned by the Indian Space Research Organisation. The PSLV-C16 successfully inserted the X-SAT into its planned orbit around the Earth.

"We are delighted with the successful launch of Singapore's first experimental micro-satellite into space. This represents a huge leap for our local research and development endeavours in space technology and building micro-satellites," said NTU President Dr Su Guaning. "I congratulate the X-SAT team of scientists, researchers and students. Their perseverance in the quest for new knowledge in space technology is most admirable and commendable."

"The sky is not the limit. There are enormous amounts in the world of science and technology that have not been explored and our academics are continually exploring and pushing the boundaries. We hope that the successful launch of X-SAT will excite and inspire more of our youths to take up engineering, and possibly venture into space technology," said Dr Su.

The NTU team members involved in the X-SAT project are currently trying to establish communication contact with the satellite from the Mission Control Station at NTU's Research Techno Plaza. Once contact with X-SAT is established, an initial health status of the satellite will be ascertained and confirmed.

Read more...

Contemporary Climate Change Alters The Pace and Drivers Of Extinction

Local extinction rates of American pikas have increased nearly five-fold in the last 10 years, and the rate at which the climate-sensitive species is moving up mountain slopes has increased 11-fold since the 20th century, according to a study soon to be published in Global Change Biology. The research strongly suggests that the American pika's distribution throughout the Great Basin is changing at an increasingly rapid rate.

The pika (Ochotona princeps), a small, hamster-like animal sensitive to climate, occurs commonly in rocky talus slopes and lava flows throughout the western U.S. The study demonstrates a dramatic shift in the range of this rabbit relative, and illustrates the increasingly important role of climate in the loss of local pika populations across the nearly 150,000 square miles of the hydrological Great Basin.

The authors investigated 110 years of data on pika distribution and 62 years of data on regional climate, first describing the patterns of local pika loss and then examining the strength of evidence for multiple competing hypotheses about why the losses are occurring. They found that among 25 sites in the Basin with 20th-century records of pikas, a species dependent on cool, high-mountain habitats, nearly half (four of ten) of the local pika extinctions have occurred after 1999. In addition, since 1999 the animals have been moving up mountain slopes at an average (Basin-wide) rate of about 145 m (475 feet) per decade, compared with an estimated Basin-wide average of about 13 m per decade during the 20th century. In contrast, a recent (2003) review found that, worldwide, species demonstrating distributional shifts averaged upslope movement of 6.1 m per decade.

The species does not seem to be losing ground everywhere across its geographic range, but at least in the Great Basin it may be one of a group of species that can act as 'early-warning' indicators of how distributions of species may shift in the future.

The study's most novel scientific contribution is the finding that the factors apparently driving the local-extinction process were strongly different during the 20th century than during 1999-2008. This may mean that knowledge of the past population dynamics of a particular species will not always help researchers predict how and why its distribution changes in the future. That is, the rules of the 'extinction game' seem to be shifting.

This study was distinctive in that it relied upon fieldwork across an entire region rather than at just a few sites; had temperature data from the talus spaces that were previously or currently occupied by pikas (rather than temperatures estimated from weather recorders far from the study sites); and had three periods of data collection, which allowed for comparison of dynamics during the two intervening periods.

Unlike most other mammals that have attracted management and conservation attention in the past, pikas are not widely hunted, don't require large areas of habitat for their individual home ranges, and live in remote high-elevation areas that experience a smaller array of land uses than other species face. Additionally, with a few localized exceptions, these pika losses have occurred without significant change in the amount or geographic arrangement of their rocky talus habitat. Habitat loss or degradation has typically been the most common cause of species decline, not only in mammals but among all animals.

In addition to being sentinels, pikas are important because they are food for an array of animals, and, as the 'ecosystem engineers' they are, their presence affects local plant composition and nutrient distributions.

Read more...

Researchers Create Elastic Material That Changes Color In UV Light

Researchers from North Carolina State University have created a range of soft, elastic gels that change color when exposed to ultraviolet (UV) light – and change back when the UV light is removed or the material is heated up. The gels are impregnated with a type of photochromic compound called spiropyran. Spiropyrans change color when exposed to UV light, and the color they change into depends on the chemical environment surrounding the material.

The researchers made the gels out of an elastic silicone substance, which can be chemically modified to contain various other chemical compounds – changing the chemical environment inside the material. Changing this interior chemistry allows researchers to fine-tune how the color of the material changes when exposed to UV light.

"For example, if you want the material to turn yellow when exposed to UV light, you would attach carboxylic acid," explains Dr. Jan Genzer, Celanese Professor of Chemical and Biomolecular Engineering at NC State and co-author of a paper describing the research. "If you want magenta, you'd attach hydroxyl. Mix them together, and you get a shade of orange."

Photochromic compounds are not new, but this is the first time they have been incorporated into an elastic material without impairing the material's elasticity.

The researchers were also able to create patterns by using a shaped mold to change the chemical make-up of specific regions in the material. For example, applying hydroxyl around a star-shaped mold (like a tiny cookie cutter) on the material would result in a yellow star-shaped pattern appearing on a dark magenta elastic when it is exposed to UV light.

"There are surely applications for this material – it's flexible, changes color in UV light, reverts to its original color in visible light, and can be patterned," Genzer says. "At this stage we have not identified the best application yet."

Read more...

Search For Weapons Of Mass Destruction Expands To East Africa

The United States government is expanding a 20-year-old program to secure and help destroy Cold War-era nuclear and other weapons of mass destruction (WMD) to an unlikely area of the world: East Africa, according to an article in the current edition of Chemical & Engineering News (C&EN), ACS's weekly newsmagazine. In the article, Glenn Hess, C&EN Senior Editor, explains that the focus of the Cooperative Threat Reduction Program (CTR) does not stem from any new intelligence indicating that Kenya, Tanzania, and Uganda have secretly developed nuclear weapons. Rather, it is part of a realization that some of the world's deadliest pathogens – the Ebola, Marburg and Rift Valley fever viruses – occur naturally in Africa, a volatile region where political instability raises concerns about terrorist use of those deadly microbes.

The article describes how CTR, which began in the former Soviet Union, is now being expanded to confront the threat of bioterrorism in other regions of the world. Sen. Richard G. Lugar (R-IN), former chair of the Senate Foreign Relations Committee, who co-sponsored legislation creating CTR in 1991, visited Kenya and Uganda last fall. "Just one of the deadly viruses I witnessed could, if in the wrong hands, cause death and economic chaos," he said.

Read more...

Making Temporary Changes To Brain Could Speed Up Learning, Study Reports

In a breakthrough that may aid treatment of learning impairments, strokes, tinnitus and chronic pain, UT Dallas researchers have found that brain nerve stimulation accelerates learning in laboratory tests. Another major finding of the study, published in the April 14 issue of Neuron, involves the positive changes detected after stimulation and learning were complete. Researchers monitoring brain activity in rats found that brain responses eventually returned to their pre-stimulation state, but the animals could still perform the learned task. These findings have allowed researchers to better understand how the brain learns and encodes new skills.

Previous studies showed that people and animals that practice a task experience major changes in their brains. Learning to read Braille with a single finger leads to increased brain responses to the trained digit. Learning to discriminate among a set of tones leads to increased brain responses to the trained tones.

But it was not clear whether these changes are just coincidence or whether they truly help with learning. The current research demonstrates that changes in the brain are meaningful and not merely coincidental, said Dr. Amanda Reed, who wrote the article with colleagues from The University of Texas at Dallas' School of Behavioral and Brain Sciences.

Reed and her fellow researchers used brain stimulation to release neurotransmitters that caused the brain to increase its response to a small set of tones. The team found that this increase allowed rats to learn to perform a task using these tones more quickly than animals that had not received stimulation. This finding provides the first direct evidence that a larger brain response can aid learning.

Future treatments that enhance large changes in the brain may also assist with recovery from stroke or learning disabilities. In addition, some brain disorders, such as tinnitus and chronic pain, occur when large-scale brain changes fail to reverse. So this new understanding of how the brain learns may lead to better treatments for these conditions.

Researchers examined the laboratory animals' brains again after the rats had practiced their learned task for a few weeks. The brains appeared to have returned to normal, even though the animals had not forgotten how to perform the task they had learned. This means that, although large changes in the brain were helpful for initial learning, those changes did not have to be permanent, Reed wrote.

"We think that this process of expanding the brain responses during learning and then contracting them back down after learning is complete may help animals and people to be able to perform many different tasks with a high level of skill," Reed said. "So for example, this may explain why people can learn a new skill like painting or playing the piano without sacrificing their ability to tie their shoes or type on a computer."

The study by Reed and colleagues supports a theory that large-scale brain changes are not directly responsible for learning, but accelerate learning by creating an expanded pool of neurons from which the brain can select the most efficient, small "network" to accomplish the new skill.

This new view of the brain can be compared to an economy or an ecosystem, rather than a computer, Reed said. Computer networks are designed by engineers and operate using a finite set of rules and solutions to solve problems. The brain, like other natural systems, works by trial and error.

The first step of learning is to create a large set of diverse neurons that are activated by doing the new skill. The second step is to identify a small subset of neurons that can accomplish the necessary computation and return the rest of the neurons to their previous state, so they can be used to learn the next new skill.
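A loose computational analogy for that two-step account (purely illustrative, with no claim that real neurons search this way) is a broad random recruitment followed by pruning:

```python
import random

# Loose computational analogy for the expand-then-prune account above;
# purely illustrative, not a model of real neurons.
def learn_skill(pool, subset_size=5, trials=1000):
    # Step 1: recruit a large, diverse set of candidate "neurons."
    candidates = random.sample(pool, k=min(50, len(pool)))
    # Step 2: search for a small subset that performs the task best;
    # the rest return to the pool for learning the next skill.
    best = max(
        (random.sample(candidates, subset_size) for _ in range(trials)),
        key=sum,  # stand-in "task performance" score
    )
    return best

pool = [random.random() for _ in range(500)]
print(learn_skill(pool))  # the small, efficient "network" that remains
```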

Read more...

Saturday, August 27, 2011

Researcher To Present Discoveries On Medical Uses Of Ultrasound To London's Royal Society

Jamie Tyler, assistant professor in the Virginia Tech Carilion Research Institute and the Virginia Tech-Wake Forest University School of Biomedical Engineering and Sciences, has been invited to speak at a Royal Society of London high level workshop on May 11-12 on the security implications of advances in neuroscience. The workshop is part of a four-part policy study on neuroscience and society called Brain Waves. This third module, entitled Neuroscience, conflict and security focuses on the international security implications and associated policy issues related to applications of advances in neuroscience and neurotechnology to the enhancement, manipulation, or degradation of human performance. The project is overseen by a Royal Society Working Group.

The technical sessions of the roundtable will focus on neuropharmacology, functional neuroimaging, and neural interfaces. As part of the Mind Machine Interface session, Tyler will speak on the potential medical uses of, and concerns raised by, emerging technologies such as ultrasound and transcranial magnetic stimulation for selectively activating or inactivating populations of dysfunctional nerve cells within the brain, with the aim of developing effective non-invasive therapies for neurological and psychiatric disorders.

"These are important discussions about mechanisms, applications, safety and policies around powerful new approaches for modulating electrical activity in the brain. This is an emerging area of interest at the interface of science and society and we are enthusiastic to have one of our leading biomedical researchers in neurotechnology, Dr. Tyler, playing a key role in this dialogue," said Michael Friedlander, director of the Virginia Tech Carilion Research Institute.

Read more...

New Biosensor Microchip Could Speed Up Drug Development, Stanford Researchers Say

Stanford researchers have developed a new biosensor microchip that could significantly speed up the process of drug development. The microchips, packed with highly sensitive "nanosensors," analyze how proteins bind to one another, a critical step for evaluating the effectiveness and possible side effects of a potential medication. A single centimeter-sized array of the nanosensors can simultaneously and continuously monitor thousands of times more protein-binding events than any existing sensor. The new sensor is also able to detect interactions with greater sensitivity and deliver the results significantly faster than the present "gold standard" method.

"You can fit thousands, even tens of thousands, of different proteins of interest on the same chip and run the protein-binding experiments in one shot," said Shan Wang, a professor of materials science and engineering, and of electrical engineering, who led the research effort.

"In theory, in one test, you could look at a drug's affinity for every protein in the human body," said Richard Gaster, MD/PhD candidate in bioengineering and medicine, who is the first author of a paper describing the research that was published online this month by Nature Nanotechnology.

The power of the nanosensor array lies in two advances. First, the use of magnetic nanotags attached to the protein being studied – such as a medication – greatly increases the sensitivity of the monitoring.

Second, an analytical model the researchers developed enables them to accurately predict the final outcome of an interaction based on only a few minutes of monitoring data. Current techniques typically monitor no more than four simultaneous interactions and the process can take hours.
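The endpoint-prediction idea can be illustrated with the standard first-order binding model, in which the signal approaches saturation exponentially, so a fit to the early transient pins down the asymptote. This is a generic sketch, not the Stanford group's actual model:

```python
import numpy as np
from scipy.optimize import curve_fit

# Generic illustration (not the Stanford model): if binding follows
# first-order kinetics, the signal saturates as
# S(t) = S_max * (1 - exp(-k*t)), so a fit to the first few minutes
# of data predicts the equilibrium value S_max.
def binding_signal(t, s_max, k):
    return s_max * (1.0 - np.exp(-k * t))

rng = np.random.default_rng(0)
t_early = np.linspace(0.0, 5.0, 50)               # first 5 minutes
signal = binding_signal(t_early, 100.0, 0.3)
signal += rng.normal(0.0, 1.0, t_early.size)      # simulated sensor noise

params, _ = curve_fit(binding_signal, t_early, signal, p0=(50.0, 0.1))
print(f"predicted equilibrium signal: {params[0]:.1f}")  # close to 100
```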

"I think their technology has the potential to revolutionize how we do bioassays," said P.J. Utz, associate professor of medicine (immunology and rheumatology) at Stanford University Medical Center, who was not involved in the research.

Members of Wang's research group developed the magnetic nanosensor technology several years ago and demonstrated its sensitivity in experiments in which they showed that it could detect a cancer-associated protein biomarker in mouse blood at a thousandth of the concentration that commercially available techniques could detect. That research was described in a 2009 paper in Nature Medicine.

Read more...

Miniature Invisibility 'Carpet Cloak' Hides More Than Its Small Size Implies

Invisibility cloaks are seemingly futuristic devices capable of concealing very small objects by bending and channeling light around them. Until now, however, cloaking techniques have come with a significant limitation—they need to be orders of magnitude larger than the object being cloaked. This places serious constraints on practical applications, particularly for the optoelectronics industry, where size is at a premium and any cloaking device would need to be both tiny and delicate.

An international team of physicists from the Technical University of Denmark (DTU), the University of Birmingham, UK, and Imperial College London, however, may have overcome this size limitation by using a technology known as a "carpet cloak," which can conceal a much larger area than other cloaking techniques of comparable size. The researchers achieved their result by using metamaterials, artificial materials engineered to have optical properties not found in nature. They describe their approach in the Optical Society's (OSA) open-access journal Optics Express.

Jingjing Zhang, a postdoctoral researcher with the Structured Electromagnetic Materials group at DTU Fotonik, the Department of Photonics Engineering, and an author of the Optics Express paper, explains that the team's new carpet cloak, which is based on an alternating-layer structure on a silicon-on-insulator (SOI) platform, introduces a flexible way to address the size problem.

"This new cloak, consisting of metamaterials, was designed with a grating structure that is simpler than previous metamaterial structures for cloaks," she says.

Grating structures channel light of a particular wavelength around an object. A grating structure is simply a series of slits or openings that redirect a beam of light.

"The highly anisotropic material comprising the cloak is obtained by adopting semiconductor manufacturing techniques that involve patterning the top silicon layer of an SOI wafer with nanogratings of appropriate filling factor. This leads to a cloak only a few times larger than the cloaked object," says Zhang. In this case, filling factor simply refers to the size of the grating structure and determines the wavelengths of light that are affected by the cloak.

By precisely restoring the path of the wave reflecting from the surface, the cloak creates the illusion of a flat plane for a triangular bump on the surface, hiding its presence over wavelengths ranging from 1480 nm to 1580 nm.

Read more...

Learn To Run A Biorefinery In A Virtual Control Room Developed By Iowa State Researchers

David Grewell flipped on the augers that carry corn from a truck to a biorefinery. Then, with a few more clicks of his computer mouse, he turned on the pumps that send grain all the way through an ethanol plant, from storage to hammer mill to slurry tanks to jet cooker to liquefaction, fermentation, distillation, water separation and ultimately to ethanol storage.

Don't forget the centrifuges, evaporators and driers that recover distillers grains for livestock feed.

All of this happened in a small office on the north side of the Food Sciences Building and the Center for Crops Utilization Research at Iowa State University. Grewell, an associate professor of agricultural and biosystems engineering, calls his virtual control room "Nintendo for biofuel nerds."

But I-BOS (the Interactive Biorefinery Operations Simulator) is no game. It's based on real Iowa biorefineries that are producing ethanol and biodiesel. It's designed to help students in Iowa State's biorenewable resources and technology program learn about biofuel production. And it could be used by the biofuel industry to help train employees to operate a biorefinery.

"This could be the major component of a curriculum for teaching biofuels operators how to run a plant," Grewell said. "It's like a flight simulator for pilots."

And like a good flight simulator, the virtual control room is calibrated to match real-world performance. It's based on differential equations that describe the fundamental transport phenomena and incorporate the principles of mass and energy conservation. The simulations also take into account more than 20 specific production attributes, including moisture, starch content, contaminants, temperature and particle size. All the attributes change as biomass is converted into biofuel. And they can be changed by instructors, giving students experience with a variety of production conditions.
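A minimal example of the conservation-law bookkeeping underneath such a simulator, reduced to a single well-mixed tank (the numbers are invented for illustration; I-BOS couples many such units and adds energy balances and composition attributes):

```python
# Minimal flavor of the simulator's physics: a mass balance on one
# well-mixed tank, dM/dt = m_in - m_out, integrated forward in time.
def simulate_tank(mass0, m_in, m_out, dt=1.0, steps=60):
    mass, history = mass0, []
    for _ in range(steps):            # forward-Euler time stepping
        mass += (m_in - m_out) * dt   # conservation of mass
        history.append(mass)
    return history

levels = simulate_tank(mass0=5000.0, m_in=40.0, m_out=35.0)  # kg, kg/min
print(f"slurry tank after 1 h: {levels[-1]:.0f} kg")         # 5300 kg
```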

The virtual control room is now written to simulate the operation of ethanol and biodiesel plants. It keeps track of energy consumption, production efficiency and fuel quality. It also features interactive video clips from real biofuel plants that give students a good look at the entire production process.

It can also give them an inside look at a plant emergency. The virtual control room, for example, can simulate a fire in an ethanol plant's distillation column, right down to a red emergency light flashing on the control room wall.

"Students will have to respond to the fire and learn what to turn off to minimize and contain the damage," Grewell said.

Read more...

Pitt-led Researchers Create Super-small Transistor, Artificial Atom Powered By Single Electrons

A University of Pittsburgh-led team has created a single-electron transistor that provides a building block for new, more powerful computer memories, advanced electronic materials, and the basic components of quantum computers. The researchers report in Nature Nanotechnology that the transistor's central component—an island only 1.5 nanometers in diameter—operates with the addition of only one or two electrons. That capability would make the transistor important to a range of computational applications, from ultradense memories to quantum processors, powerful devices that promise to solve problems so complex that all of the world's computers working together for billions of years could not crack them.

In addition, the tiny central island could be used as an artificial atom for developing new classes of artificial electronic materials, such as exotic superconductors with properties not found in natural materials, explained lead researcher Jeremy Levy, a professor of physics and astronomy in Pitt's School of Arts and Sciences. Levy worked with lead author and Pitt physics and astronomy graduate student Guanglei Cheng, as well as with Pitt physics and astronomy researchers Feng Bi, Daniela Bogorin, and Cheng Cen. The Pitt researchers worked with a team from the University of Wisconsin at Madison led by materials science and engineering professor Chang-Beom Eom, including research associates Chung Wun Bark, Jae-Wan Park, and Chad Folkman. Also part of the team were Gilberto Medeiros-Ribeiro, of HP Labs, and Pablo F. Siles, a doctoral student at the State University of Campinas in Brazil.

Levy and his colleagues named their device SketchSET, or sketch-based single-electron transistor, after a technique developed in Levy's lab in 2008 that works like a microscopic Etch A Sketch™, the drawing toy that inspired the idea. Using the sharp conducting probe of an atomic force microscope, Levy can create such electronic devices as wires and transistors of nanometer dimensions at the interface of a crystal of strontium titanate and a 1.2-nanometer-thick layer of lanthanum aluminate. The electronic devices can then be erased and the interface used anew.

The SketchSET—which is the first single-electron transistor made entirely of oxide-based materials—consists of an island formation that can house up to two electrons. The number of electrons on the island—which can be only zero, one, or two—results in distinct conductive properties. Wires extending from the transistor carry additional electrons across the island.

One virtue of a single-electron transistor is its extreme sensitivity to an electric charge, Levy explained. Another property of these oxide materials is ferroelectricity, which allows the transistor to act as a solid-state memory. The ferroelectric state can, in the absence of external power, control the number of electrons on the island, which in turn can be used to represent the 1 or 0 state of a memory element. A computer memory based on this property would be able to retain information even when the processor itself is powered down, Levy said. The ferroelectric state also is expected to be sensitive to small pressure changes at nanometer scales, making this device potentially useful as a nanoscale charge and force sensor.
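That single-electron sensitivity follows from the island's minuscule capacitance: adding even one electron costs a charging energy far larger than thermal energy. A rough estimate from the simplest possible model, an isolated 1.5-nanometer sphere (the real island sits at an oxide interface, so its actual capacitance will differ):

```python
import math

# Rough Coulomb-blockade estimate, modeling the 1.5 nm island as an
# isolated sphere in vacuum -- a deliberate oversimplification.
e = 1.602e-19      # electron charge, C
eps0 = 8.854e-12   # vacuum permittivity, F/m
kB = 1.381e-23     # Boltzmann constant, J/K

radius = 0.75e-9                   # half of the 1.5 nm diameter, m
C = 4 * math.pi * eps0 * radius    # self-capacitance of a sphere
E_c = e**2 / (2 * C)               # energy to add one electron

print(f"E_C ~ {E_c / e:.2f} eV")               # ~1 eV
print(f"kT at 300 K ~ {kB * 300 / e:.3f} eV")  # ~0.026 eV
# E_C >> kT: single-electron charging events stand out sharply.
```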

Read more...

For Testing Skin Cream, Synthetic Skin May Be As Good As The Real Thing

New research suggests that currently available types of synthetic skin may now be good enough to stand in for animal skin in laboratory tests, and may be on their way to truly simulating human skin in the future. Researchers compared the responses of synthetic skins and rat skin when both were exposed to a generic skin cream, and the results indicated that they reacted similarly.

The scientists used high-resolution images of two types of synthetic skin and samples of rat skin to discover similarities on microscopic scales.

The findings have implications for the treatment of burn victims.

When a person's body is severely burned, there may not be enough healthy skin remaining to heal the burns through regeneration of the patient's own skin cells. In this case, synthetic skin or animal skin provides a potential substitute. But the use of animal skin comes with a variety of problems.

"In addition to ethical issues, animal skin is hard to obtain, expensive, and gives highly variable results because of individual skin variability," said Bharat Bhushan, Ohio Eminent Scholar and the Howard D. Winbigler Professor of mechanical engineering at Ohio State University.

"Animal skin will vary from animal to animal, which makes it hard to anticipate how it might affect burnt victims, individually," Bhushan said. "But, synthetic skin's composition is consistent, making it a more reliable product," he continued.

Bhushan's research will appear in the June 5 issue of the Journal of Applied Polymer Science.

Bhushan and his colleague Wei Tang, an engineer at China University of Mining and Technology, compared two different types of synthetic skin to rat skin. The first synthetic skin was a commercially available skin purchased from Smooth-On, Inc. of Easton, Pennsylvania. The second synthetic skin was produced in Bhushan's lab. Ohio State's University Lab Animal Resources provided the rat skin samples.

Whether a synthetic skin feels and acts like real skin is very important, Bhushan explained. The skin must stand up to environmental effects such as sunlight or rain, while maintaining its texture and consistency. Scientists have continued to improve the practical and aesthetic properties of synthetic skin, which suggests it may soon be ready to replace animal skin and, farther in the future, human skin.

Read more...

Report Cites 'Liquefaction' As Key To Much Of Japanese Earthquake Damage

The massive subduction zone earthquake in Japan caused a significant level of soil "liquefaction" that has surprised researchers with its widespread severity, a new analysis shows. The findings also raise questions about whether existing building codes and engineering technologies are adequately accounting for this phenomenon in other vulnerable locations, which in the U.S. include Portland, Ore., parts of the Willamette Valley and other areas of Oregon, Washington and California.

A preliminary report about some of the damage in Japan has just been completed by the Geotechnical Extreme Events Reconnaissance (GEER) advance team, in work supported by the National Science Foundation.

The broad geographic extent of the liquefaction over hundreds of miles was daunting to experienced engineers who are accustomed to seeing disaster sites, including the recent earthquakes in Chile and New Zealand.

"We've seen localized examples of soil liquefaction as extreme as this before, but the distance and extent of damage in Japan were unusually severe," said Scott Ashford, a professor of geotechnical engineering at Oregon State University and a member of this research team.

"Entire structures were tilted and sinking into the sediments, even while they remained intact," Ashford said. "The shifts in soil destroyed water, sewer and gas pipelines, crippling the utilities and infrastructure these communities need to function. We saw some places that sank as much as four feet."

Some degree of soil liquefaction is common in almost any major earthquake. It is a phenomenon in which saturated soils, particularly recent sediments, sand, gravel or fill, can lose much of their strength and flow during an earthquake. This can allow structures to shift or sink and can significantly magnify the structural damage produced by the shaking itself.
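Engineers screen for this hazard by comparing the earthquake's cyclic demand on the soil against the soil's cyclic resistance. The widely used simplified demand expression of Seed and Idriss is sketched below; the inputs are illustrative, not data from the GEER report:

```python
# Screening-level liquefaction demand via the widely used simplified
# procedure of Seed & Idriss:
#   CSR = 0.65 * (a_max/g) * (sigma_v / sigma'_v) * r_d
# Inputs below are illustrative, not GEER-report values.
def cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, r_d):
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * r_d

csr = cyclic_stress_ratio(
    a_max_g=0.4,        # peak ground acceleration, as a fraction of g
    sigma_v=100.0,      # total vertical stress at depth, kPa
    sigma_v_eff=50.0,   # effective vertical stress (saturated soil), kPa
    r_d=0.95,           # depth-dependent stress-reduction factor
)
print(f"CSR = {csr:.2f}")  # compared against the soil's cyclic resistance
                           # ratio (CRR); CSR > CRR flags liquefaction risk
```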

Read more...

Sandia and UNM Lead Effort To Destroy Cancers

Melding nanotechnology and medical research, Sandia National Laboratories, the University of New Mexico, and the UNM Cancer Research and Treatment Center have produced an effective strategy that uses nanoparticles to blast cancerous cells with a mélange of killer drugs. In the cover article of the May issue of Nature Materials, available online April 17, the researchers describe silica nanoparticles, about 150 nanometers in diameter, honeycombed with cavities that can store large amounts and varieties of drugs.

"The enormous capacity of the nanoporous core, with its high surface area, combined with the improved targeting of an encapsulating lipid bilayer [called a liposome], permit a single 'protocell' loaded with a drug cocktail to kill a drug-resistant cancer cell," says Sandia researcher and UNM professor Jeff Brinker, the principal investigator. "That's a millionfold increase in efficiency over comparable methods employing liposomes alone — without nanoparticles — as drug carriers."

The nanoparticles and the surrounding cell-like membranes formed from liposomes together become the combination referred to as a protocell: the membrane seals in the deadly cargo and is modified with molecules (peptides) that bind specifically to receptors overexpressed on the cancer cell's surface. (Too many receptors is one signal that the cell is cancerous.) The nanoparticles provide stability to the supported membrane and contain and release the therapeutic cargo within the cell.

A current Food and Drug Administration-approved nanoparticle delivery strategy is to use liposomes themselves to contain and deliver the cargo. In a head-to-head comparison of targeted liposomes and protocells with identical membrane and peptide compositions, Brinker and colleagues report that the greater cargo capacity, stability and targeting efficacy of protocells leads to many times greater cytotoxicity [destruction] directed specifically toward human liver cancer cells.

Another advantage of protocells over liposomes alone, says lead author Carlee Ashley, a Harry S. Truman post-doctoral fellow at Sandia's California site in Livermore, is that liposomes used as carriers require specialized loading strategies that make the process more difficult. "We've demonstrated we can just soak nanoparticles to load them with unique drug combinations needed for personalized medicine. They effectively encapsulate toxins as well as siRNA [small interfering RNA] that silence the expression of proteins."

RNA, the biological messenger that tells cells which proteins to manufacture, in this case is used to silence the cellular factory, a way of causing apoptosis or cell death. "Si" is short for "silence."

The lipids also serve as a shield that restricts toxic chemotherapy drugs from leaking from the nanoparticle until the protocell binds to and takes hold within the cancer cell. This means that few poisons leak into the system of the human host, if the protocells find no cancer cells. This cloaking mitigates toxic side effects expected from conventional chemotherapy.

Read more...

Uncovering The Spread Of Deadly Cancer

For the first time, scientists can see pathways to stop a deadly brain cancer in its tracks. Researchers at Case Western Reserve University School of Medicine have imaged individual cancer cells and the routes they travel as the tumor spreads. The researchers used a novel cryo-imaging technique to obtain the unprecedented look at a mouse model of glioblastoma multiforme, a particularly aggressive cancer that has no treatments to stop it from spreading.

A description of their work, and images, is being published Sept. 1 in the journal Cancer Research.

"We're able to see things we couldn't before, and we can use these images to understand how tumor cells invade and disperse," said Susann M. Brady-Kalnay, a professor of molecular biology and microbiology at the Case Western Reserve School of Medicine, and senior author of the paper.

That information, in turn, can be used to help develop and test the effectiveness of drugs and other therapies used to treat the cancer, she said.

To obtain the view, the scientists used a model that included four different cell lines of brain cancers at various stages of tumor development and dispersion. The cancer cells were modified with fluorescent markers and implanted in the model's brain in collaboration with Biomedical Engineering Professor James Basilion's lab.

The cryo-imaging system, developed by David Wilson, also a professor of biomedical engineering at Case Western Reserve, disassembles the brain layer by layer and reassembles the model into a color three-dimensional digital image.

Using software and algorithms designed by the researchers, they are able to differentiate the main tumor mass, the blood vessels that feed the cancer, and dispersing cells. The imaging system enables them to peer at single cells and see exactly where they are in the brain.
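In spirit, separating a tumor mass from dispersing cells is a labeling problem on the fluorescence volume. A minimal generic sketch using standard connected-component labeling (this illustrates the idea only; it is not the Case Western pipeline):

```python
import numpy as np
from scipy import ndimage

# Generic flavor of the analysis problem: threshold a fluorescence
# volume, then label connected components. The largest component is
# the main tumor mass; small, isolated ones are candidate cells.
volume = np.zeros((50, 50, 50))
volume[20:30, 20:30, 20:30] = 1.0   # synthetic "main tumor" block
volume[5, 5, 5] = 1.0               # synthetic lone "dispersing cell"

labels, n = ndimage.label(volume > 0.5)
sizes = ndimage.sum(volume > 0.5, labels, range(1, n + 1))
print(n, sorted(sizes))             # 2 objects: 1 voxel vs. 1000 voxels
```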

Read more...

Hand-held Unit To Detect Cancer In Poorer Countries

An engineering researcher and a global health expert from Michigan State University are working on bringing a low-cost, hand-held device to nations with limited resources to help physicians detect and diagnose cancer. Syed Hashsham, a professor of civil and environmental engineering at MSU, is developing the Gene-Z device, which is operated using an iPod Touch or Android-based tablet and performs genetic analysis on microRNAs and other genetic markers. MicroRNAs are single-stranded molecules that regulate genes; changes in certain microRNAs have been linked to cancer and other health-related issues.

He is working with Reza Nassiri, director of MSU's Institute of International Health and an assistant dean in the College of Osteopathic Medicine, on the medical capabilities for the device and establishing connections with physicians worldwide.

Cancer is emerging as a leading cause of death in underdeveloped and developing countries where resources for cancer screening are almost non-existent, Nassiri said.

"Until now, little effort has been concentrated on moving cancer detection to global health settings in resource-poor countries," he said. "Early cancer detection in these countries may lead to affordable management of cancers with the aid of new screening and diagnostic technologies that can overcome global health care disparities."

Hashsham demonstrated the potential of the Gene-Z at the National Institutes of Health's first Cancer Detection and Diagnostics Conference. The conference, held recently in Bethesda, Md., was sponsored by the Fogarty International Center and the National Cancer Institute.

"Gene-Z has the capability to screen for established markers of cancer at extremely low costs in the field," Hashsham said. "Because it is a hand-held device operated by a battery and chargeable by solar energy, it is extremely useful in limited-resource settings."

The NIH conference was attended by several U.S. research institutions, including MSU. One of the primary objectives of the meeting was to address the utility of new cancer detection technologies.

Read more...

Friday, August 26, 2011

Unexpected Adhesion Properties Of Graphene May Lead To New Nanotechnology Devices

Graphene, considered the most exciting new material under study in the world of nanotechnology, just got even more interesting, according to a new study by a group of researchers at the University of Colorado Boulder. The new findings -- that graphene has surprisingly powerful adhesion qualities -- are expected to help guide the development of graphene manufacturing and of graphene-based mechanical devices such as resonators and gas separation membranes, according to the CU-Boulder team. The experiments showed that the extreme flexibility of graphene allows it to conform to the topography of even the smoothest substrates.

Graphene consists of a single layer of carbon atoms chemically bonded in a hexagonal chicken-wire lattice. Because of its remarkable electrical, mechanical and thermal properties, its unique atomic structure could someday replace silicon as the basis of electronic devices and integrated circuits, said Assistant Professor Scott Bunch of the CU-Boulder mechanical engineering department, the lead study author.

A paper on the subject was published online in the Aug. 14 issue of Nature Nanotechnology. Co-authors on the study included CU-Boulder graduate students Steven Koenig and Narasimha Boddeti and Professor Martin Dunn of the mechanical engineering department.

"The real excitement for me is the possibility of creating new applications that exploit the remarkable flexibility and adhesive characteristics of graphene and devising unique experiments that can teach us more about the nanoscale properties of this amazing material," Bunch said.

Not only does graphene have the highest electrical and thermal conductivity among all known materials, but this "wonder material" has also been shown to be the thinnest, stiffest and strongest material in the world, as well as being impermeable to all standard gases. Its newly discovered adhesion properties can now be added to the list of the material's seemingly contradictory qualities, said Bunch.

The CU-Boulder team measured the adhesion energy of graphene sheets, ranging from one to five atomic layers, with a glass substrate, using a pressurized "blister test" to quantify the adhesion between graphene and glass plates.

Adhesion energy describes how "sticky" two things are when placed together. Scotch tape is one example of a material with high adhesion; the gecko lizard, which seemingly defies gravity by scaling vertical walls using adhesion between its feet and the wall, is another. Adhesion can also play a detrimental role, as in suspended micromechanical structures, where adhesion can cause device failure or prolong the development of a technology, said Bunch.
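To see how a pressurized blister yields an adhesion energy, one can use a textbook constant-pressure, membrane-limit energy balance: the sheet peels until the energy release rate falls to the adhesion energy. A sketch under those assumptions, which is not necessarily the CU-Boulder team's exact analysis:

```python
import math

# Textbook constant-pressure, membrane-limit energy balance for a
# circular blister (Hencky solution): G = 5*p*V / (4*pi*a^2), where p
# is the pressure, V the blister volume and a its radius; delamination
# stops when G drops to the adhesion energy. Numbers are illustrative.
def energy_release_rate(p, volume, radius):
    return 5.0 * p * volume / (4.0 * math.pi * radius**2)

p = 1.0e6                           # blister pressure, Pa
a = 2.0e-6                          # blister radius, m
V = 0.52 * math.pi * a**2 * 0.6e-6  # Hencky-shape volume, 0.6 um peak

print(f"adhesion energy ~ {energy_release_rate(p, V, a):.2f} J/m^2")
```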

Read more...

New Theory May Shed Light On Dynamics Of Large-polymer Liquids

A new physics-based theory could give researchers a deeper understanding of the unusual, slow dynamics of liquids composed of large polymers. This advance provides a better picture of how polymer molecules respond under fast-flow, high-stress processing conditions for plastics and other polymeric materials. Kenneth S. Schweizer, the G. Ronald and Margaret H. Professor of materials science and engineering at the University of Illinois, and graduate student Daniel Sussman published their findings in the journal Physical Review Letters.

"This is the first microscopic theory of entangled polymer liquids at a fundamental force level which constructs the dynamic confinement potential that controls slow macromolecular motion," said Schweizer, who also is a professor of chemistry and of chemical and biomolecular engineering and is affiliated with the Frederick Seitz Materials Research Laboratory at the U. of I. "Our breakthrough lays the foundation for an enormous amount of future work relevant to both the synthetic polymers of plastics engineering and the biopolymers relevant to cell biology and mechanics."

Polymers are long, large molecules that are ubiquitous in biology, chemistry and materials, from the stiff filaments that give cells their structure to plastics. Linear polymers fall into two classes: rigid rods like uncooked spaghetti or flexible strands like al dente noodles.

When in a dense solution, linear polymers become entangled like spaghetti in a pot, intertwining and crowding each other. Each polymer is hemmed in by its neighbors, so that the liquid behaves like an elastic, viscous rubber. Given enough time, the liquid will eventually flow slowly as polymers crawl along like snakes, a movement called reptation. Researchers have long assumed that each polymer's reptation is confined to a tube-shaped region of space, like a snake slithering through a pipe, but have had difficulty understanding how and why the polymers behave that way.
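The tube picture makes sharp quantitative predictions; in the classic Doi-Edwards version, the time to escape the tube grows as the cube of chain length while diffusion falls as its square. Those textbook scalings, stated here purely as background to the new theory:

```python
# Classic tube-model (Doi-Edwards) scaling laws: reptation time
# tau ~ N^3 and chain diffusion D ~ N^-2 for chain length N.
# Prefactors are set to 1 purely for illustration.
def reptation_time(N):
    return N**3

def diffusion_coefficient(N):
    return N**-2

for N in (100, 200, 400):
    print(N, reptation_time(N), f"{diffusion_coefficient(N):.1e}")
# Doubling the chain length slows tube escape ~8x and cuts diffusion
# ~4x, which is why entangled-polymer liquids flow so sluggishly.
```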

Schweizer and Sussman's new theory, based on microscopic physics, explains the slow dynamics of rigid entangled polymers and quantitatively constructs the confining dynamic tube from the forces between molecules. The tube concept emerges as a consequence of the strong interactions of a polymer with its myriad of intertwining neighbors. The theory's mathematical approach sheds greater light on entanglement and better explains experimental data.

"Our ability to take into account these crucial physical effects allows us to predict, not assume, the confining tube concept, identify its limitations, and predict how applied forces modify motion and elasticity," Schweizer said.

Not only does the new theory predict tube confinement and reptative motion, it reveals important limitations. The researchers found that the "tubes" weaken as applied forces increase, to the point where the tube concept fails completely and the liquid loses its rubbery nature. This is particularly important in plastics processing, which exposes polymer liquids to high stress conditions.

Read more...

Engineers Discover Nanoscale Balancing Act That Mirrors Forces At Work In Living Systems

A delicate balance of atomic forces can be exploited to make nanoparticle superclusters that are uniform in size -- an attribute that's important for many nanotech applications but hard to accomplish, University of Michigan researchers say. The same type of forces are at work bringing the building blocks of viruses together, and the inorganic supercluster structures in this research are in many ways similar to viruses.

U-M chemical engineering professors Nicholas Kotov and Sharon Glotzer led the research. The findings are newly published online in Nature Nanotechnology.

In another instance of forces behaving in unexpected ways at the nanoscale, they discovered that if you start with small nanoscale building blocks that are varied enough in size, the electrostatic repulsion force and van der Waals attraction force will balance each other and limit the growth of the clusters. This equilibrium enables the formation of clusters that are uniform in size.
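A toy version of that balance, with invented functional forms and constants (not the Michigan group's model), shows how growth can self-limit:

```python
# Toy picture of self-limiting growth. As a cluster grows, its
# accumulated charge grows too, so electrostatic repulsion of the next
# incoming nanoparticle eventually outweighs van der Waals attraction.
def net_binding_energy(n):
    """Energy change for adding one particle to an n-particle cluster."""
    vdw_attraction = -1.0 * n ** (1.0 / 3.0)  # short-range, ~ contact area
    coulomb_repulsion = 0.05 * n              # grows with net charge
    return vdw_attraction + coulomb_repulsion

n = 1
while net_binding_energy(n) < 0:   # growth continues while favorable
    n += 1
print(f"growth self-limits near {n} particles per supercluster")  # ~90
```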

"The breakthrough here is that we've discovered a generic mechanism that causes these nanoparticles to assemble into near perfect structures," Glotzer said. "The physics that we see is not special to this system, and could be exploited with other materials. Now that we know how it works, we can design new building blocks that will assemble the same way."

The inorganic superclusters -- technically called "supraparticles" -- that the researchers created out of red, powdery cadmium selenide are not artificial viruses. But they do share many attributes with the simplest forms of life, including size, shape, core-shell structure and the abilities to both assemble and disassemble, Kotov said.

"Having these functionalities in totally inorganic system is quite remarkable," Kotov said. "There is the potential to combine them with the beneficial properties of inorganic materials such as environmental resilience, light adsorption and electrical conductivity."

Zhiyong Tang, a collaborating professor at the National Center of Nanoscience and Technology in China, said, "It is also very impressive that such supraparticles can be further used as the building blocks to fabricate three-dimensional ordered assemblies. This secondary self-assembly behavior provides a feasible way to obtain large-scale nanostructures that are important for practical application."

Read more...

Building A Better Antipsychotic Drug By Treating Schizophrenia's Cause

The classic symptoms of schizophrenia -- paranoia, hallucinations, the inability to function socially -- can be managed with antipsychotic drugs. But exactly how these drugs work has long been a mystery. Now, researchers at Pitt have discovered that antipsychotic drugs work akin to a Rube Goldberg machine -- that is, they suppress something that in turn suppresses the bad effects of schizophrenia, but not the exact cause itself. In a paper published in the Journal of Neuroscience, they say that pinpointing what's actually causing the problem could lead to better avenues of schizophrenia treatment that more directly and efficiently target the disease.

"In the past five years or so, we've really started to understand what may be going wrong with the schizophrenic brain," says Anthony Grace, Distinguished Professor of Neuroscience and professor of psychology in Pitt's School of Arts and Sciences and professor of psychiatry in the Pitt School of Medicine, who is senior author of the paper.

Schizophrenia is made up of three different types of symptoms. Positive symptoms, which are added onto a "normal" personality, include hallucinations and delusions, such as hearing voices, thinking people are after you, or thinking you're being targeted by aliens. Those are the classic symptoms of schizophrenia and the ones antipsychotic medications work on best. Grace says these are the symptoms most likely related to a neurotransmitter called dopamine.

The other two categories of symptoms are negative (what's missing from the normal personality -- the ability to interact socially or hold down a job; some emotional flattening) and cognitive (the ability to think linearly or concentrate on one thing at a time). These two really aren't addressed well by antipsychotic drugs. "Blocking the dopamine system seems to fix classic hallucinations and delusions a whole lot better than it fixes the other problems," says Grace.

Grace has been studying the role dopamine plays in the schizophrenic brain since 1978. It's long been known that after several weeks of treatment with antipsychotic drugs, dopamine-producing neurons are inactivated. "It would suggest to us that in schizophrenia there is not too much dopamine, but rather the dopamine system is too responsive," says Grace.

Inactivating the neurons should therefore treat this overresponsivity. "If there were just too much dopamine in the brain, one would expect the biggest treatment effect would be at the beginning and then it would diminish," Grace says.

Read more...

Learning Information The Hard Way May Be Best 'Boot Camp' For Older Brains

Canadian researchers have found the first evidence that older brains get more benefit than younger brains from learning information the hard way -- via trial-and-error learning. The study was led by scientists at Baycrest's Rotman Research Institute in Toronto and appears online Aug. 24, 2011 in the journal Psychology and Aging, ahead of the print edition.

The finding will surprise professional educators and cognitive rehabilitation clinicians as it challenges a large body of published science which has shown that making mistakes while learning information hurts memory performance for older adults, and that passive "errorless" learning (where the correct answer is provided) is better suited to older brains.

"The scientific literature has traditionally embraced errorless learning for older adults. However, our study has shown that if older adults are learning material that is very conceptual, where they can make a meaningful relationship between their errors and the correct information that they are supposed to remember, in those cases the errors can actually be quite beneficial for the learning process," said Andreé-Ann Cyr, the study's lead investigator.

Cyr conducted the research at Baycrest as a doctoral student in Psychology (University of Toronto), in collaboration with senior author and scientist Dr. Nicole Anderson of Baycrest's Rotman Research Institute. Dr. Anderson specializes in cognitive rehabilitation research with older adults.

In two separate studies, researchers compared the memory benefits of trial-and-error learning (TEL) with errorless learning (EL) in memory exercises with groups of healthy young and older adults. The young adults were in their 20s; the older adults' average age was 70. TEL is considered a more effortful cognitive encoding process where the brain has to "scaffold" its way to making richer associations and linkages in order to reach the correct target information. Errorless learning (EL) is considered passive, or less taxing on the brain, because it provides the correct answer to be remembered during the learning process.

The researchers presented participants with a meaningful "cue" (e.g. type of tooth). The correct target word (e.g. molar) was shown to learners in the EL condition. In the TEL condition, the cue was presented alone, and participants made two guesses (such as canine, incisor) before the correct target "molar" was shown. After a short while, participants performed a memory test that required them to remember the context in which the words were learned (i.e. were they learned through trial-and-error or not).
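As a concrete illustration of the two conditions, a hypothetical sketch follows (the trial data are invented, and the scoring simplifies the source-memory test described above):

    # Hypothetical sketch of the EL and TEL conditions (invented trial data).
    trial_el  = {"cue": "type of tooth", "target": "molar", "guesses": []}
    trial_tel = {"cue": "type of tooth", "target": "molar",
                 "guesses": ["canine", "incisor"]}  # errors made before the answer

    def source_memory_correct(trial, reported_condition):
        # Did the participant remember HOW the word was learned?
        actual = "TEL" if trial["guesses"] else "EL"
        return reported_condition == actual

    print(source_memory_correct(trial_tel, "TEL"))  # -> True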

In both studies, participants remembered the learning context of the target words better if they had been learned through trial-and-error, relative to the errorless condition. This was especially true for the older adults, whose performance benefited approximately 2.5 times more than that of their younger peers.

Read more...

George Mason Research Team Uncovers New Factor In HIV Infection

A George Mason University research team has revealed the specific process by which HIV infects healthy T cells -- a process previously unknown. The principal investigator, HIV researcher Yuntao Wu, says he hopes this breakthrough will start a new line of inquiry into how researchers can use this knowledge to create drugs that could limit or halt HIV infection. Wu, a professor of molecular and microbiology at Mason, published these findings in an April 2011 edition of the Journal of Biological Chemistry, along with researchers Paul J. Vorster, Jia Guo, Alyson Yoder, Weifeng Wang, Yanfang Zheng, Dongyang Yu and Mark Spear from Mason's National Center for Biodefense and Infectious Diseases and the Department of Molecular and Microbiology, and Xuehua Xu from Georgetown University School of Medicine's Department of Oncology.

This paper outlined a new understanding of how T cells -- the target cells that HIV infects -- move and migrate when hijacked by the virus.

"The discovery adds to our understanding of how HIV initiates the infection of human T cells, which leads to their eventual destruction and the development of AIDS," Wu says.

Researchers and doctors have known for some time that HIV, rather than directly killing healthy T cells, actually hijacks them. This eventually leads to their destruction. So the virus essentially turns the infected T cells (also known as CD4 T cells or helper T cells) into a factory for creating even more HIV. Learning more about how the cells are infected could be a key step toward figuring out how to stop infection altogether.

Wu's latest discovery builds upon his previous work, published in the journal Cell in 2008, which described the basic process of how HIV infects T cells. After discovering that cofilin -- a protein that severs filaments of the cell's structural scaffolding, the cytoskeleton -- is involved in HIV infection, Wu's new research provides the detailed framework for this process.

Read more...

Cars Could Run On Recycled Newspaper, Tulane Scientists Say

Here's one way that old-fashioned newsprint beats the Internet. Tulane University scientists have discovered a novel bacterial strain, dubbed "TU-103," that can use paper to produce butanol, a biofuel that can serve as a substitute for gasoline. They are currently experimenting with old editions of the Times Picayune, New Orleans' venerable daily newspaper, with great success. TU-103 is the first bacterial strain from nature that produces butanol directly from cellulose, an organic compound.

"Cellulose is found in all green plants, and is the most abundant organic material on earth, and converting it into butanol is the dream of many," said Harshad Velankar, a postdoctoral fellow in David Mullin's lab in Tulane's Department of Cell and Molecular Biology. "In the United States alone, at least 323 million tons of cellulosic materials that could be used to produce butanol are thrown out each year."

Mullin's lab first identified TU-103 in animal droppings, cultivated it and developed a method for using it to produce butanol. A patent is pending on the process.

"Most important about this discovery is TU-103's ability to produce butanol directly from cellulose," explained Mullin.

He added that TU-103 is the only known butanol-producing clostridial strain that can grow and produce butanol in the presence of oxygen, which kills other butanol-producing bacteria. Having to produce butanol in an oxygen-free space increases the costs of production.

As a biofuel, butanol is superior to ethanol (commonly produced from corn sugar) because it can readily fuel existing motor vehicles without any modifications to the engine, can be transported through existing fuel pipelines, is less corrosive, and contains more energy than ethanol, which would improve mileage.
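To put rough numbers on the energy comparison (approximate volumetric energy densities from the fuels literature, not figures from the Tulane paper; published values vary by source):

    # Rough volumetric energy densities in MJ per litre (ballpark literature
    # values, not figures from the Tulane study; published numbers vary).
    GASOLINE, BUTANOL, ETHANOL = 32.0, 29.0, 21.0

    print(f"butanol : ~{BUTANOL / GASOLINE:.0%} of gasoline's energy per litre")
    print(f"ethanol : ~{ETHANOL / GASOLINE:.0%} of gasoline's energy per litre")
    # -> roughly 91% vs. 66%, one reason butanol promises better mileage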

Read more...

NASA Study Refutes Claims Of Drought-Driven Declines In Plant Productivity, Global Food Security

A new, comprehensive study by an international team of scientists, including researchers at Boston University in the US and the Universities of Viçosa and Campinas in Brazil, published in the current issue of Science (August 26, 2011), refutes earlier alarmist claims that drought induced a decline in global plant productivity during the past decade and posed a threat to global food security. Those earlier findings, published by Zhao and Running in the August 2010 issue of Science (Vol. 329, p. 940), also warned of potentially serious consequences for biofuel production and the global carbon cycle. The two new technical comments in Science contest these claims on the basis of new evidence from NASA satellite data, which indicates that Zhao and Running's findings resulted from several modeling errors, use of corrupted satellite data and statistically insignificant trends.

The main premise of Zhao and Running's model-based study was an expectation of increased global plant productivity during the 2000s based on previously observed increases during the 1980s and 1990s under supposedly similar, favorable climatic conditions. Instead, Zhao and Running were surprised to see a decline, which they attributed to large-scale droughts in the Southern Hemisphere.

"Their model has been tuned to predict lower productivity even for very small increases in temperature. Not surprisingly, their results were preordained," said Arindam Samanta, the study's lead author. (Samanta, now at Atmospheric and Environmental Research Inc., Lexington, MA, worked on the study as a graduate student at Boston University's Department of Geography and Environment.)

Zhao and Running's predictions of trends and year-to-year variability were largely based on simulated changes in the productivity of tropical forests, especially the Amazonian rainforests. However, according to the new study, their model failed miserably when tested against comparable ground measurements collected in these forests.

"The large (28%) disagreement between the model's predictions and ground truth imbues very little confidence in Zhao and Running's results," said Marcos Costa, coauthor, Professor of Agricultural Engineering at the Federal University of Viçosa and Coordinator of Global Change Research at the Ministry of Science and Technology, Brazil.

This new study also found that the model actually predicted increased productivity during droughts, compared to field measurements, and decreased productivity in non-drought years 2006 and 2007 in the Amazon, in contradiction to the main finding of the previous report. "Such erratic behavior is typical of their poorly formulated model, which lacks explicit soil moisture dynamics," said Edson Nunes, coauthor and researcher at the Federal University of Viçosa, Brazil.

Read more...

Researchers Find Wide Gap In Immune Responses Of People Exposed To The Flu

Why do some folks who take every precaution still get the flu, while others never even get the sniffles? It comes down to a person's immune system response to the flu virus, says Alfred Hero, professor at the University of Michigan College of Engineering. In one of the first known studies of its kind, Hero and colleagues from Duke University Medical Center and the Duke Institute for Genome Sciences & Policy used genomics to begin to unravel what in our complex genomic data accounts for why some get sick while others don't.

The study's findings appear in PLoS Genetics Aug. 25.

Hero's analysis group used several methods, including a pattern recognition algorithm previously developed for satellite imaging of the environment, to discover the genomic signatures associated with immune response and flu symptoms. Using these genomic signatures, researchers compared the responses of previously healthy participants inoculated with the flu, and found significant and complex immune responses in both people who got sick and those who did not.

The gene expression data gets to the heart of how the immune system reacts and orchestrates its response to the flu virus, which dictates whether people get sick.

"We looked at over 22,000 genes in 267 blood samples," said Hero, who is also affiliated with the U-M College of Literature, Science & Arts and the U-M Medical School. "No study of this magnitude has ever been done on human immune response."

Geoff Ginsburg, study co-author and director of the Center for Genomic Medicine at the Duke Institute for Genome Sciences & Policy, said the study reveals what happens after virus exposure.

"It also points out, importantly, that remaining asymptomatic in the face of an exposure to a virus is an active process in the immune system, and we can now begin to probe the underlying biology to resisting infection," Ginsburg said.

The team inoculated 17 healthy individuals with the flu virus and about half of them got sick. They then collected gene expression data from each individual at 16 time points over 132 hours. These data provided a clear picture of the gene expression over time in those who developed flu symptoms and those who did not.
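The team's actual classifier was adapted from satellite-image pattern recognition; purely as an illustration of the data shapes involved, a generic analysis might look like the following sketch (all data here are random placeholders):

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hypothetical sketch of the analysis shape (random placeholder data):
    # 17 inoculated subjects x 16 time points x (here only) 100 genes,
    # flattened into one feature vector per subject.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(17, 16 * 100))   # expression time series per subject
    y = np.array([1] * 9 + [0] * 8)       # 1 = developed symptoms, 0 = did not

    clf = LogisticRegression(max_iter=1000).fit(X, y)
    # Genes whose coefficients are consistently large across time points would
    # be candidate members of a discriminating "genomic signature".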

Eventually, if scientists can understand what happens at the level of the genome that makes people more or less susceptible to viral illness, they could potentially develop therapies to prevent the illness. Hero said the inflammatory genomic signature that differentiated the well group from the sick group was measurable up to about 36 hours before peak flu symptoms developed. It may, therefore, be possible to detect illness early, allowing people to take precautions and perhaps even prevent the worst symptoms.

Read more...

Street Crane To Share Technical Expertise With Dubai Crane

Dubai Crane is one of many subsidiaries of Dubai Investments PJSC, a public company with a total asset value of AED 14.745 billion (USD 4 billion) and diverse interests. The 47 companies in the Dubai Investments portfolio span activities such as building, development, glass, technology, plastics, pharmaceuticals and food. The UK's largest factory crane manufacturer, Street Crane, operates globally via a network of over 60 high quality local trading partners.

Dubai Cranes LLC and Street Crane Company of the United Kingdom have concluded an agreement under which advanced design factory cranes will be available to manufacturers in United Arab Emirates on short lead times. Dubai Crane are to carry a substantial stock of Street’s crane kits that will be used with locally fabricated structural components to meet lifting needs of up to 300 tonnes.

Street Crane will share technical expertise with Dubai Crane so that major crane structures such as gantries and crane beams can be fabricated to UK and European standards in Dubai. These structures will then be mated to electro-mechanical systems such as end carriages, carriages, hoists, controls and electrical gear supplied by Street Crane from the UK to create advanced world-class cranes.

General manager at Dubai Cranes, Andrew Kay explained, “Industrial development in the UAE is creating huge demand for high specification industrial cranes. This is especially true in the business cluster based on the Khalifa Port and expanding aluminium industries. The Dubai Crane service package covers everything from the initial survey and evaluation of handling requirements, through the design and installation of the crane, to the complete support and service of the equipment throughout its operational life.”

Stock holding by Dubai Crane will facilitate prompt delivery for main market cranes of single or double girder construction up to 25 tonnes safe working load and up to 30 metres span. In addition, technical collaboration allows the companies to undertake major complex infrastructure projects.

A recent collaboration was a new facility for Emirates Aluminium (EMAL), where more than 15 special overhead cranes were installed as part of the multimillion US dollar investment. Cranes with capacities from three to 50 tonnes were installed, with several cranes having auxiliary hoists to assist in the tipping of ladles. All the cranes were designed and engineered for intensive 24/7 operation in a harsh environment.

Speaking for Street Crane, managing director Andrew Pimblett commented, “We are delighted to be working with such a high profile and well resourced local partner. Local fabrication of major crane elements not only provides a more responsive service for end users, but also creates value in the local economy, so this is a win-win for all the project partners.”

Read more...

Thursday, August 25, 2011

Engineering Atomic Interfaces For New Electronics

Most people cross borders such as doorways or state lines without thinking much about it. Yet not all borders are places of limbo intended only for crossing. Some borders, like those between two materials that are brought together, are dynamic places where special things can happen. For an electron moving from one material toward the other, this space is where it can join other electrons, which together can create current, magnetism or even light.

A multi-institutional team has made fundamental discoveries at the border regions, called interfaces, between oxide materials. Led by University of Wisconsin-Madison materials science and engineering professor Chang-Beom Eom, the team has discovered how to manipulate electrons at oxide interfaces by inserting a single layer of atoms. The researchers also have discovered unusual electron behaviors at these engineered interfaces.

Their work, which is sponsored by the National Science Foundation, will be published Feb. 18 in the journal Science and could allow researchers to further study and develop interfaces with a wide array of properties.

Eom's team blends theorists and experimentalists, including UW-Madison physics professor Mark Rzchowski and collaborators at the University of Nebraska-Lincoln, University of Michigan, Argonne National Laboratory and Brookhaven National Laboratory.

The researchers used two pieces of precisely grown strontium titanate, which is a type of oxide, or compound with oxygen as a fundamental element. Between the pieces, the researchers inserted a one-atom-thick layer of one of five rare-earth elements, which are important components in the electronics industry.

The team found that the rare-earth element layer creates an electron gas that has some interesting characteristics. The gas actually behaves more like an electron "liquid," since the electrons move more in tandem, or in correlation, than a gas normally does.

Read more...

Taking Brain-Computer Interfaces To The Next Phase

You may have heard of virtual keyboards controlled by thought, brain-powered wheelchairs, and neuro-prosthetic limbs. But powering these machines can be downright tiring, a fact that prevents the technology from being of much use to people with disabilities, among others. Professor José del R. Millán and his team at the Ecole Polytechnique Fédérale de Lausanne (EPFL) in Switzerland have a solution: engineer the system so that it learns about its user, allows for periods of rest, and even permits multitasking.

In a typical brain-computer interface (BCI) set-up, users can send one of three commands – left, right, or no-command. No-command is the static state between left and right and is necessary for a brain-powered wheelchair to continue going straight, for example, or to stay put in front of a specific target. But it turns out that no-command is very taxing to maintain and requires extreme concentration. After about an hour, most users are spent. Not much help if you need to maneuver that wheelchair through an airport.

In an ongoing study demonstrated by Millán and doctoral student Michele Tavella at the AAAS 2011 Annual Meeting in Washington, D.C., the scientists hook volunteers up to a BCI and ask them to read, speak, or read aloud while delivering as many left and right commands as possible or delivering a no-command. By using statistical analysis programmed by the scientists, Millán's BCI can distinguish between left and right commands and learn when each subject is sending one of these versus a no-command. In other words, the machine learns to read the subject's mental intention. The result is that users can mentally relax and also execute secondary tasks while controlling the BCI.
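In spirit, the decoding step can be sketched as a probabilistic classifier with a fall-back state (a toy illustration, not EPFL's implementation): the machine reports "left" or "right" only when it is confident, and otherwise defaults to no-command, sparing the user the effort of actively holding that state.

    import numpy as np

    # Toy probabilistic decoder (an illustrative sketch, not EPFL's code).
    # Reports "left" or "right" only when one class is probable enough;
    # otherwise falls back to the "no-command" state.
    def decode(x, class_means, cov_inv, threshold=0.6):
        scores = {}
        for label, mu in class_means.items():
            d = x - mu
            scores[label] = np.exp(-0.5 * d @ cov_inv @ d)  # Gaussian likelihood
        total = sum(scores.values())
        probs = {label: s / total for label, s in scores.items()}
        best = max(probs, key=probs.get)
        return best if probs[best] >= threshold else "no-command"

    # Hypothetical 2-D EEG feature means for the two intentional commands:
    means = {"left": np.array([1.0, 0.0]), "right": np.array([0.0, 1.0])}
    print(decode(np.array([0.9, 0.1]), means, np.eye(2)))  # -> "left"
    print(decode(np.array([0.5, 0.5]), means, np.eye(2)))  # -> "no-command"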

The so-called Shared Control approach to facilitating human-robot interactions employs image sensors and image processing to avoid obstacles. According to Millán, however, Shared Control isn't enough to let an operator rest or concentrate on more than one command at once, limiting long-term use.

Millán's new work complements research on Shared Control and makes multitasking a reality while at the same time allowing users to catch a break. His trick is in decoding the signals coming from EEG readings on the scalp -- readings that represent the activity of millions of neurons and have notoriously low resolution. By incorporating statistical analysis, or probability theory, his BCI allows for both targeted control -- maneuvering around an obstacle -- and more precise tasks, such as staying on a target. It also makes it easier to give simple commands like "go straight" that need to be executed over longer periods of time (think back to that airport) without having to focus on giving the same command over and over again.

Read more...

'Model Minority' Not Perceived As Model Leader

Asian Americans are widely viewed as "model minorities" on the basis of education, income and competence. But they are perceived as less ideal than Caucasian Americans when it comes to attaining leadership roles in U.S. businesses and board rooms, according to researchers at the University of California, Riverside. In a groundbreaking study, researchers found that "race trumps other salient characteristics, such as one's occupation, regarding perceptions of who is a good leader," said Thomas Sy, assistant professor of psychology at UC Riverside and the lead author of the study.

The peer-reviewed paper, "Leadership Perceptions as a Function of Race-Occupation Fit: The Case of Asian Americans," appears in the Journal of Applied Psychology.

Co-authors are Lynn M. Shore of San Diego State University, Judy Strauss of CSU Long Beach, Ted H. Shore of CSU San Marcos, UCR graduate students Susanna Tram and Paul Whiteley, and Kristine Ikeda-Muromachi of CSU Long Beach.

"Understanding the effects of race on leadership perceptions is important, in part, because the U.S. workforce is increasingly racially diverse, and organizations are realizing that the inclusion of racial minorities constitutes a competitive advantage in a global market," according to the researchers. "However, racial minorities are often perceived to be less suitable for management positions in the United States, as evidenced by a persistent glass ceiling for these groups, lower managerial promotion ratings, lower job suitability ratings, and individuals' attributions of success and failure."

Read more...

Report Cites 'Liquefaction' As Key To Much Of Japanese Earthquake Damage

The massive subduction zone earthquake in Japan caused a significant level of soil "liquefaction" that has surprised researchers with its widespread severity, a new analysis shows. The findings also raise questions about whether existing building codes and engineering technologies are adequately accounting for this phenomenon in other vulnerable locations, which in the U.S. include Portland, Ore., parts of the Willamette Valley and other areas of Oregon, Washington and California.

A preliminary report about some of the damage in Japan has just been completed by the Geotechnical Extreme Events Reconnaissance (GEER) advance team, in work supported by the National Science Foundation.

The broad geographic extent of the liquefaction over hundreds of miles was daunting to experienced engineers who are accustomed to seeing disaster sites, including the recent earthquakes in Chile and New Zealand.

"We've seen localized examples of soil liquefaction as extreme as this before, but the distance and extent of damage in Japan were unusually severe," said Scott Ashford, a professor of geotechnical engineering at Oregon State University and a member of this research team.

"Entire structures were tilted and sinking into the sediments, even while they remained intact," Ashford said. "The shifts in soil destroyed water, sewer and gas pipelines, crippling the utilities and infrastructure these communities need to function. We saw some places that sank as much as four feet."

Some degree of soil liquefaction is common in almost any major earthquake. It's a phenomenon in which saturated soils, particularly recent sediments, sand, gravel or fill, can lose much of their strength and flow during an earthquake. This can allow structures to shift or sink and significantly magnify the structural damage produced by the shaking itself.
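In standard geotechnical terms (textbook relations, not taken from the GEER report), shaking drives up the pore water pressure u, which subtracts from the total stress \sigma to leave the effective stress carried by the grain skeleton:

    \sigma' = \sigma - u, \qquad r_u = \frac{\Delta u}{\sigma'_{v0}}

Liquefaction corresponds to the excess pore pressure ratio r_u approaching 1: the effective stress, and with it the soil's shear strength, falls toward zero and the deposit behaves like a heavy fluid.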

But most earthquakes are much shorter than the recent event in Japan, Ashford said. The length of the Japanese earthquake, as much as five minutes, may force researchers to reconsider the extent of liquefaction damage possible in situations such as this.

"With such a long-lasting earthquake, we saw how structures that might have been okay after 30 seconds just continued to sink and tilt as the shaking continued for several more minutes," he said. "And it was clear that younger sediments, and especially areas built on recently filled ground, are much more vulnerable."

The data provided by analyzing the Japanese earthquake, researchers said, should make it possible to improve the understanding of this soil phenomenon and better prepare for it in the future. Ashford said it was critical for the team to collect the information quickly, before damage was removed in the recovery efforts.

Read more...

Researchers Pinpoint Graphene's Varying Conductivity Levels

Did you know that pencil lead may just end up changing the world? Graphene is the material from which graphite, the core of your No. 2 pencil, is made. It is also the latest "wonder material," and may be the electronics industry's next great hope for the creation of extremely fast electronic devices. Researchers at North Carolina State University have found one of the first roadblocks to utilizing graphene by proving that its conductivity decreases significantly when more than one layer is present.

Graphene's structure is what makes it promising for electronics. Because of the way its carbon atoms are arranged, its electrons are very mobile. Mobile electrons mean that a material should have high conductivity. But NC State physicist Dr. Marco Buongiorno-Nardelli and NC State electrical and computer engineer Dr. Ki Wook Kim wanted to find a way to study the behavior of "real" graphene and see if this was actually the case.

"You can talk about the electronic structure of graphene, but you must consider that those electrons don't exist alone in the material," Buongiorno-Nardelli says. "There are impurities, and most importantly, there are vibrations present from the atoms in the material. The electrons encounter and interact with these vibrations, and that can affect the material's conductivity."

Buongiorno-Nardelli, Kim and graduate students Kostya Borysenko and Jeff Mullen developed a computer model that would predict the actual conductivity of graphene, both as a single layer and in a bilayer form, with two layers of graphene sitting on top of one another. It was important to study the bilayer model because actual electronic devices cannot work with only a single layer of the material present.

"You cannot make a semiconductor with just one graphite layer," Buongiorno-Nardelli explains. "To make a device, the conductive material must have a means by which it can be turned off and on. And bilayer provides such ability."

With the help of the high performance computers at Oak Ridge National Laboratory, the NC State team discovered both good and bad news about graphene. Their results appear as an Editor's Suggestion in the April 15 edition of Physical Review B.

With a single layer of graphene, the mobility – and therefore conductivity – shown by the researchers' simulations turned out to be much higher than they had originally thought. This good news was balanced, however, by the results from the bilayer state.

"We expected that the electrons' conductivity in bilayer graphene could be somewhat worse, due to the ways in which the vibrations from the atoms in each individual layer interact with one another," Mullen says. "Surprisingly, we found that the mobility of electrons in bilayer graphene is roughly an order of magnitude lower than in a single graphene sheet."

Read more...

University Of Toronto Researchers 'Brighten' The Future Of OLED Technology

Chlorine is an abundant and readily available halogen gas commonly associated with the sanitation of swimming pools and drinking water. Could a one-atom thick sheet of this element revolutionize the next generation of flat-panel displays and lighting technology? In the case of Organic Light-Emitting Diode (OLED) devices, it most certainly can. Primary researchers Michael G. Helander (PhD candidate and Vanier Canada Graduate Scholar) and Zhibin Wang (PhD candidate), led by Professor Zheng-Hong Lu of the Department of Materials Science & Engineering at the University of Toronto, have found a simple method of using chlorine to drastically reduce traditional OLED device complexity and dramatically improve its efficiency at the same time. By engineering a one-atom thick sheet of chlorine onto the surface of an existing industry-standard electrode material (indium tin oxide, ITO) found in today's flat-panel displays, these researchers have created a medium that allows for efficient electrical transport while eliminating the need for several costly layers found in traditional OLED devices.

"It turns out that it's remarkably easy to engineer this one-atom thick layer of chlorine onto the surface of ITO," says Helander. "We developed a UV light assisted process to achieve chlorination, which negates the need for chlorine gas, making the entire procedure safe and reliable."

The team tested their green-emitting "Cl-OLED" against a conventional OLED and found that the efficiency was more than doubled at very high brightness. "OLEDs are known for their high-efficiency," says Helander. "However, the challenge in conventional OLEDs is that as you increase the brightness, the efficiency drops off rapidly."

Using their chlorinated ITO, this team of advanced materials researchers found that they were able to prevent this drop-off and achieve a record efficiency of 50% at 10,000 cd/m2 (a standard fluorescent light has a brightness of approximately 8,000 cd/m2), which is at least two times more efficient than the conventional OLED.

"Our Cl-ITO eliminates the need for several stacked layers found in traditional OLEDs, reducing the number of manufacturing steps and equipment, which ultimately cuts down on the costs associated with setting up a production line," says Professor Zheng-Hong Lu.

Read more...

Researchers Create Elastic Material That Changes Color In UV Light

Researchers from North Carolina State University have created a range of soft, elastic gels that change color when exposed to ultraviolet (UV) light – and change back when the UV light is removed or the material is heated up. The gels are impregnated with a type of photochromic compound called spiropyran. Spiropyrans change color when exposed to UV light, and the color they change into depends on the chemical environment surrounding the material.

The researchers made the gels out of an elastic silicone substance, which can be chemically modified to contain various other chemical compounds – changing the chemical environment inside the material. Changing this interior chemistry allows researchers to fine-tune how the color of the material changes when exposed to UV light.

"For example, if you want the material to turn yellow when exposed to UV light, you would attach carboxylic acid," explains Dr. Jan Genzer, Celanese Professor of Chemical and Biomolecular Engineering at NC State and co-author of a paper describing the research. "If you want magenta, you'd attach hydroxyl. Mix them together, and you get a shade of orange."

Photochromic compounds are not new, but this is the first time they've been incorporated into an elastic material without impairing the material's elasticity.

The researchers were also able to create patterns by using a shaped mold to change the chemical make-up of specific regions in the material. For example, applying hydroxyl around a star-shaped mold (like a tiny cookie cutter) on the material would result in a yellow star-shaped pattern appearing on a dark magenta elastic when it is exposed to UV light.

Read more...

Climate Change From Black Carbon Depends On Altitude

Scientists have known for decades that black carbon aerosols add to global warming. These airborne particles made of sooty carbon are believed to be among the largest man-made contributors to global warming because they absorb solar radiation and heat the atmosphere. New research from Carnegie's Long Cao and Ken Caldeira, along with colleagues George Ban-Weiss and Govindasamy Bala, quantifies how black carbon's impact on climate depends on its altitude in the atmosphere. Their work, published online by the journal Climate Dynamics, could have important implications for combating global climate change.

Black carbon is emitted from diesel engines and burning wood, among other sources. In the atmosphere, it acts as an absorbing aerosol -- a particle that absorbs the sun's heating rays. (Other types of aerosols reflect the sunlight back out into space, providing a cooling effect.) The climate effect of black carbon is difficult to quantify because these particles heat the air around them, affecting clouds even before they begin to heat the land and ocean surface.

The team's research involved idealized simulations of adding a theoretical megatonne of black carbon uniformly around the globe at different altitudes in the atmosphere. They found that the addition of black carbon near the land and ocean surface caused the surface to heat. As the altitude of black carbon increased, surface warming decreased. The addition of black carbon to the stratosphere caused the land and oceans to cool. This cooling occurred despite the fact that the black carbon caused the Earth as a whole to absorb more energy from the sun. When black carbon is high in the atmosphere, it can lose its energy to space while helping to shade the land and ocean surface.

"Black carbon lower in the atmosphere is more effective at warming the surface, even though black carbon particles at higher altitudes absorb more solar radiation," said Ban-Weiss, formerly of Carnegie and currently at Lawrence Berkeley National Laboratory. He continued: "Just analyzing instantaneous changes in absorption of radiation from black carbon cannot accurately predict changes in surface temperatures. If we want a consistent framework for predicting changes in surface air temperature from black carbon we need to account for rapid atmospheric responses in things like clouds."

Black carbon also had varying effects on precipitation. In the lower layers it increased precipitation and in the upper layers it decreased precipitation, a result of changes in atmospheric stability.

"We showed that black carbon near Earth's surface has the greatest effect on global warming. Unfortunately, this is exactly where we are putting most of the black carbon that we add to the atmosphere," Caldeira said. "This black carbon also often causes health problems, so cleaning up these emissions would help both the environment and human health."

Read more...

UMD Scientists Make Magnetic New Graphene Discovery

University of Maryland researchers have discovered a way to control magnetic properties of graphene that could lead to powerful new applications in magnetic storage and magnetic random access memory. The finding by a team of Maryland researchers, led by Physics Professor Michael S. Fuhrer of the UMD Center for Nanophysics and Advanced Materials, is the latest of many remarkable properties discovered in graphene.

A honeycomb sheet of carbon atoms just one atom thick, graphene is the basic constituent of graphite. Some 200 times stronger than steel, it conducts electricity at room temperature better than any other known material (a 2008 discovery by Fuhrer et al.). Graphene is widely seen as having great, perhaps even revolutionary, potential for nanotechnology applications. The 2010 Nobel Prize in physics was awarded to scientists Konstantin Novoselov and Andre Geim for their 2004 discovery of how to make graphene.

In their new graphene discovery, Fuhrer and his University of Maryland colleagues have found that missing atoms in graphene, called vacancies, act as tiny magnets -- they have a "magnetic moment." Moreover, these magnetic moments interact strongly with the electrons in graphene which carry electrical currents, giving rise to a significant extra electrical resistance at low temperature, known as the Kondo effect. The results appear in the paper "Tunable Kondo effect in graphene with defects" published this month in Nature Physics.

The Kondo effect is typically associated with adding tiny amounts of magnetic metal atoms, such as iron or nickel, to a non-magnetic metal, such as gold or copper. Finding the Kondo effect in graphene with vacancies was surprising for two reasons, according to Fuhrer.

"First, we were studying a system of nothing but carbon, without adding any traditionally magnetic impurities. Second, graphene has a very small electron density, which would be expected to make the Kondo effect appear only at extremely low temperatures," he said.

The team measured the characteristic temperature for the Kondo effect in graphene with vacancies to be as high as 90 Kelvin, which is comparable to that seen in metals with very high electron densities. Moreover, the Kondo temperature can be tuned by the voltage on an electrical gate, an effect not seen in metals. They theorize that the same unusual properties that make graphene's electrons act as if they have no mass also make them interact very strongly with certain kinds of impurities, such as vacancies, leading to a strong Kondo effect at a relatively high temperature.
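For context, the signature involved is the textbook Kondo form of the resistance (a generic perturbative expression, not the paper's fitted formula): for temperatures above the Kondo temperature T_K, scattering from the magnetic moments adds a contribution that grows logarithmically on cooling,

    \rho(T) \approx \rho_0 + c \, \ln\!\left(\frac{T_K}{T}\right), \qquad T \gtrsim T_K

and saturates below T_K as the moments are screened. What is unusual in graphene is that T_K itself -- here up to about 90 K -- can be shifted with a gate voltage.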

Read more...

PRP Architects Complete Design For Centre Of Refurbishment Excellence In Stoke-On-Trent

The 'Centre of Refurbishment Excellence' (CoRE) for Stoke-on-Trent City Council, the BRE and Stoke-on-Trent College is a new demonstration and learning facility. The Enson Works site in the Longton area of Stoke-on-Trent was selected because it is an architecturally impressive landmark setting containing dilapidated Grade II listed structures; it presents a unique refurbishment challenge, and its restoration is expected to regenerate the surrounding area. PRP Architects completed the overall design for the former Victorian pottery works in Stoke-on-Trent.

PRP took the building design for CoRE to RIBA Stage D, obtaining Listed Building Consent, Conservation Area Consent and full Planning Permission for the whole site, comprising the new-build education building and entrance and the conversion of the retained pottery buildings and the existing America Hotel into conference and exhibition space. PRP is presently delivering the Education Building with contractor Shaylor Construction.

The 6,000 sqm site will provide an integrated facility to bring together the UK's talent, training and technologies to demonstrate best practice in sustainable refurbishment and retrofitting. The CoRE project will provide a new build college, conference area and demonstration space and will be a virtual hub and knowledge platform bringing together all stakeholders in the retrofit community, including a UK skills alliance.

An integral part of PRP's design has been the retention of the three Grade II listed bottle kilns. In addition, the CoRE provides:
A 1,000 sqm BREEAM standard demonstration facility built over two floors and incorporating the kilns
A vast exhibition space to showcase sustainable refurbishment products, with demonstration space for large construction models allowing all stakeholders to fully engage with and understand the products and processes involved
A 1,350 sqm new building which will comprise an education centre to house Stoke College's retrofit diploma course
Conference rooms to host events and seminars

PRP Partner, Frances Chaplin comments: "The CoRE will showcase best practice in retrofitting within the UK and PRP is pleased to be part of the delivery of such a landmark building. The listed kilns form an integral part of the design and full height glazing floods the interior with natural light."

Work has now commenced onsite for the project, with the new build college anticipated to be finished later this year. The project will benefit from £3.5 million of European Union investment from the ERDF Competitiveness Programme 2007-13, managed by regional development agency One North East. The ERDF programme is bringing over £300m into the North East to support innovation, enterprise and business support across the region.

Read more...
