Friday, September 30, 2011

New Application For iPhone May Support Monitoring and Research On Parkinson's Disease

Researchers at the Georgia Tech Research Institute (GTRI) have developed a novel iPhone application that may enable persons with Parkinson's disease and certain other neurological conditions to use the ubiquitous devices to collect data on hand and arm tremors and relay the results to medical personnel. The researchers believe the application could replace subjective tests now used to assess the severity of tremors, while potentially allowing more frequent patient monitoring without costly visits to medical facilities.

The program – known as iTrem – could be offered later this year by the App Store, an Apple Inc. website that sells iPhone applications. But iTrem will first undergo a clinical study at Emory University and must receive any required approvals from the Food and Drug Administration.

"We expect iTrem to be a very useful tool for patients and their caregivers," said Brian Parise, a research scientist who is principal investigator for the project along with Robert Delano, another GTRI research scientist. "And as a downloadable application, it also promises to be convenient and cost-effective."

iTrem utilizes the iPhone's built-in accelerometer to collect data on a patient in his or her home or office. The application currently tracks tremor information directly; in the future it will also use simple puzzle games to record tremor data, which will then be processed and transmitted.
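
How tremor metrics might be extracted from raw accelerometer samples can be sketched in a few lines of code. The example below is a minimal illustration in Python, not GTRI's actual iTrem code; the 100 Hz sampling rate and the synthetic 5 Hz signal are assumptions chosen only to keep the sketch self-contained and runnable.

```python
# Illustrative sketch (not GTRI's iTrem code): estimating tremor frequency and
# amplitude from a stream of accelerometer samples.
import numpy as np

FS = 100.0                          # assumed sampling rate, in Hz
t = np.arange(0, 10, 1.0 / FS)      # ten seconds of samples
# Synthetic 5 Hz tremor plus noise, standing in for the phone's real sensor data.
accel = 0.3 * np.sin(2 * np.pi * 5.0 * t) + 0.05 * np.random.randn(t.size)

accel = accel - accel.mean()        # remove the gravity/offset component
spectrum = np.abs(np.fft.rfft(accel))
freqs = np.fft.rfftfreq(accel.size, d=1.0 / FS)

peak = np.argmax(spectrum[1:]) + 1  # skip the zero-frequency bin
print(f"Dominant tremor frequency: {freqs[peak]:.1f} Hz")
print(f"RMS tremor amplitude: {np.sqrt(np.mean(accel**2)):.3f} g")
```

A dominant frequency in the 4-6 Hz band and a root-mean-square amplitude are the kinds of summary numbers a clinician could track remotely over time.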

The researchers expect the clinical trial to show that data gathered by the program would allow physicians to remotely monitor the degree of disability, progression and medication response among patients with tremor-related conditions. In addition, iTrem offers a social component that allows people to share stories, pictures and data.

iTrem's developers are working with the Advanced Technology Development Center (ATDC) to form a startup company based on iTrem and future applications that might take advantage of iPhone capabilities. ATDC is a startup accelerator based at Georgia Tech that helps Georgia entrepreneurs launch and build successful technology companies.

The GTRI team plans ongoing development of iTrem's interface, based on responses from doctors and patients. They're also investigating other consumer technologies with diagnostic potential, including the tiny gyroscopes now available in some cellular phones.

Read more...

Exeter Study Brings Brain-like Computing A Step Closer To Reality

The development of 'brain-like' computers has taken a major step forward today with the publication of research led by the University of Exeter. Published in the journal Advanced Materials and funded by the Engineering and Physical Sciences Research Council, the study involved the first ever demonstration of simultaneous information processing and storage using phase-change materials. This new technique could revolutionise computing by making computers faster and more energy-efficient, as well as making them more closely resemble biological systems.

Computers currently deal with processing and memory separately, resulting in a speed and power 'bottleneck' caused by the need to continually move data around. This is totally unlike anything in biology, for example in human brains, where no real distinction is made between memory and computation. To perform these two functions simultaneously, the University of Exeter research team used phase-change materials, a kind of semiconductor that exhibits remarkable properties.

Their study demonstrates conclusively that phase-change materials can store and process information simultaneously. It also shows experimentally for the first time that they can perform general-purpose computing operations, such as addition, subtraction, multiplication and division. More strikingly perhaps, it shows that phase-change materials can be used to make artificial neurons and synapses. This means that an artificial system made entirely from phase-change devices could potentially learn and process information in a similar way to our own brains.
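
One way a single memory cell can also compute is an accumulator scheme: each input pulse crystallizes the cell a little further, and the cell switches state once crystallization completes, so counting and addition fall out naturally. The toy Python model below illustrates that idea only in the abstract; it is not the Exeter group's experiment, and the ten-pulse switching threshold is an assumed parameter.

```python
# Toy model (not the Exeter experiment) of accumulator-style addition in a
# phase-change cell: each pulse crystallizes a fixed fraction of the cell, and
# the cell "fires" (and is reset to amorphous) once crystallization completes.
STEP = 0.1  # assumed crystallized fraction added per pulse (cell counts in base 10)

def add_with_cell(a, b):
    """Add two small non-negative integers by feeding a + b pulses into the cell."""
    crystallinity = 0.0
    carries = 0
    for _ in range(a + b):
        crystallinity += STEP
        if crystallinity >= 1.0 - 1e-9:   # threshold reached: the cell switches state
            carries += 1                  # record the carry...
            crystallinity = 0.0           # ...and re-amorphize (reset) the cell
    units = round(crystallinity / STEP)   # leftover pulses give the units digit
    return carries * 10 + units

print(add_with_cell(7, 6))  # prints 13
```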

Lead author Professor David Wright of the University of Exeter said: "Our findings have major implications for the development of entirely new forms of computing, including 'brain-like' computers. We have uncovered a technique for potentially developing new forms of 'brain-like' computer systems that could learn, adapt and change over time. This is something that researchers have been striving for over many years."

This study focused on the performance of a single phase-change cell. The next stage in Exeter's research will be to build systems of interconnected cells that can learn to perform simple tasks, such as identification of certain objects and patterns.

Read more...

Smartphone App Helps You Find Friends In A Crowd

Can a smartphone app enable meaningful, face-to-face conversation? Engineers are trying to find out, with software that helps people locate their friends in a crowd – and make new friends who share similar interests.

The software, called eShadow, makes its debut at the IEEE International Conference on Distributed Computing Systems (ICDCS) on Thursday, June 23 in Minneapolis.

It uses nearby wireless networks and smartphones' wireless communication technologies to alert users that a friend who also uses the software is in the area – and gives directions to that friend's location.

Dong Xuan, associate professor of computer science and engineering at Ohio State University, hopes that his research group's software will also build bridges between strangers who share personal or professional interests.

At a conference such as ICDCS, for example, the software could remind a user of a forgotten acquaintance's name, or help him or her make new professional contacts in the same area of research.

Since it enables face-to-face meetings, eShadow is a complement to online social networks such as Facebook, which excel at connecting people who are far apart, Xuan said.

"Today, online social networking has advanced dramatically, but our ability to meet people face-to-face hasn't gotten any easier," he said. "We want eShadow to close social gaps and connect people in meaningful ways, while keeping the technology non-intrusive and protecting privacy."

The name eShadow comes from the idea that users input their interests into the software, and their smartphone broadcasts those interests to certain other users of the software – but only within 50 yards of the phone. So as users move, the broadcast follows them around like a shadow.

As to users' safety, Xuan feels that, at least for some situations, meeting someone in person is safer than meeting them online.

"Online, people can steal others' identity, or lie easily without detection. It's much harder to pull off a masquerade in person," he said.

Plus, users only share information which they want to share, and can observe potential friends at a distance before deciding whether to introduce themselves. Young people, Xuan pointed out, are especially comfortable with putting personal information online, and could readily adapt to using the software.

That said, people can be selective about who receives their eShadow signals. Users can select individuals from their phone's contact list, and specifically de-select people as well.

Read more...

Teeming With Life, Pacific's California Current Likened To Africa's Serengeti Plain

Like the vast African plains, two huge expanses of the North Pacific Ocean are major corridors of life, attracting an array of marine predators in predictable seasonal patterns, according to final results from the Census of Marine Life Tagging of Pacific Predators (TOPP) project published today in the journal Nature. The paper culminates the TOPP program's decade-long effort to track top marine predator movements in the Pacific Ocean. It presents for the first time the results for all 23 tagged species and reveals how migrations and habitat preferences overlap -- a remarkable picture of critical marine life pathways and habitats.

The study found that major hot spots for large marine predators are the California Current, which flows south along the US west coast, and a trans-oceanic migration highway called the North Pacific Transition Zone, which connects the western and eastern Pacific on the boundary between cold sub-arctic water and warmer subtropical water -- about halfway between Hawaii and Alaska.

"These are the oceanic areas where food is most abundant, and it's driven by high primary productivity at the base of the food chain -- these areas are the savanna grasslands of the sea," say co-authors and project originators Barbara Block of Stanford University’s Hopkins Marine Station and Daniel Costa, professor of ecology and evolutionary biology at the University of California, Santa Cruz.

"Knowing where and when species overlap is valuable information for efforts to manage and protect critical species and ecosystems."

Drs. Costa and Block were joined by Steven Bograd of the NOAA Southwest Fisheries Science Center, Randy Kochevar of Stanford University and others to launch the project in 2000 as part of the Census of Marine Life, a 10-year research initiative that investigated the diversity, distribution, and abundance of marine life in the global ocean. TOPP became the world’s largest-ever biologging study, eventually involving more than 75 biologists, oceanographers, engineers and computer scientists across five countries.

Read more...

'Orca Ears' Inspire Stanford Researchers To Develop Ultrasensitive Undersea Microphone

For most people, listening to the ocean means contemplating the soothing sound of waves breaking gently on a sandy beach. But for researchers studying everything from whale migration to fisheries populations, and from underwater mapping to guiding robots trying to repair leaking undersea oil wells, listening to the ocean from the other side – underwater – can reveal volumes of valuable data.

Stanford researchers have developed a highly sensitive underwater microphone that can capture the whole range of ocean sounds, from the equivalent of a soft whisper in a library to an explosion of a ton of TNT just 60 feet away – a range of approximately 160 decibels – and do so accurately at any depth, no matter how crushing the pressure. It also can hear sound frequencies across a span of 17 octaves, spanning pitches far higher than the whine of a mosquito and far lower than a rumbling foghorn.
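
As a back-of-the-envelope reading of those figures (not a calculation taken from the paper): sound pressure level is twenty times the base-10 logarithm of a pressure ratio, so a 160-decibel range corresponds to a factor of one hundred million between the quietest and loudest detectable pressures, and 17 octaves means the highest frequency is 2 raised to the 17th power, roughly 130,000 times, the lowest:

\[
20\log_{10}\!\left(\frac{p_{\max}}{p_{\min}}\right) = 160\ \text{dB}
\;\Rightarrow\;
\frac{p_{\max}}{p_{\min}} = 10^{160/20} = 10^{8},
\qquad
\frac{f_{\max}}{f_{\min}} = 2^{17} \approx 1.3\times 10^{5}.
\]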

Existing underwater microphones – called hydrophones – have much more limited ranges of sensitivity and do not perform well at depth, where the ambient pressure can be extremely large, making it difficult to detect faint sounds.

Sonar – using sound to locate and map – is critical to underwater communication and exploration, because radio signals can travel only a centimeter or two before they dissipate in seawater and light can't penetrate the depths below about 100 meters.

In approaching the challenge of designing the new hydrophone, the researchers first examined some existing listening devices that work well underwater – the ears of marine mammals, particularly orcas.

"Orcas had millions of years to optimize their sonar and it shows," said Onur Kilic, a postdoctoral researcher in electrical engineering. "They can sense sounds over a tremendous range of frequencies and that was what we wanted to do."

Kilic is the lead author of a paper about the research published in the Journal of the Acoustical Society of America earlier this year.

Read more...

Evolution To The Rescue

Evolution is usually thought to be a very slow process, something that happens over many generations, thanks to adaptive mutations. But environmental change due to things like climate change, habitat destruction, pollution, etc. is happening very fast. There are just two options for species of all kinds: either adapt to environmental change or become extinct. So, according to McGill biology professor Andrew Gonzalez, the question arises, "Can evolution happen quickly enough to help a species survive?" The answer, according to his most recent study, published in Science, is a resounding yes.

By using a long-armed robot working 24/7 over a period of several months, McGill Professors Graham Bell and Gonzalez were able to track the fate of over 2000 populations of baker's yeast for many generations. Yeast was chosen for the experiment because a lot is known about the genetic makeup of this model organism and because it can reproduce in a matter of hours. Bell and Gonzalez used the robot to subject different yeast populations to varying degrees of environmental stress in the form of salt, and so to study evolutionary rescue – the ability of a population to adapt rapidly through evolution – in real time.

What they observed was that the likelihood of evolutionary rescue depended on the severity and rate of change of the environment and the degree of prior exposure of populations to the environmental stressor (salt). The degree of isolation from neighboring populations also affected the capacity of the yeast populations to adapt through the accumulation of beneficial mutations. For a more detailed description of the findings, see below.*

Gonzalez and his team were in effect watching evolution at work. And what they discovered is that it can happen surprisingly fast, within 50 – 100 generations.

"The same general processes are occurring whether it's yeast or mammals," said Gonzalez. "At the end of the day we can't do the experiment with a panda or a moose, for example, because the time it would take to study their evolution is far longer than the time we have given the current rate of environmental change. At some point we have to work at the level of a model and satisfy ourselves that the basic reality we capture is sufficient to extrapolate from." While there has been theoretical work on the subject done in the past, this is the first time anyone has done a practical experiment of this kind, and shown evolutionary rescue at work.

Read more...

University of Minnesota Engineering Researchers Discover Source For Generating 'Green' Electricity

University of Minnesota engineering researchers in the College of Science and Engineering have recently discovered a new alloy material that converts heat directly into electricity. This revolutionary energy conversion method is in the early stages of development, but it could have wide-sweeping impact on creating environmentally friendly electricity from waste heat sources. Researchers say the material could potentially be used to capture waste heat from a car's exhaust that would heat the material and produce electricity for charging the battery in a hybrid car. Other possible future uses include capturing rejected heat from industrial and power plants or temperature differences in the ocean to create electricity. The research team is looking into possible commercialization of the technology.

"This research is very promising because it presents an entirely new method for energy conversion that's never been done before," said University of Minnesota aerospace engineering and mechanics professor Richard James, who led the research team."It's also the ultimate 'green' way to create electricity because it uses waste heat to create electricity with no carbon dioxide."

To create the material, the research team combined elements at the atomic level to create a new multiferroic alloy, Ni45Co5Mn40Sn10. Multiferroic materials combine unusual elastic, magnetic and electric properties. The alloy Ni45Co5Mn40Sn10 achieves multiferroism by undergoing a highly reversible phase transformation where one solid turns into another solid. During this phase transformation the alloy undergoes changes in its magnetic properties that are exploited in the energy conversion device.

During a small-scale demonstration in a University of Minnesota lab, the new material created by the researchers begins as a non-magnetic material, then suddenly becomes strongly magnetic when the temperature is raised a small amount. When this happens, the material absorbs heat and spontaneously produces electricity in a surrounding coil. Some of this heat energy is lost in a process called hysteresis. A critical discovery of the team is a systematic way to minimize hysteresis in phase transformations. The team's research was recently published in the first issue of the new scientific journal Advanced Energy Materials.

Read more...

UC San Diego Researchers Create Tool To Put The Lid On Solar Power Fluctuations

How does the power output from solar panels fluctuate when the clouds roll in? And can researchers predict these fluctuations? UC San Diego Professor Jan Kleissl and Matthew Lave, a Ph.D. student in the Department of Mechanical and Aerospace Engineering at the Jacobs School, have found the answer to these questions. They also have developed a software program that allows power grid managers to easily predict fluctuations in the solar grid caused by changes in the cloud cover. The program uses a solar variability law Lave discovered. The finding comes at a time when the Obama administration is pushing for the creation of a smart power grid throughout the nation. The improved grid would allow for better use of renewable power sources, including wind and solar.

Also, more utilities have been increasing the amount of renewable energy sources they use to power homes and businesses. For example, Southern California Edison reported this month that it is adding more large-scale solar power plants to its grid and retooling its distribution system to accommodate the power fluctuations that will follow.

Kleissl and Lave's finding could have a dramatic impact on the amount of solar power allowed to feed into the grid. Right now, because of concerns over variability in power output, the amount of solar power flowing in the grid at residential peak demand times—your typical sunny weekend afternoon in Southern California, say—is limited to 15 percent before utilities are required to perform additional studies. As operators are able to better predict a photovoltaic system's variability, they will be able to increase this limit. In California, a law signed by Gov. Jerry Brown in April 2011 requires all electricity retailers in the state, including publicly owned utilities, to generate 33 percent of their power sales from renewable energy sources by 2020.

Incidentally, Kleissl and Lave's research shows that the amount of solar variability can also be reduced by installing smaller solar panel arrays in multiple locations rather than building bigger arrays in just one spot, since a cloud covering one panel is less likely to cover the other panels, Lave said.

"The distance between arrays is key," he said.

The variability in the output of photovoltaic power systems has long been a source of great concern for utility operators worldwide. But Kleissl and Lave found that variability for large photovoltaic systems is much smaller than previously thought. It also can be modeled accurately, and easily, based on measurements from just a single weather station. Kleissl presented the paper, titled 'Modeling Solar Variability Effects on Power Plants,' this week at the National Renewable Energy Laboratory in Golden, Colo.

Read more...

Genius Of Einstein, Fourier Key To New Humanlike Computer Vision

Two new techniques for computer-vision technology mimic how humans perceive three-dimensional shapes by instantly recognizing objects no matter how they are twisted or bent, an advance that could help machines see more like people. The techniques, called heat mapping and heat distribution, apply mathematical methods to enable machines to perceive three-dimensional objects, said Karthik Ramani, Purdue University's Donald W. Feddersen Professor of Mechanical Engineering.

"Humans can easily perceive 3-D shapes, but it's not so easy for a computer," he said. "We can easily separate an object like a hand into its segments - the palm and five fingers - a difficult operation for computers."

Both of the techniques build on the basic physics and mathematical equations related to how heat diffuses over surfaces.

"Albert Einstein made contributions to diffusion, and 18th century physicist Jean Baptiste Joseph Fourier developed Fourier's law, used to derive the heat equation," Ramani said. "We are standing on the shoulders of giants in creating the algorithms for these new approaches using the heat equation."

As heat diffuses over a surface it follows and captures the precise contours of a shape. The system takes advantage of this "intelligence of heat," simulating heat flowing from one point to another and in the process characterizing the shape of an object, he said.
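
Numerically, heat flow on a discretized shape is governed by the shape's Laplacian, and letting simulated heat spread from a point traces out the local geometry. The Python toy below uses a six-vertex chain as a stand-in for a 3-D mesh; it is only a minimal illustration of the heat equation, not the Purdue team's heat-mapping or heat-distribution algorithms.

```python
# Minimal heat-diffusion sketch (not the Purdue algorithm): simulate heat
# spreading over a tiny graph via the heat kernel exp(-t * L) of its Laplacian.
import numpy as np
from scipy.linalg import expm

n = 6                                # a chain of six connected vertices as a toy "mesh"
adjacency = np.zeros((n, n))
for i in range(n - 1):
    adjacency[i, i + 1] = adjacency[i + 1, i] = 1.0
laplacian = np.diag(adjacency.sum(axis=1)) - adjacency

heat0 = np.zeros(n)                  # a unit of heat placed at vertex 0...
heat0[0] = 1.0
t = 0.5
heat_t = expm(-t * laplacian) @ heat0   # ...diffused for time t (solves dh/dt = -L h)

print(np.round(heat_t, 3))           # the heat profile reflects the chain's geometry
```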

Findings will be detailed in two papers being presented during the IEEE Computer Vision and Pattern Recognition conference on June 21-23 in Colorado Springs. The papers were written by Ramani, Purdue doctoral students Yi Fang and Mengtian Sun, and Minhyong Kim, a professor of pure mathematics at University College London.

A major limitation of existing methods is that they require "prior information" about a shape in order for it to be analyzed.

"For example, in order to do segmentation you have to tell the computer ahead of time how many segments the object has," Ramani said. "You have to tell it that you are expecting, say, 10 segments or 12 segments."

The new methods mimic the human ability to properly perceive objects because they don't require a preconceived idea of how many segments exist.

Read more...

Women In Science? Universities Don't Make The Grade

Despite years of trying to increase the number of women undergraduates in science and engineering, a new study shows most universities are failing. Not only are women lagging behind their male classmates, but efforts to close the gap too often focus on students instead of faculty and institutional structures. This is the first study to look at the full range of programs for undergraduate women in science and engineering in the U.S. It gathered information from nearly 50 different programs.

Researchers found ongoing issues with the atmosphere towards women in the classroom, the structure of academic programs, and poor faculty attitudes. The teaching environment, they found, "often portrays science and engineering as highly competitive, masculine domains." While many universities are committed to increasing the number of women pursuing these "elite fields," their programs too often focus on things such as peer mentoring -- instead of creating real structural change. The result, say the authors, is that universities are contributing to the ongoing wage gap between men and women, as well as the continuing dearth of skilled scientists and engineers in the United States. Gender divisions in college education are significant because people who pursue scientific careers usually receive an undergraduate degree in their field.

Mary Frank Fox, a professor in the School of Public Policy at the Georgia Institute of Technology, Gerhard Sonnert (Harvard) and Irina Nikiforova (Georgia Institute of Technology) conducted the study, which was funded by the National Science Foundation. The findings appear in the October issue of Gender & Society, a journal of Sociologists for Women in Society.

The paper points out that while women earn 58 percent of all undergraduate degrees in the U.S., when it comes to science and engineering they're still far behind men. In fact, women receive only 21 percent of degrees in the field of computer and information science, and only 19 percent of engineering degrees.

Fox and her co-authors found that university program directors believe women's self-confidence and their knowledge about careers in science were bigger obstacles than their academic ability. At the same time, a hostile classroom climate may be affecting students' self-confidence in science and engineering courses. Fox says the key issues facing undergraduate women were: a lack of supportive peer relationships, a lack of faculty advisors, unsupportive classroom climates, a lack of both faculty and administrative commitment to undergraduate women, and little attention paid to gender equity on campus.

Read more...

Thursday, September 29, 2011

Self-Cleaning Anodes Could Facilitate Cost-effective Coal-powered Fuel Cells

Using barium oxide nanoparticles, researchers have developed a self-cleaning technique that could allow solid oxide fuel cells to be powered directly by coal gas at operating temperatures as low as 750 degrees Celsius. The technique could provide a cleaner and more efficient alternative to conventional power plants for generating electricity from the nation's vast coal reserves. Solid oxide fuel cells can operate on a wide variety of fuels, and use hydrocarbon gases directly – without a separate reformer. The fuel cells rely on anodes made from nickel and a ceramic material known as yttria-stabilized zirconia. Until now, however, carbon-containing fuels such as coal gas or propane could quickly deactivate these Ni-YSZ anodes, clogging them with carbon deposits in a process known as "coking" – especially at lower operating temperatures.

To counter this problem, researchers have developed a technique for growing barium oxide nanostructures on the anodes. The structures adsorb moisture to initiate a water-based chemical reaction that oxidizes the carbon as it forms, keeping the nickel electrode surfaces clean even when carbon-containing fuels are used at low temperatures.
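
The release does not spell out the reaction, but a water-mediated carbon-removal step of this kind is commonly written as the carbon-steam (water-gas) reaction, shown here only as an illustrative possibility rather than the mechanism reported in the paper:

\[
\mathrm{C(s)} + \mathrm{H_2O(g)} \longrightarrow \mathrm{CO(g)} + \mathrm{H_2(g)}
\]

The CO and H2 produced could then be oxidized electrochemically at the anode along with the rest of the fuel, consistent with the carbon dioxide exhaust stream mentioned below.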

"This could ultimately be the cleanest, most efficient and cost-effective way of converting coal into electricity," said Meilin Liu, a Regents professor in the School of Materials Science and Engineering at the Georgia Institute of Technology. "And by providing an exhaust stream of pure carbon dioxide, this technique could also facilitate carbon sequestration without the separation and purification steps now required for conventional coal-burning power plants."

The water-mediated carbon removal technique was reported June 21 in the journal Nature Communications. The research was supported by the U.S. Department of Energy's Office of Basic Energy Sciences, through the HeteroFoaM Center, an Energy Frontier Research Center. The work also involved researchers from Brookhaven National Laboratory, the New Jersey Institute of Technology and Oak Ridge National Laboratory.

Conventional coal-fired electric generating facilities capture just a third of the energy available in the fuel they burn. Fuel cells can convert significantly more of the energy, approximately 50 percent. If gas turbines and fuel cells could be combined into hybrid systems, researchers believe they could capture as much as 80 percent of the energy, reducing the amount of coal needed to produce a given amount of energy, potentially cutting carbon emissions.
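
As a rough illustration of what those efficiency figures imply (a back-of-the-envelope estimate, not a number from the article): the coal burned per unit of electricity scales inversely with conversion efficiency, so going from roughly 33 percent to 80 percent efficiency would cut the coal, and hence the carbon dioxide, required for the same output by more than half:

\[
\frac{\text{coal at }80\%\ \text{efficiency}}{\text{coal at }33\%\ \text{efficiency}}
= \frac{1/0.80}{1/0.33} = \frac{0.33}{0.80} \approx 0.41 .
\]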

But that would only be possible if the fuel cells could run for long periods of time on coal gas, which now deactivates the anodes after as little as 30 minutes of operation.

The carbon removal system developed by the Georgia Tech-led team uses a vapor deposition process to apply barium oxide nanoparticles to the nickel-YSZ electrode. The particles, which range in size from 10 to 100 nanometers, form "islands" on the nickel that do not block the flow of electrons across the electrode surface.

Read more...

Nanoparticles Disguised As Red Blood Cells Will Deliver Cancer-fighting Drugs

Researchers at the University of California, San Diego have developed a novel method of disguising nanoparticles as red blood cells, which will enable them to evade the body's immune system and deliver cancer-fighting drugs straight to a tumor. Their research will be published next week in the online Early Edition of the Proceedings of the National Academy of Sciences. The method involves collecting the membrane from a red blood cell and wrapping it like a powerful camouflaging cloak around a biodegradable polymer nanoparticle stuffed with a cocktail of small molecule drugs. Nanoparticles are less than 100 nanometers in size, about the same size as a virus.

"This is the first work that combines the natural cell membrane with a synthetic nanoparticle for drug delivery applications." said Liangfang Zhang, a nanoeningeering professor at the UC San Diego Jacobs School of Engineering and Moores UCSD Cancer Center. "This nanoparticle platform will have little risk of immune response".

Researchers have been working for years on developing drug delivery systems that mimic the body's natural behavior for more effective drug delivery. That means creating vehicles such as nanoparticles that can live and circulate in the body for extended periods without being attacked by the immune system. Red blood cells live in the body for up to 180 days and, as such, are "nature's long-circulation delivery vehicle," said Zhang's student Che-Ming Hu, a UCSD Ph.D. candidate in bioengineering, and first author on the paper.

Stealth nanoparticles are already used successfully in clinical cancer treatment to deliver chemotherapy drugs. They are coated in a synthetic material such as polyethylene glycol that creates a protection layer to suppress the immune system so that the nanoparticle has time to deliver its payload. Zhang said today's stealth nanoparticle drug delivery vehicles can circulate in the body for hours compared to the minutes a nanoparticle might survive without this special coating.

But in Zhang's study, nanoparticles coated in the membranes of red blood cells circulated in the bodies of lab mice for nearly two days. The study was funded through a grant from the National Institutes of Health.

A shift towards personalized medicine

Using the body's own red blood cells marks a significant shift in focus and a major breakthrough in the field of personalized drug delivery research. Trying to mimic the most important properties of a red blood cell in a synthetic coating requires an in-depth biological understanding of how all the proteins and lipids function on the surface of a cell so that you know you are mimicking the right properties. Instead, Zhang's team is just taking the whole surface membrane from an actual red blood cell.

"We approached this problem from an engineering point of view and bypassed all of this fundamental biology," said Zhang. "If the red blood cell has such a feature and we know that it has something to do with the membrane -- although we don't fully understand exactly what is going on at the protein level -- we just take the whole membrane. You put the cloak on the nanoparticle, and the nanoparticle looks like a red blood cell."

Using nanoparticles to deliver drugs also reduces the hours it takes to slowly drip chemotherapy drug solutions through an intravenous line to just a few minutes for a single injection of nanoparticle drugs. This significantly improves the patient's experience and compliance with the therapeutic plan. The breakthrough could lead to more personalized drug delivery wherein a small sample of a patient's own blood could produce enough of the essential membrane to disguise the nanoparticle, reducing the risk of immune response to almost nothing.

Read more...

Discovery Of Parathyroid Glow Promises To Reduce Endocrine Surgery Risk

The parathyroid glands – four small organs the size of grains of rice located at the back of the throat – glow with a natural fluorescence in the near infrared region of the spectrum. This unique fluorescent signature was discovered by a team of biomedical engineers and endocrine surgeons at Vanderbilt University, who have used it as the basis of a simple and reliable optical detector that can positively identify the parathyroid glands during endocrine surgery.

The report of the discovery of parathyroid fluorescence and the design of the optical detector was published in the June issue of the Journal of Biomedical Optics.

Damage to these tiny organs can have deleterious, life-long effects on patients' health because they produce a hormone that controls critical calcium concentrations in bones, intestines and kidneys. However, the parathyroid glands are very difficult to identify with the naked eye. Not only are they small, but their location also varies widely from person to person and it takes a microscope to reliably tell the difference between parathyroid tissue and the thyroid and lymph tissue that surrounds it.

In 2004, more than 80,000 endocrine surgeries were performed in the United States and this number is projected to grow to more than 100,000 by 2020. Today, when a surgeon cuts into a patient's neck to remove a diseased thyroid, somewhere between 8 and 19 percent of the time the patient's parathyroid glands are also damaged or removed.

Parathyroid glow is surprisingly strong

"We have discovered that the parathyroid glands are two to 10 times more fluorescent in the near infrared than any other tissues found in the neck," said Professor of Biomedical Engineering Anita Mahadevan-Jansen, who directed the study. "We have taken measurements with more than 50 patients now and we have found this effect 100 percent of the time, even when the tissue is diseased. That is amazing. You almost never get 100 percent results in biological studies."

The fluorescence is so strong that it doesn't take expensive or sophisticated instruments to detect. The Vanderbilt researchers have assembled a detector from off-the-shelf hardware. It consists of a low-powered infrared laser connected to an optical fiber probe. As the fiber connected to the laser illuminates the tissue with invisible near infrared light, other fibers in the probe are connected to a detector that measures the strength of the fluorescence that the laser excites. The university has applied for an international patent that covers this application.

"I was certainly impressed with how accurate this method seems to be," said John Phay, an endocrine surgeon at the Ohio State University Medical Center, who collaborated in the study when he was at Vanderbilt. "The ability to detect the parathyroids would be a big help: The major problem in parathyroid surgery is finding them and it is very hard to avoid them in thyroid cancer surgery when you need to clear out lymph nodes."

Using the first generation of the device was "a bit burdensome, because you have to dim the lights," Phay commented. This will not be a problem with the next version, because it will include a filter that will block out visible light. According to the surgeon, the system will be the most useful with the planned addition of a camera that displays the fluorescence of all the tissues in the throat on a single display.

Project begins with curiosity of first-year surgery resident

The story of discovery began in 2007 when Lisa White, a first-year resident in the Vanderbilt surgery department, participated in her first neck surgery. "It was a very difficult case," White said. "We were looking for the parathyroid glands and they were very hard to find, although we finally did find them. After the surgery was over, I decided that we really need a better way of identifying parathyroid tissue."

This conclusion led White to conduct a literature search of the research that had been conducted on the basic physiology and biochemistry of the parathyroid. In the process she came across a paper written by Mahadevan-Jansen with another intern that described an optical technique that can detect liver cancer.

"I thought that if such a technique could detect the difference between normal and cancerous liver tissue, surely it could tell two different types of tissue apart," White said. So she decided to pay Mahadevan-Jansen a visit.

Read more...

Iowa State Hybrid Lab Combines Technologies To Make Biorenewable Fuels and Products

Laura Jarboe pointed to a collection of test tubes in her Iowa State University laboratory. Some of the tubes looked like they were holding very weak coffee. That meant microorganisms – in this case, Shewanella bacteria – were growing and biochemically converting sugars into hydrocarbons, said Jarboe, an Iowa State assistant professor of chemical and biological engineering.

Some of the sugars in those test tubes were produced by the fast pyrolysis of biomass. That's a thermochemical process that quickly heats biomass (such as corn stalks and leaves) in the absence of oxygen to produce a liquid product known as bio-oil and a solid product called biochar. The bio-oil can be used to manufacture fuels and chemicals; the biochar can be used to enrich soil and remove greenhouse gases from the atmosphere.

Iowa State's Hybrid Processing Laboratory on the first floor of the new, state-built Biorenewables Research Laboratory is all about encouraging that unique mix of biochemical and thermochemical technologies. The goal is for biologists and engineers to use the lab's incubators, reactors, gas chromatography instruments and anaerobic chambers to find new and better ways to produce biorenewable fuels and chemicals.

"Biological processes occur well below the boiling point of water, while thermal processes are usually performed hundreds of degrees higher, which makes it hard to imagine how these processes can be combined," said Robert C. Brown, an Anson Marston Distinguished Professor in Engineering, the Gary and Donna Hoover Chair in Mechanical Engineering, and the Iowa Farm Bureau Director of Iowa State's Bioeconomy Institute.

"In fact, these differences in operating regimes represent one of the major advantages of hybrid processing," Brown said. "High temperatures readily break down biomass to substrates that can be fermented to desirable products."

Jarboe's research is one example. She's trying to develop bacteria that can grow and thrive in the chemicals and compounds that make up bio-oil. That way, they can ferment the sugars from bio-oil with greater efficiency and produce more biorenewable fuels or chemicals.

Read more...

New Compact Microspectrometer Design Achieves High Resolution and Wide Bandwidth

A new microspectrometer architecture that uses compact disc-shaped resonators could address the challenges of integrated lab-on-chip sensing systems that now require a large off-chip spectrometer to achieve high resolution. Spectrometers have conventionally been expensive and bulky bench-top instruments used to detect and identify the molecules inside a sample by shining light on it and measuring different wavelengths of the emitted or absorbed light. Previous efforts toward miniaturizing spectrometers have reduced their size and cost, but these reductions have typically resulted in lower-resolution instruments.

"For spectrometers, it is better to be small and cheap than big and bulky -- provided that the optical performance targets are met," said Ali Adibi, a professor in the School of Electrical and Computer Engineering at the Georgia Institute of Technology. "We were able to achieve high resolution and wide bandwidth with a compact single-mode on-chip spectrometer through the use of an array of microdonut resonators, each with an outer radius of two microns."

The 81-channel on-chip spectrometer designed by Georgia Tech engineers achieved 0.6-nanometer resolution over a spectral range of more than 50 nanometers with a footprint less than one square millimeter. The simple instrument -- with its ultra-small footprint -- can be integrated with other devices, including sensors, optoelectronics, microelectronics and microfluidic channels for use in biological, chemical, medical and pharmaceutical applications.
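
Those figures are mutually consistent: if each of the 81 channels samples roughly one 0.6-nanometer resolution element, the channels together tile a band of about 49 nanometers, in line with the quoted spectral range of just over 50 nanometers (a simple sanity check, not a figure from the paper):

\[
81 \times 0.6\ \text{nm} \approx 48.6\ \text{nm}.
\]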

The microspectrometer architecture was described in a paper published on June 20 in the journal Optics Express. The research was supported by the Air Force Office of Scientific Research and the Defense Advanced Research Projects Agency.

"This architecture is promising because the quality-factor of the microdonut resonators is higher than that of microrings of the same size," said Richard Soref, a research scientist in the U.S. Air Force Research Laboratory at Hanscom Air Force Base who was not directly involved in the research. "Having such small resonators is also an advantage because they can be densely packed on a chip, enabling a large spectrum to be sampled."

Adibi's group is currently developing the next generation of these spectrometers, which are being designed to contain up to 1000 resonators and achieve 0.15-nanometer resolution with a spectral range of 150 nanometers and footprint of 200 micrometers squared.

Read more...

Device Could Improve Harvest Of Stem Cells From Umbilical Cord Blood

Johns Hopkins graduate students have invented a system to significantly boost the number of stem cells collected from a newborn's umbilical cord and placenta, so that many more patients with leukemia, lymphoma and other blood disorders can be treated with these valuable cells. The prototype is still in the testing stage, but initial results are promising. The student inventors have obtained a provisional patent covering the technology and have formed a company, TheraCord LLC, to further develop the technology, which may someday be used widely in hospital maternity units. The students say the need for this system is obvious.

"Cord blood, collected from the umbilical cord and placenta after live birth, is the most viable source of stem cells, yet over 90 percent is uncollected and discarded," the team members wrote in a presentation of their project at the university's recent Biomedical Engineering Design Day. "One of the main reasons valuable cord blood is so frequently discarded is because no adequate collection method exists."

The students say their easy-to-use invention, called the CBx System, could remedy these shortcomings.

When a baby is born, a few families pay for private collection and storage of the child's cord blood, in case its stem cells are needed to treat a future illness. When families do not choose this option, the materials containing cord blood are generally thrown away as medical waste. But at the 180 hospitals affiliated with public cord blood banks, new mothers can donate cord blood so that its stem cells can be extracted and used to rebuild the immune systems of seriously ill patients, particularly those with blood cancers such as leukemia, lymphoma and myeloma.

According to the Johns Hopkins students, the current method of collecting these cells from cord blood doesn't work well because it relies strictly on gravity. The National Marrow Donor Program says about 50 percent of the units collected in this way contain enough stem cells to be stored for transplant use. Another organization, the National Cord Blood Program, says only 40 percent of collected units meet transplantation standards. Even when the procedure is successful, the Johns Hopkins students said, the average collection yields only enough stem cells to treat a child but not enough to treat an adult patient, based on the recommended cell dosage.

To solve these problems, the students, who were enrolled in a master's degree program in the university's Center for Bioengineering Innovation and Design, spent the past year developing a new collection method that uses both mechanical forces and a chemical solution to help detach and flush more stem cells from the cord and placenta blood vessels.

"This is important for two reasons," said James Waring, a member of the student team. "First, we believe it collects enough cells from each birth so that stem-cell therapy can be used on adult patients, who need more cells."

In addition, in early testing on discarded cords and placentas at The Johns Hopkins Hospital, the team's device collected up to 50 percent more stem cells than the traditional gravity system, the students said.

Read more...

Berkeley Lab Tests Cookstoves for Haiti

The developers of the fuel-efficient Berkeley-Darfur Stove for refugee camps in central Africa are at it once again, this time evaluating inexpensive metal cookstoves for the displaced survivors of last year's deadly earthquake in Haiti. Scientists from the U.S. Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) have teamed up with students from the University of California (UC), Berkeley to run a series of efficiency tests comparing the traditional Haiti cookstove with a variety of low-cost, commercially available alternatives. The long-term goal is to find the safest and most fuel-efficient stove -- or to design a new one that would win favor with the cooks of Haiti -- and tap the resources of nonprofit aid organizations to subsidize its manufacture in local metal shops.

"A more efficient cookstove would not only save Haitian families and aid organizations money on fuel, but could also reduce pressure to cut down trees in this already heavily deforested island nation," says Haiti Stove Project leader Ashok Gadgil, director of the Environmental Energy Technologies Division at Berkeley Lab and the driving force behind development of the Berkeley-Darfur stove. "More efficient stoves that emit less carbon monoxide and smoke could also help reduce the adverse impacts of these emissions on the health of the cooks in Haiti, who are mostly women."

The Haitian government estimates that 316,000 people were killed and more than 1 million made homeless by the January 12, 2010 magnitude 7.0 quake that left the capital city of Port-au-Prince in ruins (although some international organizations estimate the casualties to be lower). That suffering and devastation was readily apparent when Gadgil sent a team to Haiti three months after the quake on a mission to evaluate the need for cookstoves among survivors.

Their findings underscored both the promise and challenges facing any attempt to apply the Darfur cookstove experience to the Haitian situation. "The Darfur stove is a wood-burning stove. It didn't work as well in Haiti, where most people cook with charcoal," says UC Berkeley combustion engineering graduate student Katee Lask, who is supervising the stove-testing. "Since there were already so many charcoal stoves on the market, we decided to look at the ones that were already being disseminated there and provide an unbiased assessment. This is valuable information for the nongovernmental organizations, or NGOs, who do not have the technical capacity for assessment of efficiency and emissions."

The team brought back for testing a traditional Haiti stove, which is typically fabricated with perforated sheet metal, and several "improved" commercial designs also available there. In a scientific kitchen set up with a fume hood in a warehouse at Berkeley Lab, the performance of the traditional stove was compared with that of four alternatives made of metal or metal-ceramic combinations. UC Berkeley undergraduates carried out most of the combustion efficiency tests. One set of experiments matched the five charcoal stoves' performance at the simple task of boiling water; a second set of experiments involves cooking a traditional Haitian meal of beans and rice.

Read more...

NIST Polishes Method For Creating Tiny Diamond Machines

Diamonds may be best known as a symbol of long-lasting love. But semiconductor makers are also hoping they'll pan out as key components of long-lasting micromachines if a new method developed at the National Institute of Standards and Technology (NIST) for carving these tough, capable crystals proves its worth.* The method offers a precise way to engineer microscopic cuts in a diamond surface, yielding potential benefits in both measurement and technological fields. By combining their own observations with background gleaned from materials science, NIST semiconductor researchers have found a way to create unique features in diamond -- potentially leading to improvements in nanometrology in short order, as it has allowed the team to make holes of precise shape in one of the hardest known substances. But beyond the creation of virtually indestructible nanorulers, the method could one day lead to the improvement of a class of electronic devices useful in cell phones, gyroscopes and medical implants.

Well known for making the hugely complex electronic microchips that run our laptops, the semiconductor industry has expanded its portfolio by fabricating tiny devices with moving parts. Constructed with substantially the same techniques as the electronic chips, these "micro-electromechanical systems," or MEMS, are just a few micrometers in size. They can detect environmental changes such as heat, pressure and acceleration, potentially enabling them to form the basis of tiny sensors and actuators for a host of new devices. But designers must take care that tiny moving parts do not grind to a disastrous halt. One way to make the sliding parts last longer without breaking down is to make them from a tougher material than silicon.

"Diamond may be the ideal substance for MEMS devices," says NIST's Craig McGray. "It can withstand extreme conditions, plus it's able to vibrate at the very high frequencies that new consumer electronics demand. But it's very hard, of course, and there hasn't been a way to engineer it very precisely at small scales. We think our method can accomplish that."

The method uses a chemical etching process to create cavities in the diamond surface. The cubic shape of a diamond crystal can be sliced in several ways -- a fact jewelers take advantage of when creating facets on gemstones. The speed of the etching process depends on the orientation of the slice, occurring at a far slower rate in the direction of the cube's "faces" -- think of chopping the cube into smaller cubes -- and these face planes can be used as a sort of boundary where etching can be made to stop when desired. In their initial experiments, the team created cavities ranging in width from 1 to 72 micrometers, each with smooth vertical sidewalls and a flat bottom.

Read more...

Instead Of Defibrillator's Painful Jolt, There May Be A Gentler Way To Prevent Sudden Death

Each year in the United States, more than 200,000 people have a cardiac defibrillator implanted in their chest to deliver a high-voltage shock to prevent sudden cardiac death from a life-threatening arrhythmia. While it's a necessary and effective preventive therapy, those who've experienced a defibrillator shock say it's painful, and some studies suggest that the shock can damage heart muscle. Scientists at Johns Hopkins believe they have found a kinder and gentler way to halt the rapid and potentially fatal irregular heart beat known as ventricular fibrillation. In a study published in the September 28 issue of Science Translational Medicine, they report success using lower amplitude, high-frequency alternating current at 100-200 Hz to stop the arrhythmia in the laboratory. They say this approach also may prove to be less painful for patients because of the lower amplitude and different frequency range than what is used for standard defibrillator shocks.

"We believe we have found a way to stop a life-threatening arrhythmia by applying a high-frequency alternating current for about one-third of a second," says Ronald Berger, M.D., Ph.D., a cardiac electrophysiologist at the Johns Hopkins Heart and Vascular Institute and a professor of medicine and biomedical engineering at the Johns Hopkins University School of Medicine. "The alternating current puts the disorganized, rapidly moving heart cells in a refractory state, like suspended animation. When we turn off the current, the cells immediately return to a normal state. If further research confirms what we have learned so far, this could be less painful for a patient while achieving the same result," says Berger, who is the senior author of the study.

Graduate student Seth Weinberg, a co-lead author of the study, says the way heart cells behave during ventricular fibrillation is like having a football stadium full of fans, all of whom are doing "the wave" in an uncoordinated, disorganized fashion. "Applying the alternating current," he says, "is like freezing all of the fans in a position halfway between sitting and standing. When the current is turned off, the fans sit down in an orderly way, ready to be instructed to do the wave in a coordinated way."

Berger says he and his colleagues, a team of Johns Hopkins cardiologists and biomedical engineers, have shown a proof of principle and a novel scientific finding: It's the first time heart cells have been put in a suspended state to interrupt ventricular fibrillation.

"The idea to put heart cells in a brief state of suspended animation came from studies showing that alternating current could be used to put nerve cells in a similar state to block the signals that cause pain," says Harikrishna Tandri, M.D., assistant professor of medicine and the other co-lead author of the study.

To ensure that they were correctly assessing the response of the heart cells to the high frequency current, and, at the same time, distinguishing the response from the cells' native electrical activity, the researchers used a technique called optical mapping. Unlike other electrical recording techniques, optical mapping measurements are not affected by applied electrical stimuli, according to co-author Leslie Tung, Ph.D., professor of biomedical engineering, who led the optical mapping aspect of the research.

In order to allow the team to explore the response of individual heart cells to the high-frequency electrical current, co-author Natalia Trayanova, Ph.D., professor of biomedical engineering, produced a multi-scale computational model of the heart.

While more testing is needed in animal models, the researchers are optimistic that their work may lead to a new approach to shock the human heart back to a normal rhythm. "We are ultimately hoping to develop a device that, instead of delivering a painful, high-voltage shock when it detects a life-threatening arrhythmia, applies a more gentle alternating current for the right amount of time to stop the dangerous rhythm. We think that would be a great benefit to the millions of people worldwide who have a defibrillator to prevent sudden death," Berger says.

Read more...

Correcting Sickle Cell Disease With Stem Cells

Using a patient's own stem cells, researchers at Johns Hopkins have corrected the genetic alteration that causes sickle cell disease (SCD), a painful, disabling inherited blood disorder that affects mostly African-Americans. The corrected stem cells were coaxed into immature red blood cells in a test tube that then turned on a normal version of the gene. The research team cautions that the work, done only in the laboratory, is years away from clinical use in patients, but should provide tools for developing gene therapies for SCD and a variety of other blood disorders.

In an article published online August 31 in Blood, the researchers say they are one step closer to developing a feasible cure or long-term treatment option for patients with SCD, which is caused by a single DNA letter change in the gene for adult hemoglobin, the principal protein in red blood cells needed to carry oxygen. In people who inherit two copies -- one from each parent -- of the genetic alteration, the red blood cells are sickle-shaped rather than round. The misshapen red blood cells clog blood vessels, leading to pain, fatigue, infections, organ damage and premature death.

Although there are drugs and painkillers that control SCD symptoms, the only known cure -- achieved rarely -- has been bone marrow transplant. But because the vast majority of SCD patients are African-American and few African-Americans have registered in the bone marrow registry, it has been difficult to find compatible donors, says Linzhao Cheng, Ph.D., a professor of medicine and associate director for basic research in the Division of Hematology and also a member of the Johns Hopkins Institute for Cell Engineering. "We're now one step closer to developing a combination cell and gene therapy method that will allow us to use patients' own cells to treat them."

Using one adult patient at The Johns Hopkins Hospital as their first case, the researchers first isolated the patient's bone marrow cells. After generating induced pluripotent stem (iPS) cells -- adult cells that have been reprogrammed to behave like embryonic stem cells -- from the bone marrow cells, they put one normal copy of the hemoglobin gene in place of the defective one using genetic engineering techniques.

The researchers sequenced the DNA from 300 different samples of iPS cells to identify those that contained correct copies of the hemoglobin gene and found four. Three of these iPS cell lines didn't pass muster in subsequent tests.

"The beauty of iPS cells is that we can grow a lot of them and then coax them into becoming cells of any kind, including red blood cells," Cheng said.

In their process, his team converted the corrected iPS cells into immature red blood cells by giving them growth factors. Further testing showed that the normal hemoglobin gene was turned on properly in these cells, although at less than half of normal levels. "We think these immature red blood cells still behave like embryonic cells and as a result are unable to turn on high enough levels of the adult hemoglobin gene," explains Cheng. "We next have to learn how to properly convert these cells into mature red blood cells."

Read more...

Wednesday, September 28, 2011

Science Explains Ancient Copper Artifacts

Northwestern University researchers ditched many of their high-tech tools and turned to large stones, fire and some old-fashioned elbow grease to recreate techniques used by Native American coppersmiths who lived more than 600 years ago. This prehistoric approach to metalworking was part of a metallurgical analysis of copper artifacts left behind by the Mississippians of the Cahokia Mounds, who lived in southwestern Illinois from 700 until 1400 A.D. The study was published in the Journal of Archaeological Science in May.

The researchers were able to identify how the coppersmiths of Cahokia likely set up their workshop and the methods and tools used to work copper nuggets into sacred jewelry, headdresses, breastplates and other regalia.

"Metals store clues within their structure that can help explain how they were processed," said David Dunand, the James N. and Margie M. Krebs Professor of materials science and engineering at Northwestern's McCormick School of Engineering and Applied Science and co-author of the paper. "We were lucky enough to analyze small, discarded pieces of copper found on the ground of the excavated 'copper workshop house in Cahokia and determine how the metal was worked by the Cahokians."

Two materials science and engineering students conducted much of the research. Matt Chastain, a Northwestern undergraduate at the time of the study, worked alongside Alix Deymier-Black, a graduate student in the materials science and engineering department. Chastain, first author of the paper, undertook the metallurgical analysis of the samples, supplied from ongoing excavations at Mound 34 in Cahokia. Chastain followed up his analysis by volunteering at the excavation site.

"We cut through some samples of the copper pieces and polished them to look at the grain structures of the copper with a microscope," said Deymier-Black, second author of the paper. "From the size, shape and features of the grains inside the copper, we determined that the coppersmiths were likely hammering the copper, probably with a heavy rock, then putting the copper in the hot coals of a wood fire for five to 10 minutes to soften it and repeating the cycle until they had created a thin sheet of copper. "

Read more...

Researchers Record Two-state Dynamics In Glassy Silicon

Using high-resolution imaging technology, University of Illinois researchers have answered a question that had confounded semiconductor researchers: Is amorphous silicon a glass? The answer? Yes – until hydrogen is added. Led by chemistry professor Martin Gruebele, the group published its results in the journal Physical Review Letters.

Amorphous silicon (a-Si) is a semiconductor popular for many device applications because it is inexpensive and can be created in a flexible thin film, unlike the rigid, brittle crystalline form of silicon. But the material has its own unusual qualities: It seems to have some characteristics of glass, but cannot be made the way other glasses are.

Most glasses are made by rapidly cooling a melted material so that it hardens in a random structure. But cooling liquid silicon simply results in an orderly crystal structure. Several methods exist for producing a-Si from crystalline silicon, including bombarding a crystal surface so that atoms fly off and deposit on another surface in a random position.

To settle the debate on the nature of a-Si, Gruebele's group, collaborating with electrical and computer engineering professor Joseph Lyding's group at the Beckman Institute for Advanced Science and Technology, used a scanning tunneling microscope to take sub-nanometer-resolution images of a-Si surfaces, stringing them together to make a time-lapse video.

The video shows a lumpy, irregular surface; each lump is a cluster about five silicon atoms in diameter. Suddenly, between frames, one bump seems to jump to an adjoining space. Soon, another lump nearby shifts neatly to the right. Although few of the clusters move, the action is obvious.

Such cluster "hopping" between two positions is known as two-state dynamics, a signature property of glass. In a glass, the atoms or molecules are randomly positioned or oriented, much the way they are in a liquid or gas. But while atoms have much more freedom of motion to diffuse through a liquid or gas, in a glass the molecules or atom clusters are stuck most of the time in the solid. Instead, a cluster usually has only two adjoining places that it can ferry between.
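To make the idea of two-state dynamics concrete, the sketch below is a toy Monte Carlo model, not the group's analysis code: a single cluster occupies one of two adjoining sites and hops to the other with a small probability in each frame, much as the clusters appear to do between frames of the time-lapse video. The hop probability, frame count and random seed are arbitrary illustrative values.

    import random

    def simulate_two_state_hopping(frames=1000, p_hop=0.02, seed=1):
        """Toy model: a cluster sits on one of two adjoining sites (0 or 1)
        and hops to the other site with probability p_hop in each frame."""
        random.seed(seed)
        site = 0
        trajectory = [site]
        for _ in range(frames):
            if random.random() < p_hop:
                site = 1 - site  # hop to the other adjoining position
            trajectory.append(site)
        return trajectory

    traj = simulate_two_state_hopping()
    hops = sum(1 for a, b in zip(traj, traj[1:]) if a != b)
    print(f"{hops} hops in {len(traj) - 1} frames")

Counting how often the simulated cluster changes site mirrors what the researchers do visually when they spot a lump jumping between frames.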

"This is the first time that this type of two-state hopping has been imaged in a-Si," Gruebele said. "It's been predicted by theory and people have inferred it indirectly from other measurements, but this is the first time we're been able to visualize it."

The group's observations of two-state dynamics show that pure a-Si is indeed a glass, in spite of its unorthodox manufacturing method. However, a-Si is rarely used in its pure form; hydrogen is added to make it more stable and improve performance.

Researchers have long assumed that hydrogenation has little to no effect on the random structure of a-Si, but the group's observations show that this assumption is not quite correct. In fact, adding hydrogen robs a-Si of its two-state dynamics and its categorization as a glass. Furthermore, the surface is riddled with signs of crystallization: larger clusters, cracks and highly structured patches.

Read more...

ORNL Package Tracking System Takes Social Media To New Heights

What has made the Internet such a success could help change the way high-dollar and hazardous packages are tracked, according to Randy Walker of the Department of Energy's Oak Ridge National Laboratory. Tracking 2.0, an ORNL system being developed by a team led by Walker, provides a clear start-to-finish view as an item moves to its destination, thereby eliminating the problem of proprietary and often incompatible databases used by various shippers. The system is the culmination of many years of research.

"Tracking 2.0 leverages eight years of ORNL research into supply chain infrastructure and test bed collaborations with state and local first responders, multi-modal freight service providers, private sector shippers and federal and international government partners," Walker said.

With Tracking 2.0, users will be able to share tracking data using existing tracking systems and leverage legacy and emerging technologies without having to retool the enterprise systems. In addition, users can deploy low-cost quick-to-market custom tools that combine proven security practices with emerging social computing technologies to network otherwise incompatible systems.

All codes translate to Uniform Resource Locators that point to tracking information. This address takes on the role of a permanent and unique "Virtual Resource Identifier," but does not require a priori agreement on a universal standard by all the stakeholders, which Walker described as "a difficult and open-ended process."

The system offers the ability to dynamically incorporate and associate searchable user-defined tags to the Virtual Resource Identifier. These tags are contributed incrementally by the various partners involved in the progress of the shipment, but they do not interfere with the seamless operation of the whole system.
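As a rough illustration of the idea, and not ORNL's actual data model, the sketch below treats a Virtual Resource Identifier as a permanent URL plus a growing set of searchable, partner-contributed tags. The URL, partner names and tag keys are invented for the example.

    from dataclasses import dataclass, field

    @dataclass
    class VirtualResourceIdentifier:
        """A permanent URL standing in for the shipment, plus user-defined
        tags added incrementally by the partners handling it."""
        url: str
        tags: dict = field(default_factory=dict)

        def add_tag(self, partner: str, key: str, value: str) -> None:
            self.tags.setdefault(partner, {})[key] = value

    # Hypothetical usage: all values are illustrative only.
    vri = VirtualResourceIdentifier(url="https://tracking.example.org/item/0001")
    vri.add_tag("origin-shipper", "status", "departed origin hub")
    vri.add_tag("customs-broker", "inspection", "cleared")
    print(vri.url, vri.tags)

Because each partner only appends its own tags to the shared identifier, no partner's additions disturb what the others have recorded, which is the point of the incremental, non-interfering design described above.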

Read more...

New Software 'Hearing Dummies' Pave The Way For Tailor-made Hearing Aids

New software 'hearing dummies' are part of cutting-edge research that promises to revolutionise the diagnosis and treatment of hearing impairments. The work could also be used in the long-term to develop a radical new type of hearing aid that can be customised using the hearing dummy to meet the different needs of individual patients. If the procedures gain clinical acceptance, a device could reach the market within 4 years.

The research is being carried out by a team at the University of Essex with funding from the Engineering and Physical Sciences Research Council (EPSRC).

The aim has been to enable hearing aids to be carefully calibrated so that they address the particular underlying hearing condition affecting each individual patient; and to ensure that they tackle the most common problem affecting hearing-impaired people – sound interference, which leads to an inability to follow conversations in noisy environments.

People also differ in how much they are affected by noisy environments, which is why developing a tailor-made approach represents such a significant breakthrough.

"Today's hearing aids don't help to separate sounds – they just amplify them," says Professor Ray Meddis, of the University's Department of Psychology, who has led the work. "So they often make everything too noisy for the wearer, especially in social situations like parties, and some wearers still can't make out what people are saying to them. They find the whole experience so uncomfortable that they end up taking their hearing aids out! This discourages them from going to social occasions or busy environments and may result in them withdrawing from society."

The first key advance has been the development of unique computer models (or 'hearing dummies') that can use the information collected during the tests to simulate the precise details of an individual patient's hearing.

Read more...

Simple Analysis Of Breathing Sounds While Awake Can Detect Obstructive Sleep Apnea

The analysis of breathing sounds while awake may be a fast, simple and accurate screening tool for obstructive sleep apnea, suggests a research abstract that will be presented Monday, June 13, in Minneapolis, Minn., at SLEEP 2011, the 25th Anniversary Meeting of the Associated Professional Sleep Societies LLC (APSS). Results show that several sound features of breathing were statistically significant between participants with obstructive sleep apnea and healthy controls. In an analysis that combined the two most significant sound features, the presence or absence of OSA was predicted with more than 84-percent accuracy. Sound analysis also allowed for the stratification of OSA severity.

According to the authors, people with OSA tend to have a narrower and more collapsible pharynx with more negative pharyngeal pressure, which creates greater resistance when breathing through the nose. Breathing sounds are directly related to pharyngeal pressure, making sound analysis a viable diagnostic option for OSA.

"Despite being able to breathe at the same high flow rate, the pharyngeal pressure in people with OSA during wakefulness is usually more negative than that in the non-OSA group," said principal investigator and lead author Zahra Moussavi, PhD, professor and Canada Research Chair on Biomedical Engineering at the University of Manitoba in Winnipeg, Canada.

Moussavi and co-investigator Aman Montazeri studied 35 patients with varying severity levels of OSA and 17 age-matched controls. The presence or absence of OSA was validated by full-night polysomnography.

The subjects were instructed to breathe through their nose at their normal breathing level for at least five breaths and then breathe at their maximum flow level for another five breaths. Then the process was repeated as they breathed through their mouth with a nose clip in place. The breathing sounds were picked up by a microphone placed over the neck, and the recordings were repeated in two body positions: sitting upright and lying on the back. Data were digitized and then analyzed using spectral and waveform fractal dimension techniques.
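The abstract does not specify which spectral and fractal measures were used, but the sketch below shows two generic features of the kind described: a spectral centroid and Katz's waveform fractal dimension, computed here on a synthetic signal standing in for a recorded breath sound. The sampling rate and test signal are assumptions for illustration only.

    import numpy as np

    def spectral_centroid(signal, fs):
        """Frequency 'centre of mass' of the signal's power spectrum, in Hz."""
        spectrum = np.abs(np.fft.rfft(signal)) ** 2
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
        return float(np.sum(freqs * spectrum) / np.sum(spectrum))

    def katz_fractal_dimension(signal):
        """Katz's fractal dimension of the waveform treated as a planar curve."""
        x = np.asarray(signal, dtype=float)
        idx = np.arange(len(x))
        steps = np.sqrt(np.diff(idx) ** 2 + np.diff(x) ** 2)   # successive point distances
        total_length = steps.sum()
        max_extent = np.sqrt((idx - idx[0]) ** 2 + (x - x[0]) ** 2).max()
        n = len(steps)
        return float(np.log10(n) / (np.log10(n) + np.log10(max_extent / total_length)))

    fs = 8000  # assumed sampling rate, Hz
    t = np.arange(0, 1.0, 1.0 / fs)
    segment = np.sin(2 * np.pi * 300 * t) + 0.3 * np.random.randn(len(t))  # stand-in signal
    print(spectral_centroid(segment, fs), katz_fractal_dimension(segment))

In a study of this kind, features like these would be computed for each breathing manoeuvre and position, then compared statistically between the OSA and control groups.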

Moussavi added that detecting OSA through sound analysis could become an attractive alternative to the more costly and labor-intensive method of performing overnight polysomnography.

"If we can predict the likelihood of apnea and its severity with the same accuracy as in our pilot study, it will have a significant impact on health-care costs as it can reduce the need for full-night sleep assessment significantly," she said.

The study was supported by the Natural Sciences and Engineering Research Council of Canada and TRLabs Winnipeg, where Moussavi is an adjunct scientist.

Read more...

UK Power Networks Recycles 97 Per Cent Of Building Demolition Rubble At Whitechapel Construction Site

The Site Waste Management Plan Regulations 2008 encourage recycling and the Landfill Tax has become a significant factor in encouraging companies to use transfer stations.

UK Power Networks has announced that its construction site in Whitechapel has recycled 97 per cent of building demolition rubble taken off the site during an electricity substation redevelopment.

The company, in partnership with its demolition contractors Erith Contractors Ltd, recycled 18,000 tonnes of waste – equivalent to 967 lorry loads – during demolition of the existing turbine hall, basement, store rooms, offices and garage in preparation for the construction of a new substation.

A new electricity substation will now be constructed at the existing site containing transformers, switchgear and cables delivering electricity to homes and businesses in London. The project is set to continue until 2015 and will strengthen the local electricity network.

Clive Steed, sustainability manager at UK Power Networks, said: “We believe in working with our contractors to try to minimise our environmental impact and reduce the amount of waste sent to landfill. While we cannot achieve this level of recycling with all our construction projects, this case highlights our commitment to sustainability and shows what can be achieved.”

Stuart Accleton, associate director at Erith Contractors Ltd, said: “Targets to reduce greenhouse gas emissions, along with fuel costs, make using local recycling facilities beneficial environmentally and commercially. Sustainability is at the heart of everything we do and it has always been our business to make money out of other people’s waste. With new technology, better equipment and higher volumes we are achieving greater returns on recycling and reuse than ever before.

“The level of landfill tax within the UK currently stands at £56 per tonne, rising to £64 per tonne on April 1, 2012, £72 per tonne on April 1, 2013 and £80 per tonne on April 1, 2014. By using waste transfer stations that do not incur landfill tax we generate savings for the project. With sites generally in close proximity to our projects we are also reducing fuel costs and saving time. Recycling is beneficial to us commercially and environmentally, releasing fewer emissions.”
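For a rough sense of scale only, and assuming (which the article does not state) that all 18,000 tonnes of demolition waste would otherwise have gone to landfill at the 2011 rate quoted above, the tax alone would have come to around £1 million:

    tonnes_recycled = 18_000      # quantity recycled, from the article
    landfill_tax_2011 = 56        # pounds per tonne, 2011 rate quoted above
    tax_avoided = tonnes_recycled * landfill_tax_2011
    print(f"Illustrative upper bound on landfill tax avoided: £{tax_avoided:,}")  # £1,008,000

The real saving depends on transfer-station charges and how much of the material would genuinely have been landfilled, so this is an upper bound, not a figure from the project.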

Read more...

SE Controls' SECO 20 Chain Actuators Specified For Highbury College

Highbury College, located in Portsmouth, has seen a multi-million pound investment in recent years and can now boast ownership of some of the most sustainable buildings in Portsmouth.

SECO 20 Chain Actuators from SE Controls were specified to form part of the natural ventilation system being installed into the main building. These actuators have a proven long life expectancy and have been used extensively in window automation requirements on many UK projects. In total 82 actuators are linked back to the building's BMS, which maintains air quality and temperature within the classrooms and main communal areas of the college.

SE Controls were specified on the project as they were recommended by the window specialist - the SECO 20 Chain Actuator places a low power drain on the BMS system compared to other actuators currently available. By choosing low-amperage actuators, cost savings can be made both in the fixed hardware required for control and in energy consumption over time.

The project required SE Controls to install the actuators on site and ensure that they were commissioned ready to be wired into the BMS system by others. Whilst SE Controls can offer a wide range of services from design through to installation and on-going maintenance, the flexibility of the services provided allows the company to quote for any window automation requirements and to closely liaise with other specialists on site.

Read more...

New Yellow Multi Stock Brick For The Refurbishment Market From Wienerberger

Expected to be popular in the refurbishment market or projects where an old-style appearance is required, Wienerberger's Smeed Dean Reclaimed Yellow mixture is a genuine alternative to reclaimed bricks, having the same aesthetic look as a pre-used product but with the technical guarantees of its modern counterpart.

The mixture of the weathered yellow brick combined with black and white accents is particularly popular across London and the South East of England where buildings have traditionally been built using yellow brick stocks.

Wienerberger's Smeed Dean Reclaimed Yellow mixture is a tumbled weathered multi stock with 5 per cent of the bricks painted black and 5 per cent painted white to create a beautiful, mottled appearance, ideal as a bespoke heritage blend.


The Smeed Dean Reclaimed Yellow Mixture is the latest to be produced at the Smeed Dean factory in Sittingbourne, the last remaining brickworks in Kent. The works has been one of the most prolific producers of yellow stock bricks for more than 200 years, and its bricks have featured in some of London’s most memorable buildings, including Buckingham Palace.

Read more...

Owlett-Jaton's New PPE and Workwear Range Is More Comprehensive and Diverse

At more than double the size of the company's original personal protective equipment (PPE) and workwear portfolio, Owlett-Jaton's new range offers customers more choice, flexibility and competitive prices when it comes to meeting their health and safety requirements.

The brand new and improved range features over 500 product lines from world-leading brands including 3M, Ansell, DeWalt and Uvex. As part of Owlett-Jaton’s service promise, the range is available via next day delivery with no minimum order restrictions.

To celebrate the new range, Owlett-Jaton is launching a dedicated PPE and workwear product guide. As well as product showcases, the guide highlights Owlett-Jaton’s popular product branding service, where customers can follow three simple steps to add a company logo and personalise the product they are ordering quickly, and cost effectively. It is now possible for customers to create a bespoke product guide from the Owlett-Jaton master catalogue.

Tony Williams, national sales manager (Jaton), explained: “As part of our dedication to being a ‘one-stop-shop’ it is crucial for us to continually develop our product range, offering customers more choice and flexibility. Over the past 12 months we have added some widely used and respected lines to our portfolio, including abrasives, sealants and adhesives, and ACE clamping products; we now have over 30,000 products available – our biggest ever offering. It’s our intention to continue with new product introduction over the coming months.

“The new PPE and workwear range is far more comprehensive and diverse than we’ve seen before. There are different levels of product available, from the cost-effective basic kit needed for one-off and low risk jobs, to high quality specialist gear that offers additional style and comfort features – something that may need to be considered if it is going to be worn over a long period.”

Read more...

RonaDeck Resin Bound Surfacing Provides An Attractive Pavement For Westfield Stratford City

Westfield Stratford City in East London is located adjacent to the Olympic Park and will form the prestigious gateway to the Olympic Games in 2012. Built at a cost of £1.8 billion, it contains 1.9 million square feet of retail and leisure space. There are 250 shops and over 70 places to eat and drink in the complex, which includes a 17-screen, all-digital Vue cinema, a 14-lane All Star Lanes bowling alley and the UK’s largest casino. A pedestrianised street runs through the development, linking John Lewis at one end with Marks & Spencer at the other.

The largest urban shopping centre in Europe, Westfield Stratford City retail and leisure development in Stratford is also the most environmentally friendly shopping complex in Britain.

Cemplas were employed to lay the surfacing element of the hard landscaping package, a key feature in the complex that covers over 3,000 m². The surfacing system used was RonaDeck Resin Bound Surfacing, a permeable paving system consisting of blended aggregates bound in a UV-resistant elastic resin.

RonaDeck Resin Bound Surfacing provides an attractive porous pavement for pedestrians and light vehicle traffic. The inbuilt porosity prevents surface ponding of rainwater and allows water to drain naturally, feeding plants, natural watercourses or manmade drainage systems.

The system is SUDS compliant, in keeping with the environmental aims of Westfield. The elasticity of the resin means that there is a ‘give’ to the surface, which makes it more comfortable to walk on and provides resilience to localised pressure that could damage surfacings with more brittle binders.

Golden Harvest aggregate was selected for the walkway from Ronacrete’s range of natural aggregates, each formulated for strength and appearance. The ultraviolet stability of the resin ensures that as the surfacing ages, the appearance of the aggregates will not be spoilt by the discolouration that is characteristic of cheaper resin systems.

Cemplas applied the RonaDeck Resin Bound Surfacing system to an asphalt substrate which was laid to form a firm base. The RonaDeck Resin Bound Surfacing system is supplied as a two component polyurethane resin together with a blend of kiln-dried aggregates for on-site mixing in a forced action mixer. Cemplas first mixed the two component resin with a drill and paddle before adding the mixed resin to Golden Harvest aggregate in the high capacity forced action mixer. As soon as the aggregate was fully coated with resin, the mixed material was discharged from the mixer and trowel applied at a thickness of 18mm onto the prepared asphalt surface. The finished system was ready for foot traffic within four hours of being laid.
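As a rough worked example only, assuming the full 3,000 m² area quoted above was laid at the 18 mm thickness described, the volume of mixed resin and aggregate required would be on the order of 54 cubic metres:

    area_m2 = 3000          # approximate surfaced area from the article
    thickness_m = 0.018     # 18 mm application thickness
    volume_m3 = area_m2 * thickness_m
    print(f"Approximate volume of mixed material: {volume_m3:.0f} m^3")  # ~54 m^3

Actual material quantities would also depend on substrate tolerances and wastage, so this is an illustration of the scale of the job rather than a specification.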

Read more...

Tuesday, September 27, 2011

New Parallelization Technique Boosts Our Ability To Model Biological Systems

Researchers at North Carolina State University have developed a new technique for using multi-core chips more efficiently, significantly enhancing a computer's ability to build computer models of biological systems. The technique improved the efficiency of algorithms used to build models of biological systems more than seven-fold, creating more realistic models that can account for uncertainty and biological variation. This could impact research areas ranging from drug development to the engineering of biofuels. Computer models of biological systems have many uses, from predicting potential side-effects of new drugs to understanding the ability of plants to adjust to climate change. But developing models for living things is challenging because, unlike machines, biological systems can have a significant amount of uncertainty and variation.

"When developing a model of a biological system, you have to use techniques that account for that uncertainty, and those techniques require a lot of computational power," says Dr. Cranos Williams, an assistant professor of electrical engineering at NC State and co-author of a paper describing the research. "That means using powerful computers. Those computers are expensive, and access to them can be limited.

"Our goal was to develop software that enables scientists to run biological models on conventional computers by utilizing their multi-core chips more efficiently."

The brain of a computer chip is its central processing unit, or "core." Most personal computers now use chips that have between four and eight cores. However, most programs only operate in one core at a time. For a program to utilize all of these cores, it has to be broken down into separate "threads" – so that each core can execute a different part of the program simultaneously. The process of breaking down a program into threads is called parallelization, and allows computers to run programs very quickly.

In order to "parallelize" algorithms for building models of biological systems, Williams' research team created a way for information to pass back and forth between the cores on a single chip. Specifically, Williams explains, "we used threads to create 'locks' that control access to shared data. This allows all of the cores on the chip to work together to solve a unified problem."

The researchers tested the approach by running three models through chips that utilized one core, as well as chips that used the new technique to utilize two, four and eight cores. In all three models, the chip that utilized eight cores ran at least 7.5 times faster than the chip that utilized only one core.
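The lock-protected shared-data pattern described above can be sketched in a few lines. This is only a generic illustration, not the NC State team's software; it uses Python processes rather than threads so that the work genuinely runs on separate cores, and the "model-fitting" work is a stand-in computation with an assumed core count.

    import multiprocessing as mp

    def worker(lock, shared_total, chunk):
        """Process one slice of a unified problem, then update the shared
        result under the lock so the cores cannot corrupt each other's work."""
        partial = sum(x * x for x in chunk)   # stand-in for one slice of a model fit
        with lock:
            shared_total.value += partial

    if __name__ == "__main__":
        data = list(range(1_000_000))
        n_cores = 4                                   # assumed core count
        chunks = [data[i::n_cores] for i in range(n_cores)]
        lock = mp.Lock()
        total = mp.Value("d", 0.0)                    # shared, lock-protected accumulator
        procs = [mp.Process(target=worker, args=(lock, total, c)) for c in chunks]
        for p in procs:
            p.start()
        for p in procs:
            p.join()
        print(total.value)

For comparison, the speedup of at least 7.5 on eight cores reported above corresponds to a parallel efficiency of roughly 94 percent, which is why the researchers describe the technique as using multi-core chips efficiently.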

Read more...

A New Way To Make Lighter, Stronger Steel In A Flash

A Detroit entrepreneur surprised university engineers recently when he invented a heat treatment that makes steel 7 percent stronger than any steel on record – in less than 10 seconds. In fact, the steel, now trademarked as Flash Bainite, has tested stronger and more shock-absorbing than the most common titanium alloys used by industry.

Now the entrepreneur is working with researchers at Ohio State University to better understand the science behind the new treatment, called flash processing.

What they've discovered may hold the key to making cars and military vehicles lighter, stronger, and more fuel-efficient.

In the current issue of the journal Materials Science and Technology, the inventor and his Ohio State partners describe how rapidly heating and cooling steel sheets changes the microstructure inside the alloy to make it stronger and less brittle.

The basic process of heat-treating steel has changed little in the modern age, and engineer Suresh Babu is one of few researchers worldwide who still study how to tune the properties of steel in detail. He's an associate professor of materials science and engineering at Ohio State, and Director of the National Science Foundation (NSF) Center for Integrative Materials Joining for Energy Applications, headquartered at the university.

"Steel is what we would call a 'mature technology.' We'd like to think we know most everything about it," he said. "If someone invented a way to strengthen the strongest steels even a few percent, that would be a big deal. But 7 percent? That's huge."

Yet, when inventor Gary Cola initially approached him, Babu didn't know what to think.

"The process that Gary described – it shouldn't have worked," he said. "I didn't believe him. So he took my students and me to Detroit."

Cola showed them his proprietary lab setup at SFP Works, LLC., where rollers carried steel sheets through flames as hot as 1100 degrees Celsius and then into a cooling liquid bath.

Read more...

Unprecedented International Meeting Releases Preliminary Vision For Our Energy Future

A unique, international summit of scientists, engineers, entrepreneurs and future leaders from around the world has concluded with the release of the Equinox Summit: Energy 2030 Communiqué. The event's preliminary report includes visionary proposals for transformative action to reduce the electricity-related emissions that drive global warming.

The Communiqué identifies a group of technological approaches and implementation steps that have the potential over the coming decades to accelerate the transition of our energy systems toward electrification and, in the longer term, toward an energy future where our dependence on fossil fuels is greatly reduced.

"Given the right support, the six priority actions we have identified can catalyze change on a global scale, from the cities of the developed world, to the billions of people who live in towns and villages that lack adequate access to electricity to provide the central link to improvements in the quality of life," said summit advisor Professor Jatin Nathwani, Executive Director of the Waterloo Institute for Sustainable Energy at the University of Waterloo and Ontario Research Chair in Public Policy for Sustainable Energy.

Can we low-carbon power the planet in 20 years?

Equinox Summit: Energy 2030 participants came together to intensely explore, discuss and propose how science and technology can catalyze the urgent change required.

With representatives from countries including Canada, Brazil, China, Costa Rica, Indonesia, Nigeria, the USA, and more, the Equinox Summit embodied the realities, challenges, and hopes of the enormously diverse global community – from those living in the world's 21 mega-cities of more than 10-million inhabitants, to the one-third of humanity who survive without electricity.

An electricity roadmap for nations

The Equinox Communiqué is a brief snapshot of the ideas and visions developed by the Summit participants, who aimed to address the great complexity of transitioning to low-carbon electricity production. It provides a series of immediate, concrete opportunities for action by industry and governments, both locally and internationally. These ideas will be explored in more detail in a future document, the Equinox Blueprint: Energy 2030.

The pathways described in the Communiqué include: accelerating implementation of technologies to enable the integration of large-scale renewable sources of power, such as wind and solar, into existing electricity grids; new ways to develop low-carbon transportation; ways to build energy-smart cities; and means of providing sustainable electricity to those who currently live without it.

Read more...

Penn Engineers Envision 2-Dimensional Graphene Metamaterials and 1-Atom-Thick Optical Devices

Two University of Pennsylvania engineers have proposed the possibility of two-dimensional metamaterials. These one-atom-thick metamaterials could be achieved by controlling the conductivity of sheets of graphene, which is a single layer of carbon atoms. Professor Nader Engheta and graduate student Ashkan Vakil, both of the Department of Electrical and Systems Engineering in Penn's School of Engineering and Applied Science, published their theoretical research in the journal Science.

The study of metamaterials is an interdisciplinary field of science and engineering that has grown considerably in recent years. It is premised on the idea that materials can be designed so that their overall wave qualities rely not only upon the material they are made of but also on the pattern, shape and size of irregularities, known as "inclusions," or "meta-molecules" that are embedded within host media.

"By designing the properties of the inclusions, as well as their shapes and density, you achieve in the bulk property something that may be unusual and not readily available in nature," Engheta said.

These unusual properties generally have to do with manipulating electromagnetic (EM) or acoustic waves; in this case, it is EM waves in the infrared spectrum.

Changing the shape, speed and direction of these kinds of waves is a subfield of metamaterials known as "transformation optics" and may find applications in everything from telecommunications to imaging to signal processing.

Engheta and Vakil's research shows how transformation optics might now be achieved using graphene, a lattice of carbon a single atom thick.

Researchers, including many at Penn, have devoted considerable effort to developing new ways to manufacture and manipulate graphene, as its unprecedented conductivity would have many applications in the field of electronics. Engheta and Vakil's interest in graphene, however, is due to its capability to transport and guide EM waves in addition to electrical charges, and the fact that its conductivity can be easily altered.

Applying direct voltage to a sheet of graphene, by way of a ground plate running parallel to the sheet, changes how conductive the graphene is to EM waves. Varying the voltage or the distance between the ground plate and the graphene alters the conductivity, "just like tuning a knob," Engheta said.

Read more...

Will Psych Majors Make The Big Bucks?

A new crop of college graduates has just landed on the job market. Right now they're probably just hoping to get any job at all. However, for psychology majors, the salary outlook in both the short and long term is particularly poor, according to a new study which will be published in an upcoming issue of Perspectives on Psychological Science, a journal of the Association for Psychological Science. It's generally known that psychology majors don't make a ton of money when they're starting out; they're not like engineering students, many of whom go straight into a job that pays well for their technical skills. But some people have suggested that a psychology major may pay off later in the career, as the critical thinking skills and literacy of the liberal arts education become more valuable. D.W. Rajecki of Indiana University was skeptical. "Psychology educators say liberal arts skills should be valuable in the workplace. Employers say they value liberal arts skills in employees," he says. "I say, 'show me the money.'" So, with Victor M.H. Borden, he set out to examine several data sets on earnings for people in different fields.

As expected, they found that psychology majors' median starting salary of $35,300 is well below the average for college graduates. But they found that this is also true at midcareer, when psychology majors are still paid below the average. They fare particularly poorly when compared to graduates in other science fields, engineering, and health.

"Face it, wages are tied to specific occupations, and real-world data show that psychology alumni just don't work in areas that pay top dollar," says Rajecki. Advanced degrees don't help, either. "Even psychology professors obtain appointments at the lower end of that salary scale."

Rajecki doesn't think this means 18-year-olds should stop choosing psychology as a major. "Psychology is a remarkable academic discipline that seems to get more interesting every passing year. Why should any student avoid the field?" he says. And, of course, money isn't the only thing that matters. But when academic counselors are giving students advice, they should make it clear that psychology isn't necessarily the road to riches.

Read more...

Progress In Tissue Engineering To Repair Joint Damage In Osteoarthritis

Medical scientists now have "clear" evidence that the damaged cartilage tissue in osteoarthritis and other painful joint disorders can be encouraged to regrow and regenerate, and are developing tissue engineering technology that could help millions of patients with those disorders. That's the conclusion of a new analysis of almost 100 scientific studies on the topic, published in ACS's journal Molecular Pharmaceutics. Tong Cao, Wei Seong Toh and colleagues point out that damage to so-called articular cartilage — the smooth, white, rubbery tissue that covers and cushions the ends of bones in joints — is one of the most challenging problems in medicine. That's because the tissue lacks blood vessels and has little ability to repair itself and regrow. Wear-and-tear damage thus builds up over the years, resulting in conditions like osteoarthritis, which affects 27 million people in the United States alone. Osteoarthritis is a fast-growing public health problem because of the world's aging population and because of a sharp increase in obesity, which increases wear on joint cartilage. To assess progress toward medical use of tissue engineering to treat joint damage, the researchers scanned global research on the topic.

They found that scientists have developed many new tissue engineering methods, including implantation of so-called "scaffolds" made of biomaterials that mimic cartilage matrix in the body. The scaffolds could guide the transplanted cells, orchestrate the host cell response, and provide structures and microenvironment substances to help rebuild cartilage at the injury site. "In summary, there is promise in future research involving the development of multi-functional biomaterial delivery systems that affect cartilage tissue regeneration on multiple levels," the article states.

Read more...

Using Magnets To Help Prevent Heart Attacks

If a person's blood becomes too thick it can damage blood vessels and increase the risk of heart attacks. But a Temple University physicist has discovered that he can thin human blood by subjecting it to a magnetic field. Rongjia Tao, professor and chair of physics at Temple University, has pioneered the use of electric or magnetic fields to decrease the viscosity of oil in engines and pipelines. Now, he is using the same magnetic fields to thin human blood in the circulatory system.

Because red blood cells contain iron, Tao has been able to reduce a person's blood viscosity by 20-30 percent by subjecting it to a magnetic field of 1.3 Tesla (about the same as an MRI) for about one minute.

Tao and his collaborator tested numerous blood samples in a Temple lab and found that the magnetic field polarizes the red blood cells causing them to link together in short chains, streamlining the movement of the blood. Because these chains are larger than the single blood cells, they flow down the center, reducing the friction against the walls of the blood vessels. The combined effects reduce the viscosity of the blood, helping it to flow more freely.

When the magnetic field was taken away, the blood's original viscosity state slowly returned, but over a period of several hours.

"By selecting a suitable magnetic field strength and pulse duration, we will be able to control the size of the aggregated red-cell chains, hence to control the blood's viscosity," said Tao. "This method of magneto-rheology provides an effective way to control the blood viscosity within a selected range."

Currently, the only method for thinning blood is through drugs such as aspirin; however, these drugs often produce unwanted side effects. Tao said that the magnetic field method is not only safer, it is repeatable. The magnetic fields may be reapplied and the viscosity reduced again. He also added that the viscosity reduction does not affect the red blood cells' normal function.

Read more...

University Of Nevada, Reno, Engineers Simulate Large Quake On Curved Bridge

Six full-size pickup trucks took a wild ride on a 16-foot-high steel bridge when it shook violently in a series of never-before-conducted experiments to investigate the seismic behavior of a curved bridge with vehicles in place. The 145-foot-long, 162-ton steel and concrete bridge was built atop four large, 14-foot by 14-foot, hydraulic shake tables in the University of Nevada, Reno's Large-Scale Structures Earthquake Engineering Laboratory. "We took the bridge to its extreme, almost double what we planned at the outset," Ian Buckle, professor of civil engineering and director of the large-scale structures lab, said. "Preliminarily we see that in low amplitude earthquakes the weight of the vehicles actually helps the seismic effects on the structure, while at higher amplitudes the trucks considerably hinder the bridge's ability to withstand an earthquake."

The trucks bounced and swayed as the four-span bridge's concrete columns deflected more than 14 inches in each direction, the steel girders twisted and the floor of the lab shook from the energy applied to the bridge. The bridge, with 80 feet of curvature, filled the cavernous high-bay lab on the University of Nevada, Reno campus from end-to-end.

A 3-minute video featuring the largest motion applied to the bridge can be viewed by clicking on this link http://imedia.unr.edu/media_relations/VNR_shake_trucks_2b.mp4.

"Whether you saw the experiment in person or watch the video, remember that this is a 2/5 scale model, and the movement you see would be two and a half times greater on a full-scale bridge," Buckle, principal investigator of the research project, said. "It would be scary to be driving under those conditions."

"Currently, bridges are not designed for the occurrence of heavy traffic and a large earthquake at the same time," he said. "With increasing truck traffic and frequent congestion on city freeways, the likelihood of an earthquake occurring while a bridge is fully laden is now a possibility that should be considered in design. But there has been no agreement as to whether the presence of trucks helps or hurts the behavior of a bridge during an earthquake, and this experiment is intended to answer this question."

The complete answer will come after months of examining the many gigabytes of information gleaned from the 400 sensors placed on the bridge and trucks. The results of this work, titled "Seismic Effects on Multi-span Bridges with High Degrees of Horizontal Curvature," will be used to frame changes to current codes and lead to safer bridges during strong earthquakes.

The four, 50-ton capacity shake tables simulated more than twice the strength of the 1994 Northridge, Calif. earthquake, which resulted in 33 deaths, 8700 injured and $2 billion damage in southern California. The ground acceleration of that quake was one of the highest ever instrumentally recorded in an urban area in North America, measuring 1.7 g (acceleration) with strong ground motion felt as far away as Las Vegas, Nev., more than 270 miles away. Through computer programs, the recordings of the quake control the hydraulically driven shake tables to simulate the seismic event in the University's lab.

Read more...

UCLA Engineering Researchers Help Develop Complete Map Of Mouse Genetic Variation

For decades, laboratory mice have been widely used in research aimed at understanding which genes are involved in various illnesses. But the actual sequence variants carried by these mice were largely unknown. While researchers were able to determine that a variant affecting disease was in a certain region, they couldn't pinpoint the exact set of variants in that region. Now, in new research recently published in the journal Nature, an international team of investigators that included UCLA researchers reports that it has sequenced the complete genomes of 17 strains of mice, including the most frequently used laboratory strains. The massive genetic catalog will provide scientists with unparalleled data for studying both how genetic variation affects phenotype and how mice evolved.

Researchers from UCLA's Henry Samueli School of Engineering and Applied Science played a key role in the study, using UCLA-developed technology to help sequence a nearly complete map of mouse genetic variation. Cataloging the full set of variants is a first step in identifying the actual variants affecting disease.

"The actual number of variants discovered is important because this gives the complete picture of how much variation exists in these mouse strains," said Eleazar Eskin, an associate professor of computer science at UCLA Engineering who develops techniques for solving computational problems that arise in the study of the genetic basis of disease. "Our group here at UCLA, and others, had tried to estimate this number from the data that existed previously, which only collected a fraction of the total variation."

The new study was led by groups from the Wellcome Trust Sanger Institute and the Wellcome Trust Centre for Human Genetics in Oxford.

Previous technology used in genetic sequencing would, in some cases, make ambiguous predictions, and the locations of these ambiguities resulted in missing entries in the catalog of genetic variation in mice.

"Our role in the collaboration was to apply a technique that we developed a couple years ago for predicting variants where the sequencer failed to make a prediction," said Eskin, who holds a joint appointment in the department of human genetics at the David Geffen School of Medicine at UCLA. "Our technique, called imputation, uses the complete data to try to fill in some of these entries. The method, called EMINIM, was specifically designed for mouse data. Our contribution was to apply this technique to the data, which led to an increase in the number of variants identified."

With the full set of genetic information, researchers can now accurately predict the phylogeny -- similar in concept to the family tree -- of how the various mouse strains are related. The new study confirms that mice have a complex evolutionary history.

In addition, the study describes applications of the new genetic map that were impossible before this resource was developed. One application involves identifying "allele-specific expression," which describes the activity level of each of the two copies of a gene. Each individual carries two copies of every chromosome, one from the mother and one from the father, and therefore two copies of each gene.

Read more...
