Tuesday, May 31, 2011

Penn Researchers Help Nanoscale Engineers Choose Self-assembling Proteins

Engineering structures on the smallest possible scales — using molecules and individual atoms as building blocks — is both physically and conceptually challenging. An interdisciplinary team of researchers at the University of Pennsylvania has now developed a method of computationally selecting the best of these blocks, drawing inspiration from the similar behavior of proteins in making biological structures. The team was led by postdoctoral fellow Gevorg Grigoryan and professor William DeGrado of the Department of Biochemistry and Biophysics in Penn's Perelman School of Medicine, as well as graduate student Yong Ho Kim of the Department of Chemistry in Penn's School of Arts and Sciences. Their colleagues included members of the Department of Physics and Astronomy in SAS.

Their research was published in the journal Science today.

The team set out to design proteins that could wrap around single-walled carbon nanotubes. Consisting of a cylindrical pattern of carbon atoms tens of thousands of times thinner than a human hair, nanotubes are enticing to nanoengineers as they are extraordinarily strong and could be useful as a platform for other nanostructures.

"We wanted to achieve a specific geometric pattern of the atoms that these proteins are composed of on the surface of the nanotube," Grigoryan said. "If you know the underlying atomic lattice, it means that you know how to further build around it, how to attach things to it. It's like scaffolding for future building."

The hurdle in making such scaffolds isn't a lack of information, but a surfeit of it: researchers have compiled databases that list hundreds of thousands of actual and potential protein structures in atomic detail. Picking the building materials for a particular structure from this vast array and ensuring that they self-assemble into the desired shape was beyond the abilities of even powerful computers, much less humans.

"There's just an enormous space of structural possibilities to weed through trying to figure out which are feasible," Grigoryan said. "To have a process that can do that quickly, that can look at a structure and say 'that's not reasonable, that can't be built out of common units,' would solve that problem."

The researchers' algorithm works in three steps, which, given the parameters of the desired scaffolding, successively eliminate proteins that will not produce the right shape. The elimination criteria were based on traits like symmetry, periodicity of binding sites and similarity to protein "motifs" found in nature.

After the algorithm separates the wheat from the chaff, the result is a list of thousands of candidate proteins. While that is still a daunting number, the algorithm makes the protein selection process merely difficult, rather than impossible.
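To make the idea concrete, the staged elimination can be sketched in code. The following Python is a minimal illustration, not the team's actual software; the three predicate functions, the dictionary fields they inspect and the spec parameters are hypothetical stand-ins for the symmetry, periodicity and motif-similarity criteria described above.

def has_required_symmetry(protein, spec):
    # Hypothetical test: does the protein's symmetry match the scaffold spec?
    return protein["symmetry"] == spec["symmetry"]

def has_periodic_binding_sites(protein, spec):
    # Hypothetical test: do binding sites repeat with the required period?
    return protein["binding_period"] == spec["binding_period"]

def matches_natural_motif(protein, spec):
    # Hypothetical test: is the structure similar enough to a known natural motif?
    return protein["motif_score"] >= spec["min_motif_score"]

def select_candidates(database, spec):
    # Successively eliminate proteins that fail each criterion.
    candidates = database
    for test in (has_required_symmetry, has_periodic_binding_sites, matches_natural_motif):
        candidates = [p for p in candidates if test(p, spec)]
    return candidates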

The research team tested their algorithm by designing a protein that would not only stably wrap around a nanotube in a helix but also provide a regular pattern on its exterior to which gold particles could be attached.

"You could use this to build a gold nanowire, for instance, or modulate the optical properties of the underlying tube in desired ways" Grigoryan said.

Next steps will include applying this algorithm to design proteins that can attach to graphene, which is essentially an unrolled nanotube. Being able to make scaffolds out of customizable arrays of proteins in a variety of shapes could lead to advances in everything from miniaturization of circuitry to drug delivery.

Engineering these materials in the lab requires a tremendous amount of precision and computational power, but such efforts are essentially mimicking a phenomenon found in even the simplest forms of life.

"The kind of packing that certain viruses have in their viral envelope is similar to what we have here in that they self-assemble. They have protein units that, on their own, form their complicated structures with features that are far beyond the size of any single protein," Grigoryan said. "Each protein doesn't know what the final structure is going to be, but it still helps form it. We were inspired by that."

Read more...

Code Green: Energy-efficient Programming To Curb Computers' Power Use

Soaring energy consumption by ever more powerful computers, data centers and mobile devices has many experts looking for ways to reduce their energy use. Most projects so far focus on more efficient cooling systems or energy-saving power modes. A University of Washington project sees a role for programmers in reducing the energy appetite of the ones and zeroes in the code itself. Researchers have created a system, called EnerJ, that reduces energy consumption in simulations by up to 50 percent and has the potential to cut energy use by as much as 90 percent. They will present the research next week in San Jose at the annual Programming Language Design and Implementation conference.

"We all know that energy consumption is a big problem," said author Luis Ceze, a UW assistant professor of computer science and engineering. "With our system, mobile phone users would notice either a smaller phone, or a longer battery life, or both. Computing centers would notice a lower energy bill."

The basic idea is to take advantage of processes that can survive tiny errors that happen when, say, voltage is decreased or correctness checks are relaxed. Some examples of possible applications are streaming audio and video, games and real-time image recognition for augmented-reality applications on mobile devices.

"Image recognition already needs to be tolerant of little problems, like a speck of dust on the screen," said co-author Adrian Sampson, a UW doctoral student in computer science and engineering. "If we introduce a few more dots on the image because of errors, the algorithm should still work correctly, and we can save energy."

The UW system is a general framework that creates two interlocking pieces of code. One is the precise part – for instance, the encryption on your bank account's password. The other portion is for all the processes that could survive occasional slipups.

The software creates an impenetrable barrier between the two pieces.

"We make it impossible to leak data from the approximate part into the precise part," Sampson said. "You're completely guaranteed that can't happen."

While computers' energy use is frustrating and expensive, there is also a more fundamental issue at stake. Some experts believe we are approaching a limit on the number of transistors that can run on a single microchip. The so-called "dark silicon problem" says that as we boost computer speeds by cramming more transistors onto each chip, there may no longer be any way to supply enough power to the chip to run all the transistors.

The UW team's approach would work like a dimmer switch, letting some transistors run at a lower voltage. Approximate tasks could run on the dimmer regions of the chip.
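The energy argument behind the dimmer switch is simple: the dynamic switching energy of digital logic scales roughly with the square of the supply voltage. A back-of-the-envelope sketch, with illustrative voltages that are not figures from the research:

def relative_energy(v_low, v_nominal=1.0):
    # Dynamic energy per operation is roughly C * V**2, so the ratio is (V_low / V_nom)**2.
    return (v_low / v_nominal) ** 2

print(f"At 0.8x nominal voltage, each operation uses ~{relative_energy(0.8):.0%} of the energy")
# -> ~64%, so roughly a third of the energy is saved on approximate work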

"When I started thinking about this, it became more and more obvious that this could be applied, at least a little bit, to almost everything," Sampson said. "It seemed like I was always finding new places where it could be applied, at least in a limited way."

Read more...

Stamping Out Low Cost Nanodevices

A simple technique for stamping patterns invisible to the human eye onto a special class of nanomaterials provides a new, cost-effective way to produce novel devices in areas ranging from drug delivery to solar cells. The technique was developed by Vanderbilt University engineers and described in the cover article of the May issue of the journal Nano Letters.

The new method works with materials that are riddled with tiny voids that give them unique optical, electrical, chemical and mechanical properties. Imagine a stiff, sponge-like material filled with holes that are too small to see without a special microscope.

For a number of years, scientists have been investigating the use of these materials – called porous nanomaterials – for a wide range of applications including drug delivery, chemical and biological sensors, solar cells and battery electrodes. There are nanoporous forms of gold, silicon, alumina, and titanium oxide, among others.

Simple stamping

A major obstacle to using the materials has been the complexity and expense of the processing required to make them into devices.

Now, Associate Professor of Electrical Engineering Sharon M. Weiss and her colleagues have developed a rapid, low-cost imprinting process that can stamp out a variety of nanodevices from these intriguing materials.

"It's amazing how easy it is. We made our first imprint using a regular tabletop vise," Weiss said. "And the resolution is surprisingly good."

The traditional strategies used for making devices out of nanoporous materials are based on the process used to make computer chips. This must be done in a special clean room and involves painting the surface with a special material called a resist, exposing it to ultraviolet light or scanning the surface with an electron beam to create the desired pattern and then applying a series of chemical treatments to either engrave the surface or lay down new material. The more complicated the pattern, the longer it takes to make.

About two years ago, Weiss got the idea of creating pre-mastered stamps using the complex process and then using the stamps to create the devices. Weiss calls the new approach direct imprinting of porous substrates (DIPS). DIPS can create a device in less than a minute, regardless of its complexity. So far, her group reports that it has used master stamps more than 20 times without any signs of deterioration.

Read more...

Nanoscale Waveguide For Future Photonics

The creation of a new quasiparticle called the "hybrid plasmon polariton" may throw open the doors to integrated photonic circuits and optical computing for the 21st century. Researchers with the U.S. Department of Energy (DOE)'s Lawrence Berkeley National Laboratory (Berkeley Lab) have demonstrated the first true nanoscale waveguides for next generation on-chip optical communication systems. "We have directly demonstrated the nanoscale waveguiding of light at visible and near infrared frequencies in a metal-insulator-semiconductor device featuring low loss and broadband operation," says Xiang Zhang, the leader of this research. "The novel mode design of our nanoscale waveguide holds great potential for nanoscale photonic applications, such as intra-chip optical communication, signal modulation, nanoscale lasers and bio-medical sensing."

Zhang, a principal investigator with Berkeley Lab's Materials Sciences Division and director of the University of California at Berkeley's Nano-scale Science and Engineering Center (SINAM), is the corresponding author of a paper describing this work, published in Nature Communications under the title "Experimental Demonstration of Low-Loss Optical Waveguiding at Deep Sub-wavelength Scales." Co-authoring the paper with Zhang were Volker Sorger, Ziliang Ye, Rupert Oulton, Yuan Wang, Guy Bartal and Xiaobo Yin.

In this paper, Zhang and his co-authors describe the use of the hybrid plasmon polariton, a quasi-particle they conceptualized and created, in a nanoscale waveguide system that is capable of shepherding light waves along a metal-dielectric nanostructure interface over sufficient distances for the routing of optical communication signals in photonic devices. The key is the insertion of a thin low-dielectric layer between the metal and a semiconductor strip.

"We reveal mode sizes down to 50-by-60 square nanometers using Near-field scanning optical microscopy (NSOM) at optical wavelengths," says Volker Sorger a graduate student in Zhang's research group and one of the two lead authors on the Nature Communications paper. "The propagation lengths were 10 times the vacuum wavelength of visible light and 20 times that of near infrared."

The high-technology world is eagerly anticipating the replacement of today's electronic circuits in microprocessors and other devices with circuits based on the transmission of light and other forms of electromagnetic waves. Photonic technology, or "photonics," promises to be superfast and ultrasensitive in comparison to electronic technology.

"To meet the ever-growing demand for higher data bandwidth and lower power consumption, we need to reduce the energy required to create, transmit and detect each bit of information," says Sorger. "This requires reducing physical photonic component sizes down beyond the diffraction limit of light while still providing integrated functionality."

Read more...

Nokia Cuts Sales And Profit-margin Forecasts

Lower selling prices and gross margins together with general market trends were behind the downgrade for its Devices and Services division, it said.

It added that it could no longer provide a full-year forecast.

Last month, Nokia said it would cut 7,000 jobs worldwide as part of a strategy to focus on smartphones.

'Transition period'

The company's latest announcement said sales would be "substantially" below the 6.1bn euros ($8.8bn; £5.3bn) to 6.6bn euros previously forecast for the second quarter of 2011.

Operating margin would also be "substantially" below the 6% to 9% range forecast.

As a result, shares in the group fell by almost 12% in Frankfurt.

"Strategy transitions are difficult," said Nokia chief executive Stephen Elop.

"We recognise the need to deliver great mobile products, and therefore we must accelerate the pace of our transition."

More details of Nokia's performance in the second quarter of the year will be published on 21 July.

Job cuts

As part of the Finnish firm's reorganisation, Nokia is moving from Symbian to Microsoft's smartphone technology.

Mr Elop said the company had "increased confidence" that its new Windows-based smartphones would be launched in the final three months of this year.

The firm is looking to make up the ground it has lost to competitors such as Apple's iPhone and phones using Google's Android operating system.

Of the 7,000 job cuts announced last month, 3,000 are being transferred to outsourcing and consultancy group Accenture, which will take over Nokia's Symbian software.

Read more...

Sprint Moves To Block AT&T, T-Mobile Deal

Sprint has been the most vocal opponent of the $39bn (£24bn) deal that would create the largest US wireless network.

It argued the deal had "no public interest benefit", in a filing at the Federal Communications Commission.

Forcing AT&T to divest assets would not be enough to prevent "serious anti-competitive harms", it added.

"This proposed takeover puts our mobile broadband future at a crossroads," said Sprint's Vonya McCann in a statement.

"We can choose the open, competitive road best travelled, and protect American consumers, innovation and our economy, or we can choose the dead end that merely protects only AT&T and leads the rest of us back down the dirt road to Ma Bell."

Sprint, the third biggest mobile operator in the US, also said the deal would lead to higher prices for consumers.

AT&T agreed to buy T-Mobile from Deutsche Telekom in March, but the deal still needs approval from regulators.

If approved, it would give AT&T about 43% of the US mobile market, taking it ahead of industry leader Verizon Wireless.

Read more...

China Factories Feel Credit Squeeze As Expansion Slows

China's official purchasing managers' index (PMI) fell to a nine-month low, the latest figures showed.

The PMI, an indicator of conditions in the manufacturing sector, fell to 52 in May from 52.9 in the previous month.

Manufacturing is a key contributor to growth in China's economy.

Even though the figure remained above the threshold level of 50, indicating expansion in the sector, the drop from the previous month shows that expansion is slowing down.

"The continued fall in PMI in May, after the drop in April, shows the rising possibility of a slowdown in economic growth," analyst Zhang Liqun said in the report.

Credit squeeze

China has witnessed robust growth in the past few years, becoming the world's second-largest economy in the process.

However, concerns of overheating and rising consumer prices have seen the government implement measures to slow down growth.

One of those measures has been to tighten the supply of credit.

Analysts say the move is starting to affect growth in the sector.

"There has been a strain on the ability of manufacturers to raise capital," said Michael Pettis of Peking University.

"The majority of the funds that are available in the market are being channelled towards infrastructure development or real estate development in the country," he added

Read more...

Toyota Targets 90% Output In June As Parts Supplies Ease

The company's spokesperson Paul Nolasco told the BBC that output at its domestic factories is expected to recover to 90% of pre-quake levels as early as this month.

Last week, Toyota reported a 74.5% plunge in production at its Japanese factories in April.

Toyota is the world's biggest car manufacturer.

Production at Japan's car manufacturers has been hit hard by disruptions in the country's supply chain in the wake of the 11 March earthquake and tsunami.

However, the company said the situation had been gradually improving.

"We have had a constant recovery in our supply chain and that is starting to have a positive effect on our production," Mr Nolasco said.

'Extremely committed'

Toyota said the speed at which the company's production is recovering was a result of the combined effort of the firm and its suppliers.

"The key behind all of this has been the extremely committed effort by our suppliers to get back on track," Mr Nolasco said.

"After the quake we were facing a shortage of almost 500 parts, the numbers have since decreased to 30 parts or may be even less right now," he added.

Toyota said that it had also sent workers from its own factories to help its parts suppliers get back to normal production.

The company said that while the recovery had been fast, there was still work to be done.

"Ninety percent is not the end game, there is still room for improvement," Mr Nolasco said.

"We still have to reach full capacity and also have to take care of our overseas production," he added.

Read more...

House Republicans Reject US Debt Limit Bill

The chamber voted 318-97 against the bill, rejecting a call by US President Barack Obama to raise the debt limit without conditions.

Republicans have called for spending cuts in return for a debt increase.

The US treasury department has warned the US risks default if Congress does not authorise more borrowing by August.

Some Democrats who supported Mr Obama's position nevertheless voted against the bill after Democratic leaders criticised it as a Republican political ploy, noting that Republican leaders had brought the bill to a vote and then directed their caucus to vote against it.

Republican leaders have not indicated they will ultimately refuse to grant a debt limit increase, but say the US must bring government spending in line with tax revenue.

The US national debt is $14.3 trillion (£8.7 trillion), and the annual budget deficit is roughly $1.5 trillion.

Leaders of both parties agree on the need to trim the budget, but Republicans have refused to allow tax increases, while Democrats have vowed to protect costly social programmes.

Read more...

Australian Economy Reports Biggest Fall In 20 Years

Its economy contracted by 1.2% in the first three months of the year compared with the previous quarter, the latest government figures showed.

The government said flooding and cyclones in the resource-rich states of Queensland and Western Australia had a significant impact on growth.

Australia's economy is heavily reliant on exporting its natural resources.

"The economy has hit a temporary pothole courtesy of the natural disasters this year," said Besa Deda of St George Bank

'Honeymoon period'

Australia has not only had to deal with the twin natural disasters. Other factors have also slowed down its economy.

The country's growth has been powered by a boom in its resources sector.

As economies like China and India grew, the demand for Australia's resources witnessed a massive surge.

However, analysts say the situation is changing.

"We have been in a honeymoon period for a long time," said Jonathan Barratt of Commodity Broking.

"The time has come for realignment. As growth in China and India slows down, the pace of growth in Australia will also be affected," he added.

Relieved market

While the dip in growth was the biggest in two decades, analysts said that the numbers were better than the markets had expected.

"The market was very bearish in the last 48 hours," said Mr Barratt.

"The numbers are not as bad as people were fearing they would be," he added.

The effect of that was evident in the currency markets. The Australian dollar rose by 0.6% against the US dollar after the data was released.

It was trading close to 1.0723 against the US dollar in Asian trade.

Read more...

The New Science Of Glass

"What is this magic material, there but not seen if you are looking through it?" Frank Lloyd Wright asked in an essay. Wright answered his own question – one that has fascinated architects and engineers before and after his work – by calling glass a "supermaterial" and frequently incorporating it into his designs. With prescience, Wright believed that future buildings would be constructed where "Walls themselves because of glass will become windows and windows as we know them as holes in walls will be seen no more." Would Wright be surprised to see how his vision of glass architecture has come true in 2009?

The prevalence of towering glassy skyscrapers would probably not shock him – but the changing nature of glass itself might.

"Glass has become less about abetting observation; it is something to be observed in itself," explains Michael Bell, a professor of architecture at Columbia University, in his recent book and DVD "Engineered Transparency – The Technical, Visual, and Spatial Effects of Glass."

In a phone interview, Mr. Bell says that glass is no longer a single material, "but the name of the family of building materials 'rewritten' by new technologies."

Thanks to breakthroughs and continued stoking by architects, engineers have created glasses capable of things Wright would have never dared to attempt. In doing so, Bell says, "glass may have become something other than glass."

The traditional glass recipe – mix sand, soda powder, and quicklime, then heat until transparent – delivers unpredictable strength and fragility.

But in the 20th century, Wright and other architectural pioneers began designing buildings where interior daylight and landscape views were highly valued. To accommodate their imagination, architects urged engineers to craft new glass formulations that could perform predictably, both structurally and artistically.

The age of skyscrapers became possible through the invention of strengthened glass able to bear ground and weather changes, and unforeseeable human activities.

To understand the demands placed upon glass in high-rise buildings, think of a car windshield: It consists of a sandwich of two layers of glass with a layer of a clear synthetic resin between. The resin layer seals the glass sheets together and, in the event of a crash, prevents them from breaking into tiny shards. These "interlayers" have since contributed to novel "hybrid glass," possessing previously unimaginable strength under enormous loads.

In the 21st century, innovative glass technology has been driven by concerns with climate change, energy conservation, and urban sustainability.

For example, the heating and cooling needs of American buildings amount to an enormous collective energy bill – and estimates say one-third of a building's energy expenditure comes from heat seeping through traditional windows. Conversely, window glass can intensify the sun's rays during summer, forcing millions to reach for the air conditioner.

However, by using argon or krypton gas between glass panes, heat transfer through windows can be better managed.
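A rough sense of what the glazing choice is worth can be had from the standard heat-loss relation Q = U x A x dT. The U-values below are typical published ballpark figures for each glazing type, assumed for illustration; none of them comes from the article.

AREA_M2 = 2.0     # a typical window area
DELTA_T_K = 20.0  # indoor-outdoor temperature difference

u_values = {                      # W/(m^2*K), approximate industry figures
    "single pane": 5.8,
    "double pane, air fill": 2.8,
    "double pane, argon fill": 2.6,
    "low-e double pane, argon": 1.3,
}

for glazing, u in u_values.items():
    print(f"{glazing}: ~{u * AREA_M2 * DELTA_T_K:.0f} W lost")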

Read more...

World's Largest Online Engineering Library Launched

The Institution of Civil Engineers (ICE) has today launched the most extensive online library of engineering information in the world.

The new platform, ICE Virtual Library, hosts the Institution’s journal and book content back to 1836.

The online resource will provide a sophisticated, fully searchable platform for both practising engineers and researchers to access the content published by the ICE over three centuries.

Speaking at the launch, ICE head of publishing Leon Heward-Mills said, “With increasing pressure on engineers’ time, ICE Virtual Library forms an indispensable and trusted resource, providing authoritative best practice information at the click of the mouse.

“Our new range of ebooks will allow engineers new ways of accessing our published content, and a range of new purchasing options – such as the ability to buy individual book chapters – show that we’re continuing to respond to feedback from the civil engineering community.”

The ICE Virtual Library will host the prestigious ICE Proceedings suite of journals, as well as Géotechnique, the CESMM3 price database and the new ICE Manuals Series, including ICE Manual of Construction Materials, published this month.

Read more...

New Contracts For Stewart Milne Timber Systems

Stewart Milne Timber Systems has announced that it has secured £5m in new contracts for projects throughout the UK, in both the public and private sectors, ranging from hotel extensions and new builds to a 300-bed student accommodation development for an English university.

In addition, the company will supply timber systems for five separate private housing contracts and affordable housing developments in Inverness and Lewisham.

Each project will be delivered utilising either one of Stewart Milne Timber Systems' low-carbon closed-panel build systems or the company's award-winning and innovative Sigma II build system, which has been designed as a low-carbon solution with energy efficiencies built in.

Gary Yeoman, sales and marketing director of Stewart Milne Timber Systems, commented: “We are delighted to have won such a diverse range of contracts in what is an exceptionally competitive market. As the UK’s leading timber systems provider, we work closely with our clients and bring our extensive sustainability credentials to bear in contributing towards sustainable targets in both domestic and commercial developments”.

Read more...

EU Probe Into Samsung And Hitachi Deals

The first is US-based Seagate Technology's bid for Samsung Electronics' loss-making hard disk drive unit.

The other is Western Digital's purchase of Hitachi's storage business.

The commission is concerned the deals will reduce the number of rivals in the industry.

"Hard drives are the backbone of the digital economy," said Competition Commissioner Joaquin Almunia in a statement.

"The sector has already experienced significant consolidation, and the proposed acquisitions will further reduce competition."

Seagate Technology has bid to buy South Korean Samsung's hard-disk drive business for $1.4bn.

Western Digital, also based in the US, has offered $4.3bn for Japanese company Hitachi's hard drive unit.

These acquisitions would reduce the number of large manufacturers in the sector to three from five.

Western Digital would then have a 50% market share, while Seagate would have 40%.

The only other player would be Toshiba with 10% of the market.

The regulators say they will make a decision by 10 October on whether the deals will be cleared or blocked.

Read more...

Hagley Road Hotel And House Damaged In Fire

About 20 people had to be moved out of the Knowle Lodge Hotel on Hagley Road in Birmingham. One man was treated for the effects of breathing in smoke.

It is believed Saturday's fire started in the roof of the vacant house and spread to the hotel. Both properties will be demolished, fire crews said.

Part of the Hagley Road was temporarily closed. The fire was brought under control at about 0800 BST on Sunday.

The cause is not yet known.

About six fire engines were sent to the scene at about 1730 BST on Saturday.

Read more...

Japan Pensioners Volunteer To Tackle Nuclear Crisis

The Skilled Veterans Corps, as they call themselves, is made up of retired engineers and other professionals, all over the age of 60.

They say they should be facing the dangers of radiation, not the young.

It was while watching the television news that Yasuteru Yamada decided it was time for his generation to stand up.

No longer could he be just an observer of the struggle to stabilise the Fukushima nuclear plant.

The retired engineer is reporting back for duty at the age of 72, and he is organising a team of pensioners to go with him.

For weeks now Mr Yamada has been getting back in touch with old friends, sending out e-mails and even messages on Twitter.

Volunteering to take the place of younger workers at the power station is not brave, Mr Yamada says, but logical.

"I am 72 and on average I probably have 13 to 15 years left to live," he says.

"Even if I were exposed to radiation, cancer could take 20 or 30 years or longer to develop. Therefore us older ones have less chance of getting cancer."

Mr Yamada is lobbying the government hard for his volunteers to be allowed into the power station. The government has expressed gratitude for the offer but is cautious.

Certainly a couple of MPs are supporting Mr Yamada.

"At this moment I can say that I am talking with many key government and Tepco people. But I am sorry I can't say any more at this moment. It is on the way but it is a very, very sensitive issue politically," he said.

Read more...

Third Annual “Roboshock” Event Planned At OSU

Numerous teams of Oregon high school students will participate in the third annual “Roboshock” event on Saturday, May 21, at Oregon State University, in competition with robots they have built as part of a national initiative to encourage interest in science and engineering.

The event will be in Gill Coliseum on the OSU campus, beginning at 11 a.m. and concluding with an awards ceremony at 5 p.m. It is free and open to the public, and will include raffle prizes, a tour of OSU’s engineering facilities and field competitions.

The program is part of a national initiative called FIRST, or For Inspiration and Recognition of Science and Technology. It includes the FIRST Robotics Competition “LogoMotion” game and the FIRST Tech Challenge “Get Over It!” game.

“OSU is very interested in students with the skills and abilities learned through involvement in the FIRST program,” said Jonathan Hurst, an assistant professor of mechanical engineering at OSU and expert in robotics.

“Classes are important, but so are hands-on experience and self-initiated, career-related activities,” he said. “FIRST students come in and know how to make things happen, without waiting for an assignment. They have leadership, fundraising, organization and business skills.”

Roboshock is organized by the OSU Robotics Club. More information is available online at http://bit.ly/cLSfz8

OSU recognizes the value of these programs, financially helps support them, and actively recruits and offers scholarships to students who have participated in them, organizers say.

Read more...

New Resource Developed To Encourage Undergraduate Research Experiences

College educators around the nation who are discovering the unique value of research experiences for undergraduate students now have a new tool available to them – a “program in a box” detailing exactly how such experiences can be created, used and implemented.

This resource, which is free, will be introduced tomorrow in New York City by the National Center for Women and Information Technology, as part of their annual summit conference. It will soon be available online at http://bit.ly/mAcvYe, and was supported in part by the National Science Foundation.

“Hands-on research experiences for undergraduates help students to see the bigger picture and aspire to greater things,” said Margaret Burnett, a professor of computer science at Oregon State University and co-leader of the team that created this new educational resource.

“This will help faculty members better understand how research experiences for undergraduates can help everyone, both the student’s education and the faculty member’s own research efforts,” Burnett said.

“Some faculty may not have offered these because they had not thought of the research experiences from both sides,” she said. “Others would like to do this but don’t know how to start. This can help faculty with any of these perspectives see why to offer them and how to succeed at them.”

The “REU in a Box” tool is the latest resource from this national group. It draws examples and illustrations from computer science and information technology, Burnett said, but conceptually could be applied to research in any scientific field.

Original scientific research and scholarship has long been a part of most master’s degree and doctoral programs, but many universities are increasingly getting younger students involved as well, Burnett said. Experts say that students get more involved and interested, learn how to work in teams, are encouraged to continue their education at a graduate level, and can increase their chances for career success and employment.

For Kyle Rector, now a doctoral student at the University of Washington and a former OSU student who collaborated on research with Burnett, it was all of those things. Before earning her bachelor's degree, she had already co-authored several studies, won a Google Scholarship and a graduate fellowship from the National Science Foundation, and earned national recognition in her field.

“My undergraduate research experience is the reason why I am in computer science today,” Rector said.

“You can pursue your genuine interests and investigate questions without a prescribed answer,” she said. “I’ve been able to use my love of computer science and apply it to areas of life that I really care about. And you also learn time management, a work ethic and team-working skills.”

OSU has many initiatives to help undergraduate students get involved in research projects, Burnett said.

“At some institutions, faculty are discouraged from doing this, especially before they have tenure,” Burnett said. “Some feel, I believe wrongly, that it’s more trouble than it’s worth. But these activities can help both the student and faculty member. And they are especially useful to retain and inspire minority and under-represented students, help them succeed in their education.”

Read more...

International Conference To Address Manufacturing Research, Economic Progress

Many of the leading manufacturing researchers in the world and executives from private industry will be in Oregon from June 13-17 for a week-long conference at Oregon State University that will explore the role of manufacturing in innovation and economic development.

The professional conference will be the first of its type, combining in one venue members of the American Society of Mechanical Engineers, the Society of Manufacturing Engineers, and the Japan Society of Mechanical Engineers.

“This event has been designed with help from industry, for industry,” said Brian Paul, an OSU professor of industrial and manufacturing engineering, and chair of the conference.

“It’s not often that Pacific Northwest manufacturers have an opportunity to network with international experts and learn more about new material processing techniques and manufacturing practices,” he said. “The conference also includes outreach to high schools and various student competitions that will provide opportunities to meet young talent from across the globe.”

Among the speakers are:

Jake Nichol, president and chief executive officer for Leatherman Tool Group
Chandra Brown, vice president of Oregon Iron Works
Skip Rung, president and executive director of the Oregon Nanoscience and Microtechnologies Institute
Charalabos Doumanidis, director of the nanomanufacturing program for the National Science Foundation
Bryan Dods, a manufacturing executive with General Electric Energy

Panel discussions will explore topics such as sustainable manufacturing, “clean” transportation manufacturing and “lean” engineering, as well as new concepts in manufacturing and global distribution. Speakers will discuss new alloys, composites and foams; emerging micro- and nanomanufacturing technologies; manufacturing innovations in clean energy; and “first-world” manufacturing strategies, among many other subjects.

Read more...

Dams Power Down In The Largest US Dam Removal

The Elwha River on Washington's Olympic Peninsula once teemed with legendary salmon runs before two towering concrete dams built nearly a century ago cut off fish access to upstream habitat, diminished their runs and altered the ecosystem.

On June 1, nearly two decades after Congress called for full restoration of the river and its fish runs, federal workers will turn off the generators at the 1913 dam powerhouse and set in motion the largest dam removal project in U.S. history.

Contractors will begin dismantling the dams this fall, a $324.7 million project that will take about three years and eventually will allow the 45-mile Elwha River to run free as it courses from the Olympic Mountains through old-growth forests into the Strait of Juan de Fuca.

"We're going to let this river be wild again," said Amy Kober, a spokeswoman for the advocacy group American Rivers. "The generators may be powering down, but the river is about to power up."

The 105-foot Elwha Dam also came on line in 1913, followed 14 years later by the 210-foot Glines Canyon Dam eight miles upstream. For years, they provided electricity to a local pulp and paper mill and the growing city of Port Angeles, Wash., about 80 miles west of Seattle. Electricity from the dams — enough to power about 1,700 homes — currently feeds the regional power grid.

A Washington state law required fish passage facilities, but none was built. So all five native species of Pacific salmon and other anadromous fish that mature in the ocean and return to rivers to spawn were confined to the lower five miles of the river. A hatchery was built but lasted only until 1922.

The fish are particularly important to members of the Lower Elwha Klallam Tribe, whose ancestors have occupied the Elwha Valley for generations and whose members recall stories of 100-pound Chinook salmon so plentiful you could walk across the river on their backs.

"We have never been happy that the salmon runs in the river were cut off," said Robert Elofson, Elwha River restoration director for the tribe, which along with environmental groups fought in the 1980s to tear down the dams. The tribe's land now includes about 1,000 acres on and near the Elwha River. "It's hard to have any pride when your main river of your tribe has been blocked and the salmon runs almost totally destroyed."

Read more...

Monday, May 30, 2011

'Insourcing' Effort Still Under Fire Despite Pentagon's Gradual Retreat From Plan

The Pentagon in recent months has gradually backed off its controversial effort to move more contracting work into the government — a policy known as “insourcing” — but the effort is still raising the blood pressure of some in industry and at area think tanks.

As part of a larger Obama administration effort, the Pentagon in 2009 vowed to hire thousands of civil servants to replace contractors and particularly to beef up its acquisition workforce. The Defense Department called for insourcing in cases in which a job was “inherently governmental,” or a function that should be done by federal employees.

Part of the movement was based on cost savings the government hoped to achieve by hiring employees and reducing the role of contractors.

But in recent months, the Pentagon seems to have retreated from insourcing — though it maintains that it is continuing to implement the policy where appropriate. A memo issued this spring made moving work in-house subject to new fiscal constraints, and Defense Secretary Robert M. Gates reported publicly that the policy had not produced the cost savings he expected.

For the contracting industry, those developments came as a relief. The industry had criticized the policy’s implementation as arbitrary and based on reaching quotas rather than on thorough analyses. Still, conversations about the flaws of insourcing have persisted in the contracting world and on Capitol Hill. This month, think tanks added their voices to the mix.

The Center for Strategic and International Studies released what it called a “DOD Workforce Cost Realism Assessment.” David Berteau, senior adviser and director of the center’s defense-industrial initiatives group, joked that the organization had tried to give it an unremarkable title but that it’s already known as “the insourcing study.”

The report argued that insourcing does not necessarily save money. In fact, the document says, “the absence of potential competition leads inexorably to cost growth.”

More damning, the report says the Pentagon is making inaccurate cost comparisons by failing to take into account the total cost of a government employee.

Read more...

Quotations Of The Day

"We're not going to stop till Joplin's back on its feet." — President Barack Obama, at a memorial service in Joplin, Mo., a week after the city was hit by a devastating tornado.

"While we were playing volleyball today, no doubt some soldier gave the ultimate sacrifice." — Col. Thomas Magness, at a remembrance ceremony at the U.S. Army Corps of Engineers' headquarters in Kabul, Afghanistan, on Memorial Day weekend. Earlier in the day, those working there had enjoyed a rare day off.

"We're looking forward to getting home, and we're going to leave these guys to some peace and quiet and not disturb their space station any more." — Mark Kelly, commander of the Endeavour, as the shuttle prepared to pull away from the International Space Station.

Read more...

New Fuel Cell Reforming Technologies for the U.S. Military

Fuel cell manufacturers and OEMs continue to benefit from an increased military emphasis on energy security and logistical efficiency associated with the complex and challenging operational conditions being encountered in remote wartime environments such as Afghanistan. Reducing the strategic and tactical vulnerabilities associated with powering military equipment and remote installations has emerged as a leading priority for U.S. military leadership.

Fuel cells complement this mission in many ways, offering significantly longer runtimes and significant savings in weight and volume compared with conventional military power sources such as BA-5590 batteries and diesel generators. Fuel cell generators also offer tactical advantages by achieving significant reductions in the noise, heat and emissions associated with conventional diesel generators.

At the same time, logistical concerns regarding fuel availability represent one of the key challenges facing the fuel cell industry. The U.S. Department of Defense currently lacks an effective distribution system for conventional fuel cell fuels such as methanol and propane. Instead, the DOD has emphasized the need to achieve compatibility with specialty military fuels for which distribution networks already exist.

These specialty fuels are prominent across a wide range of military applications. For example, JP-8, a fuel similar to commercial diesel and aviation fuel, is considered the most prominent fuel on the battlefield, powering everything from tactical generators and unmanned vehicles to the military's mine resistant ambush protected (MRAP) vehicles, helicopters and fighter aircraft.

The difficulty in engineering fuel cells that can run on these fuels stems primarily from their high sulfur content: up to around 3,000 ppmv S for jet fuels (JP-8, JP-5) and 10,000 ppmv S for naval distillate (NATO F-76). By comparison, commercial gasoline contains 30 ppmv S, while commercial diesel has around 15 ppmv.

This high sulfur content poisons the reformer and electrode catalysts found in a fuel cell stack. Sulfur compounds in the liquid hydrocarbons must therefore be reduced to less than 0.1 ppmw for polymer electrolyte membrane fuel cells (PEMFCs) and to below 30 ppmw for solid oxide fuel cells (SOFCs).
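Those figures imply very large cleanup factors. A quick sketch of the arithmetic, treating ppmv and ppmw as roughly interchangeable for order-of-magnitude purposes (an approximation, since one is by volume and the other by weight):

fuel_sulfur_ppm = {"JP-8/JP-5 jet fuel": 3000, "NATO F-76 naval distillate": 10000}
cell_limits_ppm = {"PEMFC": 0.1, "SOFC": 30}

for fuel, sulfur in fuel_sulfur_ppm.items():
    for cell, limit in cell_limits_ppm.items():
        print(f"{fuel} -> {cell}: sulfur must drop ~{sulfur / limit:,.0f}-fold")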

According to Xialiang Ma of Altex Technologies Corporation, the development of new deep desulfurization processes for liquid hydrocarbon fuels has therefore become one of the major challenges in developing hydrocarbon fuel processors for military fuel cell applications. As a result, the DOD has been supporting efforts to engineer hydrocarbon-compatible fuel cell processors. For example:

- Adaptive Materials Inc. has been heavily involved in developing technologies that enable the use of JP-5 and JP-8 in fuel cells.

Read more...

Shuttle Leaves Space Station To Begin Trip Home

As the spacecraft sailed 215 miles above Bolivia, pilot Greg Johnson gently pulsed Endeavour's steering jets at 11:55 p.m. EDT on Sunday (0355 GMT on Monday) to back away from the docking port that has anchored the shuttle since its arrival on May 18.

Endeavour delivered the station's premier science experiment -- the $2 billion Alpha Magnetic Spectrometer particle detector -- and a pallet of spare parts intended to tide over the orbital outpost after the shuttle program ends.

"Endeavour departing," radioed station flight engineer Ron Garan. "Fair winds and following seas, guys."

"Thanks Ron," replied Endeavour commander Mark Kelly. "Appreciate all the help."

Afterward, Endeavour maneuvered to within about 950 feet of the station to test a new automated rendezvous system being developed for NASA's next spaceship, the Multi-Purpose Crew Vehicle, intended to fly astronauts to the moon, asteroids and eventually to Mars.

"We're now separating -- that's the closest we're going to get," Kelly radioed to Garan.

Then Kelly fired the shuttle's thrusters, and Endeavour pulled away from the space station for the last time. NASA plans to decommission Endeavour, its youngest shuttle with 25 voyages, and send it to a museum in Los Angeles for display.

One final shuttle mission is planned before the United States ends the 30-year-old shuttle program. Atlantis is due to launch July 8 with a year's worth of supplies for the station, a contingency plan in case the commercial companies hired to take over supply runs to the station encounter delays with their new vehicles.

The shuttles are being retired to save the $4 billion annual operating expenses so NASA can develop new vehicles that can travel beyond the station's 220-mile-high (355-km-high) orbit.

During its 12 days at the station, the Endeavour crew conducted four spacewalks to complete construction of the U.S. side of the $100 billion outpost, a project of 16 nations that has been under assembly in orbit since 1998.

Endeavour is due back at the Kennedy Space Center in Florida at 2:35 a.m. EDT (0635 GMT) on Wednesday, the same day sister ship Atlantis is scheduled to reach the launch pad for NASA's 135th and final flight.

Read more...

What It Takes to Be a Landlord

The residence at 3809 Shannon Drive in northeast Baltimore has seen better days. Way better, to be honest. A two-story red-brick row house built in 1952, the 900-square-foot, three-bedroom unit has a tiny kitchen, rickety stairs, worn carpets, dingy and obsolete bathrooms, and, most off-putting, a concrete basement floor that curls up at either end in the shape of a half-pipe skateboard run. That's a sign of ominous settling and water damage. The nearest bus stop is a hike of four blocks. No supermarkets or restaurants are nearby. You're five miles from downtown. It feels like 25.

In a city rife with empty and deteriorating homes, no wonder the family of the elderly owner, who moved out last year to live with relatives, couldn't get their original asking price of $119,900 in February 2010. Nor could they attract a buyer when they lowered the price to $99,900 in May, or to $65,000 in October. Once they dropped the price to $55,000 in January 2011, professional real estate investors sensed an opportunity, like bottom-fishers snaring distressed stocks and bonds. So 3809 Shannon is getting a new lease on life.

The entrepreneur who bought it as-is in April for $42,000 is 45-year-old Baltimorean Fayz Khan, an automotive engineer turned full-time landlord. Khan thinks he can earn a return that would make Wall Street financiers envious. That's conceivable if Khan can recondition this pile of bricks quickly, at minimal cost, and get a reliable tenant who qualifies for a government rent subsidy. Khan thinks he can get $1,400 a month -- twice what plenty of nearby homeowners are paying each month on their mortgages, and about the same as the rent on a luxurious downtown condo. But that's what the federal government sets as "fair market rent" for a three-bedroom townhouse in Baltimore. Allowable rents are higher still in places like New York City and San Francisco.

Going down-market

This deal highlights the essential truth of property investing today: To boost the odds that you'll make a profit, you need to go down-market, where the rents are exorbitant relative to the cost of ownership. This is due to a drastic shortage of decent low- and middle-income rentals. For many years, it paid to buy rental houses even if the monthly expenses exceeded the income because you could count on leverage, high appreciation and tax breaks to produce a profit in a few years. But that business model does not make sense in 2011. Property values are not appreciating, so you can't expect to buy a place and flip it for a fast return. You might be able to profit from a modest apartment building or duplex, but overbuilt high-rise condos and foreclosed suburban homes aren't economically feasible as rentals. The idea now is to insist that every house or apartment deliver "positive cash flow" (a net monthly operating profit) with no exceptions. If a house cannot "cash flow," as the pros say, you need either to walk away from the deal or to offer so little that if by chance your bid is accepted, the numbers work.
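The cash-flow test is easy to make concrete. In the sketch below, the purchase price and target rent come from the Shannon Drive example above; every expense line is a hypothetical assumption, and renovation costs are left out.

purchase_price = 42_000   # from the article
monthly_rent = 1_400      # the federal "fair market rent" cited above

monthly_expenses = {      # all assumed for illustration
    "taxes and insurance": 250,
    "maintenance reserve": 150,
    "vacancy allowance": 100,
    "management": 110,
}

cash_flow = monthly_rent - sum(monthly_expenses.values())
print(f"Monthly cash flow: ${cash_flow}")                                        # $790
print(f"Annual yield on purchase price: {12 * cash_flow / purchase_price:.1%}")  # ~22.6%

If the bottom line were negative, the rule described above says to walk away or bid low enough that it turns positive.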

Accomplishing that isn't as easy as buying cheap stocks. First, you have to come up with enough cash to launch your business with a couple of properties because banks have all but ceased lending to inexperienced landlords. Then, take a hard look at your handyman skills: If you can barely hang a light fixture, you're vulnerable to all kinds of unpleasant surprises. It also helps to know the local real estate market, local politics, and which neighborhoods are stable and which are in irreversible decline.

Shannon Drive is a tidy street in a neighborhood of homeowners with a strong community association, and bringing a property up to snuff could help preserve the area. But Peter Giardini, a veteran real estate investor whom Khan has hired to be his coach and adviser, has a tough litmus test for would-be landlords: "What if you were faced with evicting a family in the month of December? Could you do it? If you hesitate more than three seconds, I'd say you don't have the temperament for this business."

Read more...

Lockheed Strengthens Network Security After Hacker Attack

Lockheed Martin said on Sunday that it had stepped up its investigation into a sophisticated hacking attack on its computer networks and bolstered security measures for gaining remote access to its systems.

Lockheed and RSA Security, which supplies coded access tokens to millions of corporate users and government officials, said they were still trying to determine whether the attack had relied on any data that hackers had stolen from RSA in March or if it had exploited another weakness.

Lockheed, which is based in Bethesda, Md., said on Saturday night that the attack, which occurred on May 21, was “significant and tenacious.” Lockheed officials said they had stopped the attack shortly after hackers got into a system, adding that no customer or company data was compromised.

Sondra Barbour, Lockheed’s chief information officer, sent a memo to the company’s employees on Sunday, saying that its systems remained secure. She said Lockheed had quickly shut down remote access to its network after the attack began.

Still, the attack was significant enough that it was described in briefing materials provided to President Obama, the White House spokesman, Jay Carney, said on Sunday. He said the damage was “fairly minimal.”

Government officials have said Lockheed Martin, the nation’s largest military contractor, and other military companies face frequent attacks from hackers seeking national security data.

Officials at Lockheed and RSA Security, a division of the EMC Corporation that provides the SecurID brand of electronic access tokens, said they were working with federal officials to investigate how the attack occurred and who was behind it.

Ms. Barbour said Lockheed also had accelerated a plan to increase network security. The company has upgraded the SecurID tokens supplied by RSA and is resetting all user passwords. Lockheed also switched from four-digit to eight-digit access codes, which are randomly generated by the tokens.
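The jump from four to eight digits is larger than it may sound: it multiplies the space of possible codes by a factor of ten thousand.

four_digit_codes = 10 ** 4    # 10,000 possible codes
eight_digit_codes = 10 ** 8   # 100,000,000 possible codes
print(f"A guessing attacker now faces {eight_digit_codes // four_digit_codes:,}x more possibilities")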

Lockheed officials said on Friday that the attack on its systems might have been linked to one on the RSA network in March. At the time, RSA said it had sustained a data breach that could have compromised some of its security products. Its announcement shocked computer security experts, particularly because its systems are widely used.

Shortly after RSA announced that breach, Lockheed, like many other large companies, said it had added an additional password to the process employees used to connect to its system from remote locations.

Read more...

As PC Markets Slow, Nvidia Aims at Tablets

Jen-Hsun Huang has rallied his engineers and developers before around his grand plans for Nvidia, the computer chipmaker he helped found nearly 20 years ago.

Mr. Huang — a Taiwanese immigrant, onetime table tennis champ and Stanford-educated electrical engineer — took a gamble in 2006 on a graphics chip that would give his company the lead in the most sophisticated computing power used in moviemaking and science. And in 2003, he energized Nvidia after it lost a lucrative deal for supplying a graphics chip to the Microsoft Xbox game machine.

Now he wants the company to make another shift, stretching beyond graphics to build the chips that power smartphones and tablets.

“We used to be a PC graphics company — only PCs, only graphics,” Mr. Huang said in an interview last week. “We have reinvented Nvidia.”

It is an opportune time for the shift. Tablets are poised to surge, as PC sales are slowing. Meanwhile, the technology in PCs is changing, threatening the company's old market. Intel and Advanced Micro Devices both sell main processors that include graphics abilities, cutting out the need for add-on graphics processors and already eating into some Nvidia sales.

“We are well positioned to go after a market opportunity that is sixfold the market opportunity of the Nvidia you knew from the past,” Mr. Huang recently told analysts.

Behind Mr. Huang’s showmanship and bold plans is the Tegra 2, the company’s latest mobile chip. This chip, based on the power-efficient chip architecture from ARM Holdings, has started to appear in a number of smartphones, like the Droid X2, and in tablets like the Samsung Galaxy Tab 10.1 that run on Google’s Android operating system. Mr. Huang envisions Nvidia becoming as essential to Android as Intel has long been to Windows PCs.

Although few people in the chip industry dismiss Mr. Huang’s ideas, the path will be difficult. “There’s a lot of uncertainty about how they are going to outgrow their core graphics business,” said Rajvindra Gill, an analyst with Needham & Company, which recently downgraded Nvidia’s stock to hold from buy.

Still, the shift appears to be a necessary step.

A recent report from Jon Peddie Research shows that Nvidia’s share of the market for graphics processing — including integrated and stand-alone chips — dropped eight percentage points year over year in the first quarter to 20 percent, while A.M.D.’s share rose three percentage points and Intel’s rose almost five percentage points.

Read more...

Bringing Efficiency to the Infrastructure

Wireless sensors can now collect and transmit information from almost any object — for instance, roads, food crates, utility lines and water pipes. And improved software helps interpret the huge flow of information, so that raw data becomes useful knowledge for monitoring and optimizing transport and other complex systems. The efficiency payoff, experts say, should translate into big reductions in energy used, greenhouse gases emitted and natural resources consumed.

Smart infrastructure is a new horizon for computer technology. Computers have proven themselves powerful tools for calculation and communication. The next step, experts say, is for computers to become intelligent instruments of control, linking them to data-generating sensors throughout the planet’s infrastructure. “We are entering a new phase of computing, in which computers will be interacting with the physical world as never before,” said Edward Lazowska, a professor of computer science at the University of Washington.

Computer-enhanced infrastructure promises to be a lucrative market. And the outlook has seemingly improved in the economic downturn, as governments around the world embrace stimulus spending that relies heavily on public works projects, both high-tech and low.

A handful of big technology corporations, including I.B.M., Cisco and General Electric, have major initiatives under way — I.B.M. has even branded its campaign, “Smarter Planet.” Yet many other companies, both large and small, are also pursuing opportunities.

Just how large the market will be and how quickly it will develop remain uncertain. The early smart-infrastructure ventures often seem like applied science projects, encouraging but small scale. It is not clear whether they will work outside the laboratory, where they must turn a profit or justify higher taxes or user fees. Much of the early Internet investment, after all, came to grief.

The smart infrastructure wave, analysts warn, could bring a similar cycle of enthusiasm and disappointment. Yet, like the Internet, they say, the technology will prevail in the long run.

“There will be a lot of hype and a lot of things that don’t pan out,” said Rosabeth Moss Kanter, a professor of business administration at the Harvard Business School. “But the direction is absolutely right. We’ve barely scratched the surface of how information technology can help control and conserve energy use.”

Read more...

More Than Tech Needed to Reduce CO2 Footprint of Buildings

Energy use in buildings could be cut by as much as 60 percent by mid-century, but doing so would take more than just adopting energy-saving technologies. That’s according to the findings of a four-year study looking at residential and commercial building sectors around the world and published Monday by the Geneva-based World Business Council for Sustainable Development, a global business association. The report, entitled “Transforming the Market: Energy Efficiency in Buildings,” is being touted as the most rigorous study ever conducted on the subject and includes a sweeping road map for the building industry to achieve this energy-cutting goal. “Energy efficiency is fast becoming one of the defining issues of our times, and buildings are that issue’s elephant in the room,” said Björn Stigson, president of the business council, in a statement.


Besides the adoption of technology, like high-insulating windows and walls or energy-efficient lighting, the report makes five principal recommendations. They include: strengthening building codes and energy labeling for more transparency; using subsidies and price signals to incentivize energy-efficient investments; encouraging a more integrated design approach among building professionals; enabling the workforce to save energy; and fostering an energy-aware culture.


The report’s authors said they took a “bottom-up, market-driven approach” to understanding the barriers to lower energy use. Some of the more interesting targets include: for investors, to use energy efficiency analysis to enhance traditional decision-making; for suppliers and manufacturers, to develop marketing campaigns that promote a building’s total energy performance rather than single components; and for architects, to design buildings that can be more easily retrofitted with new technologies as they’re developed in the future.


This isn’t the first study to stress how important changes in the building sector are when it comes to achieving global climate change goals. Buildings consume more energy — and contribute more greenhouse gases — than industry or transportation, accounting for about 40 percent of the world’s energy use. The topic has become so uncontroversial that this morning the conservative Wall Street Journal, rarely prone to supporting eco-causes, published a special section on the “green house of the future,” complete with flowery language that could have been pulled from a 1970s hippie manifesto on eco-living.


According to the study, many energy efficiency projects are economically feasible at today’s energy costs. A $150 billion annual investment in building energy efficiency in the six markets studied (U.S., EU, Japan, China, India and Brazil) would reduce carbon footprints by 40 percent with a five-year discounted payback, the study found. Doubling that investment could achieve a payback within five to 10 years and would reduce the footprint another 12 percent. But achieving a higher reduction target, such as the 77 percent called for by the Intergovernmental Panel on Climate Change, could not be justified on economic grounds at today’s energy prices and would require other steps, such as subsidies.
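
As a rough illustration of the arithmetic, a discounted payback period simply counts the years until the present value of the energy savings covers the up-front investment. The numbers below are scaled-down stand-ins chosen for the example, not figures from the report.

    def discounted_payback_years(investment: float, annual_saving: float,
                                 discount_rate: float, max_years: int = 100) -> float:
        """Years until cumulative discounted savings repay the investment."""
        cumulative = 0.0
        for year in range(1, max_years + 1):
            cumulative += annual_saving / (1.0 + discount_rate) ** year
            if cumulative >= investment:
                return float(year)
        return float("inf")  # never pays back within the horizon

    # Illustrative: an investment of 150 (think $150 billion, scaled) saving
    # 35 a year, discounted at 5 percent, pays back in year 5.
    print(discounted_payback_years(150, 35, 0.05))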

Read more...

Levees Cannot Fully Eliminate Risk of Flooding to New Orleans

Levees and floodwalls surrounding New Orleans -- no matter how large or sturdy -- cannot provide absolute protection against overtopping or failure in extreme events, says a new report by the National Academy of Engineering and the National Research Council. The voluntary relocation of people and neighborhoods from areas that are vulnerable to flooding should be considered as a viable public policy option, the report says. If relocation is not feasible, an alternative would be to elevate the first floor of buildings to at least the 100-year flood level.



The report is the fifth and final one to provide recommendations to the Interagency Performance Evaluation Task Force (IPET), formed by the U.S. Army Corps of Engineers to examine why New Orleans' hurricane-protection system failed during Hurricane Katrina and how it can be strengthened. The previous four reports by the NAE and Research Council examined various draft volumes of the IPET. This report reviews the 7,500-page IPET draft final report, reflects upon the lessons learned from Katrina, and offers advice for how to improve the hurricane-protection system in the New Orleans area.



Although some of the report's recommendations to enhance hurricane preparedness have been widely acknowledged for years, many have not been adequately implemented, said the committee that wrote the report. For instance, levees and floodwalls should be viewed as a way to reduce risks from hurricanes and storm surges, not as measures that completely eliminate risk. As with any structure built to protect against flooding, the New Orleans hurricane-protection system promoted a false sense of security that areas behind the structures were absolutely safe for habitation and development, the report says. Unfortunately, substantial risks were never adequately communicated to the public, and there was undue optimism that the 350-mile structure network could provide reliable flood protection, the committee noted.



Comprehensive flood planning and risk management should be based on a combination of structural and nonstructural measures, including the option of voluntary relocations, floodproofing and elevation of structures, and evacuation, the committee urged. Rebuilding the New Orleans area and its hurricane-protection system to its pre-Katrina state would leave the city and its inhabitants vulnerable to similar disasters. Instead, settlement in areas most vulnerable to flooding should be discouraged, and some consideration should be given to new designs of the New Orleans metro hurricane-protection system. As part of the future design, relocation of some structures and residents would help improve public safety and reduce flood damages.



For structures in hazardous areas and residents who do not relocate, the committee recommended major floodproofing measures -- such as elevating the first floor of buildings to at least the 100-year flood level and strengthening electric power, water, gas, and telecommunication supplies. Also, a comprehensive evacuation program should be established that includes well-designed and tested evacuation plans; improved local and regional shelters that would make evacuations less imposing; and long-term strategies that could enhance the efficiency of evacuations, such as locating facilities for the ill and elderly away from hazardous areas.



Furthermore, the 100-year flood level -- which is a crucial flood insurance standard -- is inadequate for flood protection structures in heavily populated areas such as New Orleans, where the failure of the system would be catastrophic. Use of this standard in the New Orleans area has escalated the costs of protection, encouraged settlement in areas behind levees, and resulted in losses of life and vast federal expenditures following numerous flood and hurricane disasters, the committee said.
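
The shorthand "100-year flood" means a flood with a 1 percent chance of being exceeded in any given year, not one that arrives on a schedule, and over the life of a building those odds compound. A quick calculation of the standard formula makes the committee's point:

    def chance_of_at_least_one(return_period_years: float, horizon_years: int) -> float:
        """P(at least one exceedance in n years) = 1 - (1 - 1/T)^n,
        assuming each year is independent."""
        return 1.0 - (1.0 - 1.0 / return_period_years) ** horizon_years

    print(f"{chance_of_at_least_one(100, 30):.0%}")   # ~26% over a 30-year mortgage
    print(f"{chance_of_at_least_one(100, 100):.0%}")  # ~63% over a century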



Regarding IPET's draft final report, the committee concluded that it contained important advances in characterizing and understanding the nature of Gulf hurricane storm surges and waves -- in particular explaining the storm surge generated by Hurricane Katrina, how waters from the surge entered the New Orleans metro region, and the amount of flooding across the city. In addition, IPET's studies have made significant contributions to simulating hurricane impacts, characterizing the collective effects of hurricane damage, and improving knowledge of regional vulnerability to hurricanes and storm surge.



However, the final IPET report should provide a better explanation of its methods to evaluate flood risks, the committee said. The final report also should be written in a clearer and more organized manner, using lay terminology that can be understood by the public and officials. Such clarity is lacking in Volume VIII, which was the principal focus of the final two years of IPET's study. This volume assesses the risks posed by future tropical storms and contains inundation maps that show the areas at most risk for future flooding. These maps are important to citizens, businesses, and government agencies for planning resettlement and redevelopment in the region, but the volume contains limited discussion of the implications of these maps. Moreover, at times the extensive technical information presented in the volume overshadows key results.



The committee also recommended that a professional technical firm prepare a second document for the public and officials that would be shorter and focus on explaining IPET report results and their implications for reconstruction and resettlement.

Read more...

Saturday, May 28, 2011

Better Protection Sought Against Devastating Earthquakes

The 2007 earthquake fiercely shook the capital, Lima, but its devastating epicentre was about 200km (124 miles) to the south, near the town of Pisco, a small fishing port built largely of adobe - mud bricks which Peruvians have used for thousands of years.

More than 500 people were killed and about 75,000 homes were left uninhabitable.

For Peruvian engineer Marcial Blondet, it was the devastating quake in 1970 that first motivated him to develop earthquake-resistant buildings, particularly for those who could least afford them.

Some 70,000 people died in the mountainous region of Huaraz, many of them in an avalanche of snow, ice and rock which obliterated the town of Yungay. It was the deadliest earthquake in Latin American history.

'Tragic combination'

"Adobe and earthquakes are a perverse and tragic combination," says Mr Blondet.

"We are right in the middle of the most seismic area in the world. We've had many, many huge quakes and we are still waiting for the super big one.

"But a very large percentage of the people here are poor, so adobe is the only thing they can use to build their homes. Unfortunately, that's the case for millions of people in seismic zones around the world."

During more than 35 years of research, Mr Blondet and his team have tried a range of natural and industrial materials to reinforce weak mud-brick structures. Bamboo cane was one option, but there is not enough of it.

Mud-brick structures are tested vigorously on shaking tables which simulate earthquakes in the structural engineering laboratory at Lima's Catholic University.

Watching the simulations, it is easy to see just why adobe houses, home to about 40% of Peruvians, are such death-traps.

First a vertical crack appears, then the outer wall falls outwards, before the other walls crumble and the roof caves in.

"The people on the street are killed by the walls that fall out, the people inside are killed by the roof that falls in. It's terrible," says Mr Blondet.

"No-one should live in a house that behaves like this. A house is a place where we go when we want to feel protected and safe, so it's unbearable, completely unacceptable - an abomination - that your house kills you."

Read more...

Zapping Deadly Bacteria Using Space Technology

Using plasma – superheated, electrically charged gas – Max Planck Institute for Extraterrestrial Physics director Gregor Morfill is developing ways to kill bacteria and viruses that can cause infections in hospitals.

“What we have with plasma is the possibility to supplement our own immune system,” says Dr. Morfill.

The research began on the International Space Station (ISS), where his ESA-funded physics experiments have been running since 2001.

The first was ‘Plasmakristall Experiment Nefedov’ in cooperation with Russian partners. Later, the PK-3 Plus and PK-4 experiments flew in 2006 as part of ESA’s Astrolab mission.

“It’s the longest-running space experiment in the history of human spaceflight,” notes Dr. Morfill. More than two dozen astronauts and cosmonauts have operated the equipment aboard the ISS.

Laboratory prototype plasma device for sanitising hands. Credits: Max-Planck Institute for Extraterrestrial Physics

The work in space led to the realisation that plasma might have very practical terrestrial applications – and Dr. Morfill turned to ESA's Technology Transfer Program to make it a reality.

Plasma dispensers can tackle a serious problem: in recent years, health experts have seen a dramatic rise in super-strains of bacteria that can survive the strongest antibiotics in medicine’s arsenal.

One, methicillin-resistant Staphylococcus aureus – better known as MRSA – kills 37,000 people each year in the EU alone. It affects more than 150,000 patients, resulting in extra in-hospital costs of €380 million for EU healthcare systems.

With help from ESA, Dr. Morfill’s team is now focusing on developing a system for hospitals, but cold plasma technology might one day also make it into our homes. Plasma could be used to disinfect toothbrushes and razors instead of UV light, which only sanitizes the surfaces it shines on. Plasma-charged gas would clean in hidden cracks and crevices, too.

At the other end of the spectrum, he says that plasma could be used as a ‘planetary protection system’ to clean satellites and planetary probes so they don’t carry terrestrial bacteria to distant planets.

Read more...

Retrofitted Historic Building Survives Strong Simulated Jolts During UCSD Test

As part of the $1.24 million research project sponsored by the National Science Foundation under the Network for Earthquake Engineering Simulation (NEES) program, the three-story, masonry-infilled, reinforced concrete frame representing structures built in California in the 1920s was tested at the NEES-UCSD Englekirk Structural Engineering Center, home of the world’s largest outdoor shake table. The structure had an unreinforced masonry wall at the bottom story retrofitted with an engineered cementitious composite (ECC). Currently, there is a lack of reliable analysis methods to evaluate the seismic performance of these older masonry structures, and of validated retrofit methods to improve their seismic behavior. The ultimate goal of this project is to provide methods to assess and improve the seismic performance of these older buildings by developing reliable analytical models and effective retrofit techniques.

Many reinforced concrete (RC) and steel frame structures have unreinforced masonry infill walls to serve as interior and exterior partitions. Such construction can be found in many older buildings in the western United States, including pre-1930s buildings in California, and in numerous newer buildings in the midwestern and eastern parts of the country. In fact, RC frames with brick infill are common construction practice in many parts of the world, such as China and Mediterranean countries. However, in spite of the fact that unreinforced masonry infill walls are usually treated as non-structural components, they will interact with the bounding frames when subjected to earthquake loads, explained Benson Shing, a UCSD structural engineering professor and lead researcher on the project.



“When these structures are subjected to severe seismic loads, the interaction of the frame and infill walls can lead to undesired failure mechanisms, such as the brittle shear failure of the columns and the cracking and collapse of the brick infill walls,” Shing said. “The two-bay, three-story frame had one of the bottom-story bays open without any brick infill. This configuration would normally introduce a weak bottom story, which could lead to a catastrophic failure.”

The main focus of the recent tests - which were performed in collaboration with Stanford University and the University of Colorado at Boulder - was to evaluate the effectiveness of two retrofit techniques intended to enhance the seismic performance of these structures. One is the use of an overlay of an engineered cementitious composite (ECC) material to restrain the cracking of the brick walls, and the other is the use of a thin overlay of a glass fiber reinforced polymeric (GFRP) material for the same purpose. The ECC material consists of cement, sand, glass fibers, and other additives. The surface of the glass fibers is specially treated to enhance the tensile ductility of the cement-fiber composite. The ECC material was applied onto the infill wall in the bottom story of the test structure, and the GFRP overlay was used on the second-story walls, one of which had a window opening.

“The interaction of the concrete frame with a masonry wall is a complicated physical phenomenon,” Shing said. “So we try to predict it with more precision using advanced computational models; we hope these tests will provide data that will allow us to validate these models, so we can use them in the future to study the performance of similar types of construction and of structures with different configurations, such as a different number of stories.”

Shing said despite intense shaking of the structure, the retrofit proved very successful. The recent shake tests were a follow-up to seismic tests conducted by Shing and his team on a similar structure on the shake table at the UCSD Englekirk Center in November 2008.

During the November 2008 tests, the engineers subjected a three-story structure with a non-ductile reinforced concrete frame and unreinforced masonry infill walls to simulated earthquake motions. Infill walls can generally improve the seismic safety of a building up to a certain level of earthquake intensity, depending on the number of walls present and their locations. Once the strength of the walls is exceeded by the earthquake force, the failure of such structures can be sudden and catastrophic.

The video below shows the severe damage to the three-story structure that occurred during the November 2008 tests.

Read more...

Engineering Structures that Adapt to Changing Environmental Conditions

Weather conditions such as wind and snow loads can cause failure and collapse of supporting structures in roofs and similar constructions. Based on new hybrid intelligent construction elements (HICE), researchers at the University of Stuttgart have developed a shell structure which is able to adapt to changing environmental conditions. In a further step, the scientists will now use their knowledge to develop machines from these new structural elements which will also be able to react to their environments and adapt to given conditions. According to experts, this development may eventually lead to a significant acceleration of entire construction processes in mechanical, electrical and control engineering.

A research group of six engineers from different fields such as civil, aerospace, mechanical and process engineering is funded by the Deutsche Forschungsgemeinschaft (German Research Foundation) with a grant of €1.858 million for the first three years of a six-year project. The research group began operating in June.

The structural elements (e.g. shafts, levers, tractive or surface elements) are provided with integrated sensors, actuators and control elements. Light-weight and wear-resistant materials increase their functionality. Over the course of three years, the scientists from Stuttgart hope to assemble six newly developed HICEs (membrane shells, adaptive cover elements, tile coating elements, inflexible force transmission elements, hybrid rope elements, bearing and lever elements) into a large-scale demonstrator shell structure measuring five square metres, which will combine all of the HICEs' functionalities. The adaptive shell structure will be translucent and much lighter than conventional supporting structures. If a change in environmental factors such as wind load, wind direction or snow load occurs, the structure will be able to dissipate strain autonomously and adaptively via levers, ties and shell elements in order to prevent failure. The demonstrator will be exhibited by the University of Stuttgart.
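
The article does not spell out the control scheme, but the sense-decide-actuate loop such a structure implies can be sketched in a few lines. Everything below - the element names, the strain threshold, the actuation values - is hypothetical:

    STRAIN_LIMIT = 0.8  # hypothetical fraction of allowable strain that triggers action

    def control_step(strains: dict, actuator_effort: dict) -> dict:
        """One loop iteration: stiffen the element under the worst load,
        relax the rest so the shell redistributes strain."""
        worst = max(strains, key=strains.get)
        if strains[worst] > STRAIN_LIMIT:
            for element in actuator_effort:
                actuator_effort[element] = 1.0 if element == worst else 0.5
        return actuator_effort

    # A snow drift loading the north shell segment, for example:
    readings = {"shell_north": 0.91, "shell_east": 0.40, "shell_south": 0.35}
    effort = {"shell_north": 0.5, "shell_east": 0.5, "shell_south": 0.5}
    print(control_step(readings, effort))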

Portability to all engineering disciplines

In a second phase of the project, the participating researchers will try to show by means of further constructions that HICEs can be applied in all engineering disciplines. By way of example, a hybrid engine bonnet will be developed that may be combined with state-of-the-art "active" bonnets. This could improve pedestrian safety significantly by preventing severe injuries in a collision with this type of bonnet: standard active bonnets can report an impact via additional sensors to an electronic control device, which then prompts the rear part of the bonnet to be lifted upward via a lever structure. This creates a protective distance between the accident victim and the hard engine parts beneath the bonnet. An intelligent hybrid engine bonnet would additionally deform in a specific way in reaction to the parameters of the actual collision. Based on new materials, the bonnet will be able to soften or harden relevant areas of its structure autonomously in order to prevent injuries as far as possible.

In addition, demonstrators for the application of HICEs in shaft-to-collar connections and machine enclosures will be developed.

The participating institutions are the Institutes of Mechanical Handling and Logistics, of Construction Technology and Technical Design, of Textile and Process Engineering, of Aircraft Design, of Design and Construction and of Metal Forming Technology. "Within six years, the research group will have developed an entirely new class of hybrid intelligent construction elements together with its respective constructional and computational methods. We will have reached a new level of systems integration", says research group spokesman Prof. Karl-Heinz Wehking.

Read more...

Engineers Design Self-righting Buildings That Survive Earthquake Test in Style

A new earthquake-resistant structural system for buildings, just successfully tested in Japan, will not only help a multi-story building hold itself together during a violent earthquake, but also return it to standing up straight on its foundation afterward, true and plumb, with damage confined to a few easily replaceable parts.

The team that designed the system was led by researchers at Stanford University and the University of Illinois. During testing on a massive shake table, the system survived simulated earthquakes in excess of magnitude 7, bigger than either the 1994 Northridge earthquake or the 1989 Loma Prieta earthquake in California.

"This new structural system has the potential to make buildings far more damage resistant and easier to repair, so people could reoccupy buildings a lot faster after a major earthquake than they can now," said Greg Deierlein, professor of civil and environmental engineering at Stanford, who led the team that designed the new system.

The system dissipates energy through the movement of steel frames that are situated around the building's core or along exterior walls. The frames can be part of a building's initial design or could be incorporated into an existing building undergoing seismic retrofitting. They are economically feasible to build, as all the materials employed are commonly used in construction today and all the parts can be made using existing fabrication methods.

"What is unique about these frames is that, unlike conventional systems, they actually rock off their foundation under large earthquakes," Deierlein said.

The rocking frames are steel braced-frames, the columns of which are free to rock up and down within steel "shoes" secured at their base. To control the rocking and return the frame to vertical when the shaking stops, steel tendons run down the center of the frame from top to bottom. These tendons are made of high-strength steel cable strands twisted together and designed to remain elastic during shaking. When shaking is over, they rebound to their normal length, pulling the building back into proper alignment.
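
The re-centering behavior follows from a simple moment balance: the elastic tendons must be able to overcome the resistance of the yielded fuses when the frame rocks. A back-of-envelope check of that design condition, using made-up numbers rather than values from the Stanford/Illinois design, looks like this:

    def frame_self_centers(tendon_force_kn: float, tendon_arm_m: float,
                           fuse_force_kn: float, fuse_arm_m: float) -> bool:
        """Simplified criterion: the tendons' restoring moment about the
        rocking toe must exceed the fuses' resisting moment."""
        return tendon_force_kn * tendon_arm_m > fuse_force_kn * fuse_arm_m

    # Illustrative values only:
    print(frame_self_centers(tendon_force_kn=2000, tendon_arm_m=3.0,
                             fuse_force_kn=1500, fuse_arm_m=1.5))  # True: rocks back plumb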

At the bottom of the frame sit steel "fuses" designed to keep the rest of the building from sustaining damage.

"The idea of this structural system is that we concentrate the damage in replaceable fuses," Deierlein said. The fuses are built to flex and dissipate the shaking energy induced by the earthquake, thereby confining the damage. Like electrical fuses, the steel fuses are easily replaced when they "blow out."

Deierlein and his colleagues conducted shake testing of the new system in the last few weeks at the Hyogo Earthquake Engineering Research Center in Miki City, Japan. Using different types of fuses and various shaking parameters, they conducted four major tests, the last on Aug. 24. They had previously developed and tested the individual components of the system and performed computational analyses to simulate the system's performance at laboratories at Stanford and the University of Illinois.

Read more...

Cement's Basic Molecular Structure Finally Decoded

Oddly enough, the three-dimensional crystalline structure of cement hydrate -- the paste that forms and quickly hardens when cement powder is mixed with water -- has eluded scientific attempts at decoding, despite the fact that concrete is the most prevalent man-made material on earth and the focus of a multibillion-dollar industry that is under pressure to clean up its act. The manufacture of cement is responsible for about 5 percent of all carbon dioxide emissions worldwide, and new emission standards proposed by the U.S. Environmental Protection Agency could push the cement industry to the developing world.

"Cement is so widely used as a building material that nobody is going to replace it anytime soon. But it has a carbon dioxide problem, so a basic understanding of this material could be very timely," said MIT Professor Sidney Yip, co-author of a paper published online in the Proceedings of the National Academy of Sciences (PNAS) during the week of Sept. 7 that announces the decoding of the three-dimensional structure of the basic unit of cement hydrate by a group of MIT researchers who have adopted the team name of Liquid Stone.

"We believe this work is a first step toward a consistent model of the molecular structure of cement hydrate, and we hope the scientific community will work with it," said Yip, who is in MIT's Department of Nuclear Science and Engineering (NSE). "In every field there are breakthroughs that help the research frontier moving forward. One example is Watson and Crick's discovery of the basic structure of DNA. That structural model put biology on very sound footing."

Scientists have long believed that at the atomic level, cement hydrate (or calcium-silica-hydrate) closely resembles the rare mineral tobermorite, which has an ordered geometry consisting of layers of infinitely long chains of three-armed silica molecules (called silica tetrahedra) interspersed with neat layers of calcium oxide.

But the MIT team found that the calcium-silica-hydrate in cement isn't really a crystal. It's a hybrid that shares some characteristics with crystalline structures and some with the amorphous structure of frozen liquids, such as glass or ice.

At the atomic scale, tobermorite and other minerals resemble the regular, layered geometric patterns of kilim rugs, with horizontal layers of triangles interspersed with layers of colored stripes. But a two-dimensional look at a unit of cement hydrate would show layers of triangles (the silica tetrahedra) with every third, sixth or ninth triangle turned up or down along the horizontal axis, reaching into the layer of calcium oxide above or below.

And it is in these messy areas - where breaks in the silica tetrahedra create small voids in the corresponding layers of calcium oxide - that water molecules attach, giving cement its robust quality. Those erstwhile "flaws" in the otherwise regular geometric structure provide some give to the building material at the atomic scale that transfers up to the macro scale. When under stress, the cement hydrate has the flexibility to stretch or compress just a little, rather than snapping.

Read more...

Twin Towers, Twin Myths?

One of the crucial technical disputes in American history, perhaps second only to global warming, is underway. It pits hundreds of government technicians who say the World Trade Center buildings were brought down by airplane impact against hundreds of professional architects and building engineers who insist that the Twin Towers could never have collapsed solely due to the planes and are calling for a new independent investigation. It is a fight that is not going away and is likely to get louder as more building trade professionals sign on to one side or the other.

Better than anyone, David Ray Griffin understands the “enormous importance” of Richard Gage, the Bay Area architect and staunch Republican who founded Architects and Engineers for 9/11 Truth.

Griffin, the controversial retired Santa Barbara philosophy professor and theologian (Claremont School of Theology), is regarded as the leading investigative force within what is called the 9/11 Truth movement, with seven 9/11 books to his credit, including his bestseller The New Pearl Harbor. Although sometimes challenged about accuracy, until Gage appeared, Griffin found his greatest stumbling block in public appearances to be this question: if his analysis was true – that two planes could not have brought down three World Trade Center (WTC) buildings without the aid of pre-planted explosives – why didn’t a single U.S. architect or building engineer publicly support him? Now, in three years, Gage has signed up 804 architects and structural engineers, some from top firms, who challenge the official version of the buildings’ collapses.

Read more...

The Elements of a Great Scientific and Technical Dispute

If the scientific fight over the World Trade Center was not so hugely important, it might be viewed as simply ridiculous that core elements of an event could be so severely disputed by people equally pledged to the scientific method. But with the stakes so immense, the vastness of the gap is far from ridiculous and is, in fact, of such magnitude that it is almost certainly going to take wide public understanding of the elements of the dispute to force re-examination of the evidence in a manner that would win the trust of both the public and the experts.

For the record, here is a summary of just some of the technical areas in dispute, and of what the National Institute of Standards and Technology (NIST) and its building-trade and science allies on one side, and its equally credentialed critics (building and structural engineers, architects, physicists, chemists) on the other, put forward as their cases. It was compiled from NIST’s official report and from analysis that included papers and reports by independent professionals or members of groups representing each side of the argument, as well as from some other independent technical experts who have not taken sides.

The dispute takes place against the context that no other high-rise steel building has ever collapsed in such a manner without the use of explosives. NIST alleges that in this special-circumstances case the buildings, like the “unsinkable” Titanic, did just that. NIST’s independent critics believe that what is “titanic” here are NIST’s scientific mistakes, evasions and willful refusal to examine all evidence.

Impact of Planes on Steel Columns
NIST reports that of the 47 core columns in each tower, three in WTC 1 were severed, four sustained heavy damage and five sustained moderate damage, adding up to about 25% of the columns. In WTC 2, five core columns were severed, four sustained heavy damage and one sustained moderate damage, adding up to about 21% of the columns. NIST argues that, in combination with steel beams weakened by fire after the plane impacts stripped fireproofing from them, this was sufficient to trigger a general collapse in both towers. Moreover, in both buildings perimeter columns on the exterior were severed – in one tower, 35 of the 240 perimeter columns.
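
The percentages quoted are easy to verify from the column counts:

    # Damaged core columns (severed + heavy + moderate) out of 47 per tower
    wtc1 = (3 + 4 + 5) / 47
    wtc2 = (5 + 4 + 1) / 47
    print(f"WTC 1: {wtc1:.1%}, WTC 2: {wtc2:.1%}")  # WTC 1: 25.5%, WTC 2: 21.3%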

Read more...

Researcher Thinks "Inside the Box" to Create Self-contained Wastewater System for Soldiers, Small Towns

Cheaper. Better. Faster. Most people will say you can't have all three. But don't tell that to Dr. Jianmin Wang, a professor of civil, architectural and environmental engineering at Missouri University of Science and Technology.

Wang has created a wastewater system "in a box." Each system, built by re-purposing a shipping container, is low power, low maintenance and highly efficient. Built from weathering steel, these containers are designed to be tough and can be dropped on site by helicopters.

The system’s scorecard is so good that it could be deployed anywhere – from small, rural communities to forward operating bases, like those in Iraq or Afghanistan. Currently, the typical 600-soldier forward operating base requires a daily convoy of 22 trucks to supply the base with fuel and water and to dispose of wastewater and solid waste. With few mechanical parts and a small footprint, the system is ideal for military use, Wang says.

“Currently, human wastes are typically burned in burn pits, and the wastewater is usually hauled away and dumped by local contractors,” Wang explains. “This results in high costs, security issues, potential health risks, negative environmental impacts to the hosting country and a potential negative image.

“Moreover, almost all fresh water used in the camp – including water for drinking, bathing, showering, laundry, car washing and toilet flushing – is from outside sources in the form of bottled and surface water. A deployable and easy-to-use water reclamation station, which transforms wastewater into reusable water within the base, would improve the base environment, security, soldiers’ health, stewardship of foreign lands and concurrently reduce cost and fresh water demand from off-base sources.”

Current wastewater treatment options include membrane bioreactor, activated sludge, fixed film or on-site septic systems. Similar to these methods, Wang’s process uses microorganisms to break down the organic pollutants. Membrane bioreactor, activated sludge process and fixed-film process have been built using standard shipping containers, too. But that’s where the similarities end.

The membrane bioreactor process, while similar in size and in the quality of effluent produced, has far higher energy and maintenance costs, and parts that are up to 10 times more expensive.

Read more...

Research on the Potential of Geopolymer Concrete

Dr. Erez Allouche, assistant professor of civil engineering at Louisiana Tech University and associate director of the Trenchless Technology Center, is conducting innovative research on geopolymer concrete, finding ways to use a waste byproduct from coal-fired power plants and to help curb carbon dioxide emissions.

Inorganic polymer concrete (geopolymer) is an emerging class of cementitious materials that utilize "fly ash", one of the most abundant industrial by-products on earth, as a substitute for Portland cement, the most widely produced man-made material on earth.

Portland cement production is a major contributor to CO2 emissions, as an estimated five to eight percent of all human-generated atmospheric CO2 worldwide comes from the concrete industry. Production of Portland cement currently tops 2.6 billion tons per year worldwide and is growing at 5 percent annually.

Geopolymer concrete has the potential to substantially curb CO2 emissions, produce a more durable infrastructure capable of design life measured in hundreds of years instead of tens, conserve hundreds of thousands of acres currently used for disposal of coal combustion products, and protect aquifers and surface bodies of fresh water via the elimination of fly ash disposal sites.

In comparison to ordinary Portland cement (OPC), geopolymer concrete (GPC) features greater corrosion resistance, substantially higher fire resistance (up to 2400° F), high compressive and tensile strengths, a rapid strength gain, and lower shrinkage.

Perhaps geopolymer concrete's greatest appeal is its life-cycle greenhouse gas reduction potential: as much as 90% when compared with OPC.

Read more...

Friday, May 27, 2011

Superior Sound For Telephones, Mobile and Related Devices

mp3 for phone calls – considering the poor sound quality of many phone calls, this is a great idea. Videoconference calls in particular can be unintentionally awkward because the participants start to speak at the same time, a symptom of the long delay times and poor quality of today's video calls. Fraunhofer's task was therefore to improve the quality and simultaneously minimize the delay. The technology that makes this possible is called Enhanced Low Delay Advanced Audio Coding – AAC-ELD for short. It was developed by Manfred Lutzky, Marc Gayer, Markus Schnell and their team from the Fraunhofer Institute for Integrated Circuits IIS in Erlangen.

Fraunhofer IIS is known as the main inventor of mp3, the audio codec that made it possible to greatly reduce the size of music and other audio files without impairing the sound. Implementing something similar for the telephone and other devices was easier said than done. "The algorithm requires a certain amount of time to encode the data and to decode it again at the other end of the line. Because it looks ahead at data that is still in the future, it must wait for that data to arrive. This can result in a situation where interactive communication is very difficult," explained Markus Schnell. For several years, the IIS team continued to improve the algorithm to shorten the delay without impairing the quality. The solution: "We attempted to further minimize the area that is forward-looking and to only process current data. We did that until we found an optimum balance between quality and delay," said Schnell.

One technology – many applications

The results are audibly good: the delay with Enhanced Low Delay AAC is only about 15 milliseconds. Within this extremely short timespan, the algorithm manages to reduce the audio data to less than one-thirtieth of its original volume without major losses in sound quality. Thanks to this performance, the coding process has already caught on in many areas. Marc Gayer explains, "Currently, AAC Low Delay, the forerunner of AAC-ELD, is the de facto standard for many video-conferencing systems. But the process is also increasingly applied in radio broadcasts, for example for live sports reports."
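
That 15-millisecond figure is governed by how much audio the encoder must buffer before it can emit a frame. The framing numbers below are assumptions chosen to reproduce the figure, not specifications from the article:

    def algorithmic_delay_ms(frame_samples: int, lookahead_samples: int,
                             sample_rate_hz: int) -> float:
        """One-way coding delay: a full frame plus its lookahead must arrive
        before the encoder can process it."""
        return (frame_samples + lookahead_samples) / sample_rate_hz * 1000.0

    # e.g. a 480-sample frame with 240 samples of lookahead at 48 kHz:
    print(algorithmic_delay_ms(480, 240, 48_000))  # -> 15.0 ms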

The advantage of improved speech transmission can also be heard in mobile devices such as the iPhone 4 and the iPad 2, which support video telephone transmissions in particular. One very special application the developers created promotes communication between groups that are socially close to each other: a system that makes it possible to play games together across the borders of cities or countries. "Thanks to the optimized image and sound quality, there is the impression that game partners who are far apart from each other are not in front of screens, but actually sitting across from one another," said Manfred Lutzky.

Currently, more than 120 scientists and engineers are working on audio and multimedia technologies at IIS. Marc Gayer, Manfred Lutzky and Markus Schnell are receiving the 2011 Joseph von Fraunhofer Prize on behalf of the entire team.

Read more...

Flexible Films For Photovoltaics

What do potato chips and thin-film solar cells have in common? Both need films that protect them from air and water vapor: the chips in order to stay fresh and crisp; the solar cells in order to have a useful life that is as long as possible. In most cases, glass is used to protect the active layers of the solar cells from environmental influences. Dr. Klaus Noller from the Fraunhofer Institute for Process Engineering and Packaging IVV in Freising explains the advantages of a plastic film: "The films are considerably lighter – and flexible. They make new production processes possible that enable significant reductions in the cost of manufacturing a photovoltaic module." Instead of working with individual glass plates, the solar cells could be printed onto a plastic film and then encapsulated with the barrier film: photovoltaic modules on a roll.

It is no small goal that the researchers from two Fraunhofer institutes want to achieve. The film and packaging developers led by Dr. Klaus Noller are working with Dr. Sabine Amberg-Schwab from the Fraunhofer Institute for Silicate Research ISC in Würzburg, an expert in hybrid polymers called ORMOCERs – an in-house development of the ISC. She and her team have worked for almost 20 years on a coating material, based on ORMOCER, that can be used as an effective barrier against oxygen and water vapor. What has been created is a barrier lacquer that the researchers combined with another known barrier material: silicon dioxide. "The results were astounding," said Amberg-Schwab. "A barrier effect that is far better than could be expected from simply adding the two layers. The reason for this lies in special effects that are generated between the two materials."
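
The textbook expectation the researchers are comparing against is the ideal-laminate model, in which the permeation rates of stacked layers combine like electrical conductances in series; a measured rate well below that prediction is the synergy Amberg-Schwab describes. The values below are illustrative, not measurements from the project:

    def laminate_wvtr(layer_wvtrs: list) -> float:
        """Ideal-laminate model: 1/WVTR_total = sum of 1/WVTR_i,
        like conductances in series."""
        return 1.0 / sum(1.0 / w for w in layer_wvtrs)

    # Illustrative water vapor transmission rates in g/(m^2 * day):
    silica_layer, ormocer_layer = 0.5, 1.0
    print(laminate_wvtr([silica_layer, ormocer_layer]))  # model predicts ~0.33
    # The reported synergy means the real stack performs even better
    # (a lower rate) than this simple series prediction.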

For ideal application on a film, the team in Würzburg developed an ORMOCER coating material that is easy to process and cure. The damp-heat test was a particular obstacle: the cured lacquer coating must remain stable at 85 degrees Celsius and 85 percent humidity, since solar cells on a roof or facade are meant to withstand extreme weather conditions and temperatures for as long as possible. The team at the Freising institute faced the challenge of developing a process with which the barrier layers can be applied to the film perfectly and economically.

This was achieved with a roll-to-roll process. The coating line was optimized continuously to meet the special requirements: the ORMOCERs must be applied in a dust-free environment, with the layer thickness being extremely thin yet forming a continuous film. During this, the coated side must not touch any of the rollers at any time, as that would damage the layer. The patented process makes it possible to manufacture tough high-barrier films in a cost-effective and environmentally friendly way. Industrial partners are already using this process. Dr. Sabine Amberg-Schwab from the ISC and Dr. Klaus Noller from the IVV will receive one of the three 2011 Joseph von Fraunhofer Prizes for their developments.

Read more...
