Tuesday, December 11, 2007

Clemson researcher studies carbon fibers for nuclear reactor safety

Carbon nanofibres may be very useful in Gen IV nuclear power reactors. The US Department of Energy (DOE) is extending support to a professor to carry out research on these fibres.

K.S.Parthasarathy



Public release date: 10-Dec-2007

Contact: Amod Ogale
ogale@clemson.edu
864-656-5483
Clemson University
Clemson researcher studies carbon fibers for nuclear reactor safety

CLEMSON, S.C. –– Carbon fibers that are only one-tenth the size of a human hair, but three times stronger than steel, may hold up to the intense heat and radiation of next generation nuclear power generators, providing a safety mechanism. The “Gen IV” power-generating reactors are being designed to provide low-cost electricity, but with a built-in safety mechanism current reactors do not have.

The Department of Energy (DoE) has awarded chemical engineering professor Amod Ogale, deputy director of the Center for Advanced Engineering Fibers and Films (CAEFF), a $450,000 grant to research carbon fibers embedded into a carbon matrix that do not melt in extreme temperatures for potential use in Gen IV power generators. Presently, about 20 percent of electricity produced in the United States is from nuclear sources.

“One proposed design of the next generation of nuclear plants will consist of a helium-cooled generator that will operate in the range of 1,200 to 1,800 degrees Fahrenheit,” says Ogale. “A critical safety requirement for this reactor is that it can shut down safely in the event of a malfunction where coolant flow is interrupted. Steel alloys currently used internally in reactors melt at the peak temperature of 2,500 degrees Fahrenheit, whereas carbon fiber composites do not.”
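
For readers who think in Celsius, the standard conversion puts those operating temperatures at roughly 650 to 980 degrees Celsius and the quoted melting point near 1,370 degrees Celsius; a trivial sketch of the arithmetic:

```python
def f_to_c(f):
    """Convert Fahrenheit to Celsius: C = (F - 32) * 5/9."""
    return (f - 32.0) * 5.0 / 9.0

for f in (1200, 1800, 2500):
    print(f"{f} F = {f_to_c(f):.0f} C")
# 1200 F = 649 C, 1800 F = 982 C, 2500 F = 1371 C
```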

Carbon fiber composites are already used successfully in jetliner brake systems because of their ability to withstand high temperatures without melting. However, their performance in a nuclear environment is not adequately understood.

Ogale and his team will study the neutron-radiation damage effects on carbon fibers.

His prior research has shown that including carbon nanotubes (large molecules of carbon that are tube-shaped and 30 nanometers in size) in carbon fibers leads to the development of a more uniform texture that improves the properties of the ultra-thin carbon fibers.

In his research, Ogale expects to generate high graphitic crystallinity, a highly ordered crystal structure that is evenly distributed, so that any changes in fiber properties due to radiation can be minimized.

Irradiation experiments will be conducted in collaboration with researchers at Oak Ridge National Labs. South Carolina State University researchers also will participate in the study.

“This research will lead to a fundamental understanding of how the nanotubes set themselves up to provide radiation-damage tolerance to carbon fibers,” said Ogale.

###

Editors: This material is based upon work supported by DoE under grant number DE-FG02-07ER46364. Any opinions, findings and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of DoE.

Tuesday, November 27, 2007

Secondhand smoke damages lungs, MRIs show

Researchers used an MRI scanner to image lung damage due to secondhand smoke. They presented this work at RSNA 2007.

K.S.Parthasarathy

Public release date: 26-Nov-2007


Contact: Rachel Salis-Silverman
Salis@email.chop.edu
267-426-6063
Children's Hospital of Philadelphia
Secondhand smoke damages lungs, MRIs show
[Image caption: The apparent diffusion coefficient (ADC) measures lung injury, indicated by different colors.]

It’s not a smoking gun, but it’s smoking-related, and it’s there in bright medical images: evidence of microscopic structural damage deep in the lungs, caused by secondhand cigarette smoke. For the first time, researchers have identified lung injury to nonsmokers that was long suspected, but not previously detectable with medical imaging tools.

The researchers suggest that their findings may strengthen public health efforts to restrict secondhand smoke.

“We used a special type of magnetic resonance imaging to find these structural changes in the lungs,” said study leader Chengbo Wang, Ph.D., a magnetic resonance physicist in the Department of Radiology at The Children’s Hospital of Philadelphia. “Almost one-third of nonsmokers who had been exposed to secondhand cigarette smoke for a long time developed these structural changes.” Formerly at the University of Virginia, Wang collaborated with radiology researchers at that institution, where they acquired the MRIs from adult smokers and nonsmokers.

Wang presented the team’s findings in Chicago at the annual meeting of the Radiological Society of North America. Although the participants in the research study were adults, Wang said the results have implications for the 35 percent of American children who live in homes where regular smoking occurs.

The researchers studied 60 adults between ages 41 and 79, 45 of whom had never smoked. The 45 non-smokers were divided into groups with low and high exposure to secondhand smoke; the high-exposure subjects had lived with a smoker for at least 10 years, often during childhood. The 15 current or former smokers formed a positive control group.

The research team prepared an isotope of helium, helium-3, and polarized it to make it more visible in the MRI. Researchers diluted the helium in nitrogen and had research subjects inhale the mixture. Unlike ordinary MRIs, this MRI machine measured diffusion, the movement of helium atoms, over 1.5 seconds. In damaged lungs, the helium atoms moved a greater distance than in the lungs of healthy subjects, indicating the presence of holes and expanded spaces within the alveoli, the tiny air sacs within the lungs.
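
The apparent diffusion coefficient in the image caption is obtained by comparing signal acquired with and without diffusion weighting. As a rough illustration only (the study's acquisition details are not given in the release), the standard mono-exponential diffusion-MRI model can be evaluated like this; all numbers below are invented:

```python
import numpy as np

def adc_map(s0, sb, b):
    """Apparent diffusion coefficient (ADC) from a pair of images.

    s0 -- signal without diffusion weighting
    sb -- signal with diffusion weighting of factor b (s/cm^2)
    Mono-exponential model: sb = s0 * exp(-b * ADC), so
    ADC = ln(s0 / sb) / b, in cm^2/s.
    """
    s0 = np.asarray(s0, dtype=float)
    sb = np.asarray(sb, dtype=float)
    return np.log(s0 / sb) / b

# Toy numbers: enlarged airspaces let helium atoms wander farther, so the
# diffusion-weighted signal decays more and the ADC comes out higher.
healthy, damaged = adc_map([100.0, 100.0], [85.0, 60.0], b=1.6)
print(f"healthy ADC ~ {healthy:.2f} cm^2/s, damaged ADC ~ {damaged:.2f} cm^2/s")
```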

The researchers found that almost one-third of the non-smokers with high exposure to secondhand smoke had structural changes in their lungs similar to those found in the smokers. “We interpreted those changes as early signs of lung damage, representing very mild forms of emphysema,” said Wang. Emphysema, a lung disease that is a major cause of death in the U.S., is commonly found in heavy smokers.

The researchers also found a seemingly paradoxical result among two-thirds of the high-exposure group of non-smokers—diffusion measurements that were lower than those found in the low-exposure group. Although these findings require more study, said Wang, they may reflect a narrowing in airways caused by early stages of another lung disease, chronic bronchitis.

“To our knowledge, this is the first imaging study to find lung damage in non-smokers heavily exposed to secondhand smoke,” said Wang. “We hope our work strengthens the efforts of legislators and policymakers to limit public exposure to secondhand smoke.”

###

The study received financial support from the National Heart, Lung and Blood Institute, the Flight Attendant Medical Research Institute, the Commonwealth of Virginia Technology Research Fund, and Siemens Medical Solutions.

Wang’s co-authors were Talissa A. Altes, M.D., and Kai Ruppert, Ph.D., now of the Children’s Hospital Radiology Department; and G. Wilson Miller, Ph.D., Eduard E. deLange, M.D., Jaime F. Mata, Ph.D., Gordon D. Cates, Jr., Ph.D., and John P. Mugler III, Ph.D., all of the University of Virginia Department of Radiology. Drs. Wang, Altes, and Ruppert were previously at the University of Virginia as well.

About The Children's Hospital of Philadelphia: The Children's Hospital of Philadelphia was founded in 1855 as the nation's first pediatric hospital. Through its long-standing commitment to providing exceptional patient care, training new generations of pediatric healthcare professionals and pioneering major research initiatives, Children's Hospital has fostered many discoveries that have benefited children worldwide. Its pediatric research program is among the largest in the country, ranking third in National Institutes of Health funding. In addition, its unique family-centered care and public service programs have brought the 430-bed hospital recognition as a leading advocate for children and adolescents. For more information, visit http://www.chop.edu.


Wednesday, November 21, 2007

Post-treatment PET scans can reassure cervical cancer patients

The PET scanner is a very powerful tool that can pinpoint persistent or recurrent cancer, if any, in patients who have undergone radiation treatment.

K.S.Parthasarathy




Public release date: 20-Nov-2007


Contact: Gwen Ericson
ericsong@wustl.edu
314-286-0141
Washington University in St. Louis
Post-treatment PET scans can reassure cervical cancer patients

St. Louis, Nov. 20, 2007 — Whole-body PET (positron emission tomography) scans done three months after completion of cervical cancer therapy can ensure that patients are disease-free or warn that further interventions are needed, according to a study at Washington University School of Medicine in St. Louis.

"This is the first time we can say that we have a reliable test to follow cervical cancer patients after therapy," says Julie K. Schwarz, M.D., Ph.D., a Barnes-Jewish Hospital resident in the Department of Radiation Oncology. "We ask them to come back for a follow-up visit about three months after treatment is finished, and we perform a PET scan. If the scan shows a complete response to treatment, we can say with confidence that they are going to do extremely well. That's really powerful."

Schwarz and colleagues published their study in the Nov. 21, 2007 issue of the Journal of the American Medical Association (JAMA).

Without a test like PET, it can be difficult to tell whether treatment has eliminated cervical tumors, Schwarz says. That's because small tumors are hard to detect with pelvic exams, and overt symptoms, such as leg swelling, don't occur until tumors grow quite large. Furthermore, CT and MRI scans often don't differentiate tumor tissue from surrounding tissues, Pap tests can be inaccurate because of tissue changes induced by radiation therapy, and no blood test exists to detect the presence of cervical cancer.

Cancerous tumors glow brightly in the PET scans used in the study, called FDG-PET scans, which detect emissions from radioactively tagged blood sugar, or glucose. Tumor tissue traps more of the glucose than does normal tissue, making tumors readily discernable.

Not only can post-treatment PET scans reassure those patients whose tumors respond well to therapy, they can also identify those patients whose tumors have not responded so that their physicians can explore other treatment options before the cancer advances further. These options can include surgery to remove tissue, standard chemotherapy or experimental therapies available through clinical trials.

"Follow-up PET scans can also be very useful tools for physicians conducting clinical trials of new therapies," Schwarz says. "Our study has shown that the scans are predictive of long-term survival. Using PET scans, clinical researchers can get an early readout of how effective experimental treatments might be."

Schwarz and colleagues also have a project to compare follow-up PET results with tumor biology to find out why some tumors don't respond well to therapy. In a study that won her a Resident Clinical Basic Science Research Award from the American Society for Therapeutic Radiation and Oncology, a global organization of medical professionals, Schwarz found differences in gene activity between tumors from patients that responded well and those that had persistent disease. Ongoing research will look for the significance of these differences.

The study's senior author, Perry Grigsby, M.D., professor of radiation oncology, of nuclear medicine and of obstetrics and gynecology and a radiation oncologist with the Siteman Cancer Center at Washington University School of Medicine and Barnes-Jewish Hospital, has overseen a patient database that now has PET images and tumor samples from hundreds of cervical cancer patients.

"We have a tremendous database of PET images collected from patients in the department since 1998," Schwarz says. "We want to combine these results with analyses of tumor biopsies so that we can more effectively choose additional therapies for patients who haven't responded to the initial treatment."

###

Schwarz JK, Siegel BA, Dehdashti F, Grigsby PW. Association of posttherapy positron emission tomography with tumor response and survival in cervical carcinoma. Journal of the American Medical Association, November 21, 2007.

Funding from the Department of Radiology and the Department of Radiation Oncology at Washington University School of Medicine in St. Louis supported this research.

Washington University School of Medicine's 2,100 employed and volunteer faculty physicians also are the medical staff of Barnes-Jewish and St. Louis Children's hospitals. The School of Medicine is one of the leading medical research, teaching and patient care institutions in the nation, currently ranked fourth in the nation by U.S. News & World Report. Through its affiliations with Barnes-Jewish and St. Louis Children's hospitals, the School of Medicine is linked to BJC HealthCare.

Siteman Cancer Center is the only federally-designated Comprehensive Cancer Center within a 240-mile radius of St. Louis. Siteman Cancer Center is composed of the combined cancer research and treatment programs of Barnes-Jewish Hospital and Washington University School of Medicine. Siteman has satellite locations in West County and St. Peters, in addition to its full-service facility at Washington University Medical Center on South Kingshighway.

Saturday, November 17, 2007

Remote-control nanoparticles deliver drugs directly into tumors



Public release date: 16-Nov-2007

Contact: Elizabeth Thomson
thomson@mit.edu
617-258-5402
Massachusetts Institute of Technology
MIT: Remote-control nanoparticles deliver drugs directly into tumors

CAMBRIDGE, MA--MIT scientists have devised remotely controlled nanoparticles that, when pulsed with an electromagnetic field, release drugs to attack tumors. The innovation, reported in the Nov. 15 online issue of Advanced Materials, could lead to the improved diagnosis and targeted treatment of cancer.

In earlier work the team, led by Sangeeta Bhatia, M.D.,Ph.D., an associate professor in the Harvard-MIT Division of Health Sciences & Technology (HST) and in MIT's Department of Electrical Engineering and Computer Science, developed injectable multi-functional nanoparticles designed to flow through the bloodstream, home to tumors and clump together. Clumped particles help clinicians visualize tumors through magnetic resonance imaging (MRI).

With the ability to see the clumped particles, Bhatia’s co-author in the current work, Geoff von Maltzahn, asked the next question: “Can we talk back to them?”

The answer is yes, the team found. The system that makes it possible consists of tiny particles (billionths of a meter in size) that are superparamagnetic, a property that causes them to give off heat when they are exposed to a magnetic field. Tethered to these particles are active molecules, such as therapeutic drugs.

Exposing the particles to a low-frequency electromagnetic field causes the particles to radiate heat that, in turn, melts the tethers and releases the drugs. The waves in this magnetic field have frequencies between 350 and 400 kilohertz—the same range as radio waves. These waves pass harmlessly through the body and heat only the nanoparticles. For comparison, microwaves, which will cook tissue, have frequencies measured in gigahertz, about a million times higher.

The tethers in the system consist of strands of DNA, “a classical heat-sensitive material,” said von Maltzahn, a graduate student in HST. Two strands of DNA link together through hydrogen bonds that break when heated. In the presence of the magnetic field, heat generated by the nanoparticles breaks these bonds, leaving one strand attached to the particle and allowing the other to float away with its cargo.

One advantage of a DNA tether is that its melting point is tunable. Longer strands and differently coded strands require different amounts of heat to break. This heat-sensitive tuneability makes it possible for a single particle to simultaneously carry many different types of cargo, each of which can be released at different times or in various combinations by applying different frequencies or durations of electromagnetic pulses.
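
This tunability can be illustrated with the Wallace rule, a textbook back-of-the-envelope estimate of the melting temperature of short DNA duplexes. This is a generic sketch, not the calculation the MIT team used:

```python
def wallace_tm(sequence):
    """Rough melting temperature (deg C) of a short DNA duplex.

    Wallace rule (for oligos up to ~14 bases): each A-T pair adds about
    2 degrees and each G-C pair about 4 degrees, since G-C pairs form
    three hydrogen bonds versus two for A-T.
    """
    seq = sequence.upper()
    at = seq.count("A") + seq.count("T")
    gc = seq.count("G") + seq.count("C")
    return 2 * at + 4 * gc

# Longer or more GC-rich tethers hold their cargo until hotter:
print(wallace_tm("ATGCATGC"))      # 24 -> releases at mild heating
print(wallace_tm("GCGCGCGCGCGC"))  # 48 -> needs considerably more heat
```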

To test the particles, the researchers implanted mice with a tumor-like gel saturated with nanoparticles. They placed the implanted mouse into the well of a cup-shaped electrical coil and activated the magnetic pulse. The results confirm that without the pulse, the tethers remain unbroken. With the pulse, the tethers break and release the drugs into the surrounding tissue.

The experiment is a proof of principle demonstrating a safe and effective means of tunable remote activation. However, work remains to be done before such therapies become viable in the clinic.

To heat the region, for example, a critical mass of injected particles must clump together inside the tumor. The team is still working to make intravenously injected particles clump effectively enough to achieve this critical mass.

“Our overall goal is to create multifunctional nanoparticles that home to a tumor, accumulate, and provide customizable remotely activated drug delivery right at the site of the disease,” said Bhatia.

###

Co-authors on the paper are Austin M. Derfus, a graduate student at the University of California at San Diego; Todd Harris, an HST graduate student; Erkki Ruoslahti and Tasmia Duza of The Burnham Institute in La Jolla, CA; and Kenneth S. Vecchio of the University of California at San Diego.

The research was supported by grants from the David and Lucile Packard Foundation and the National Cancer Institute of the National Institutes of Health. Derfus was supported by a G.R.E.A.T fellowship from the University of California Biotechnology Research and Educational Program.

Written by Elizabeth Dougherty, Harvard-MIT Division of Health Sciences and Technology

Tuesday, November 13, 2007

Scientists discover record-breaking hydrogen storage materials for use in fuel cells

UVa researchers found materials that absorb hydrogen up to 14 percent by weight at room temperature. By absorbing twice as much hydrogen, the new materials could help make the dream of a hydrogen economy come true.

K.S.Parthasarathy





Public release date: 12-Nov-2007

Contact: Bellave Shivaram
bss2d@virginia.edu
434-806-9582
University of Virginia
Scientists discover record-breaking hydrogen storage materials for use in fuel cells

Scientists at the University of Virginia have discovered a new class of hydrogen storage materials that could make the storage and transportation of energy much more efficient — and affordable — through higher-performing hydrogen fuel cells.

Bellave S. Shivaram and Adam B. Phillips, the U.Va. physicists who invented the new materials, will present their finding at 8 p.m., Monday, Nov. 12, at the International Symposium on Materials Issues in a Hydrogen Economy at the Omni Hotel in Richmond, Va.

“In terms of hydrogen absorption, these materials could prove a world record,” Phillips said. “Most materials today absorb only 7 to 8 percent of hydrogen by weight, and only at cryogenic [extremely low] temperatures. Our materials absorb hydrogen up to 14 percent by weight at room temperature. By absorbing twice as much hydrogen, the new materials could help make the dream of a hydrogen economy come true.”
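
For a sense of scale, gravimetric capacity is conventionally the mass of stored hydrogen divided by the total mass of the loaded material. A back-of-the-envelope sketch; the 50 kg bed size and the 120 MJ/kg heating value of hydrogen are illustrative assumptions, not figures from the announcement:

```python
def hydrogen_stored_kg(host_mass_kg, wt_percent):
    """Hydrogen mass held at a given gravimetric capacity,
    where wt% = m_H2 / (m_H2 + m_host) * 100."""
    f = wt_percent / 100.0
    return host_mass_kg * f / (1.0 - f)

# Compare a typical cryogenic absorber (~7 wt%) with the UVa claim (14 wt%)
# for a 50 kg bed of storage material:
for wt in (7.0, 14.0):
    m_h2 = hydrogen_stored_kg(50.0, wt)
    energy_mj = m_h2 * 120.0  # lower heating value of hydrogen, ~120 MJ/kg
    print(f"{wt:4.1f} wt% -> {m_h2:4.1f} kg H2, ~{energy_mj:4.0f} MJ")
```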

In the quest for alternative fuels, U.Va.’s new materials potentially could provide a highly affordable solution to energy storage and transportation problems with a wide variety of applications. They absorb a much higher percentage of hydrogen than predecessor materials while exhibiting faster kinetics at room temperature and much lower pressures, and are inexpensive and simple to produce.

“These materials are the next generation in hydrogen fuel storage materials, unlike any others we have seen before,” Shivaram said. “They have passed every litmus test that we have performed, and we believe they have the potential to have a large impact.”

The inventors believe the novel materials will translate to the marketplace and are working with the U.Va. Patent Foundation to patent their discovery.

“The U.Va. Patent Foundation is very excited to be working with a material that one day may be used by millions in everyday life,” said Chris Harris, senior licensing manager for the U.Va. Patent Foundation. “Dr. Phillips and Dr. Shivaram have made an incredible breakthrough in the area of hydrogen absorption.”

###

Phillips’s and Shivaram’s research was supported by the National Science Foundation and the U.S. Department of Energy.

Thursday, November 1, 2007

World's smallest radio uses single nanotube to pick up good vibrations

Public release date: 31-Oct-2007

Contact: Robert Sanders
rsanders@berkeley.edu
510-643-6998
University of California - Berkeley
World's smallest radio uses single nanotube to pick up good vibrations

Berkeley -- Physicists at the University of California, Berkeley, have built the smallest radio yet - a single carbon nanotube one ten-thousandth the diameter of a human hair that requires only a battery and earphones to tune in to your favorite station.

The scientists successfully received their first FM broadcast last year - Derek & The Dominos' "Layla" and the Beach Boys' "Good Vibrations" transmitted from across the room. In homage to last year's 100th anniversary of the first voice and music radio transmission, they also transmitted and successfully tuned in to the first music piece broadcast in 1906, "Largo" from George Frideric Handel's opera "Xerxes."

"We were just in ecstasy when this worked," said team leader Alex Zettl, UC Berkeley professor of physics. "It was fantastic."

The nanoradio, which is currently configured as a receiver but could also work as a transmitter, is 100 billion times smaller than the first commercial radios, and could be used in any number of applications - from cell phones to microscopic devices that sense the environment and relay information via radio signals, Zettl said. Because it is extremely energy efficient, it would integrate well with microelectronic circuits.

"The nanotube radio may lead to radical new applications, such as radio-controlled devices small enough to exist in a human's bloodstream," the authors wrote in a paper published online today by the journal Nano Letters. The paper will appear in the print edition of Nano Letters later this month.

Authors of the nanoradio paper are Zettl, graduate student Kenneth Jensen, and their colleagues in UC Berkeley's Center of Integrated Nanomechanical Systems (COINS) and in the Materials Sciences Division at Lawrence Berkeley National Laboratory (LBNL). COINS is a Nanoscale Science and Engineering Research Center supported by the National Science Foundation (NSF).

Nanotubes are rolled-up sheets of interlocked carbon atoms that form a tube so strong that some scientists have suggested using a nanotube wire to tether satellites in a fixed position above Earth. The nanotubes also exhibit unusual electronic properties because of their size; those used in the radio receiver are about 10 nanometers in diameter and several hundred nanometers long. A nanometer is one billionth of a meter; a human hair is about 50,000-100,000 nanometers in diameter.

In the nanoradio, a single carbon nanotube works as an all-in-one antenna, tuner, amplifier and demodulator for both AM and FM. These are separate components in a standard radio. A demodulator removes the AM or FM carrier frequency, which is in the kiloHertz and megaHertz range, respectively, to retrieve the lower frequency broadcast information.

The nanoradio detects radio signals in a radically new way - it vibrates thousands to millions of times per second in tune with the radio wave. This makes it a true nanoelectromechanical device, dubbed NEMS, that integrates the mechanical and electrical properties of nanoscale materials.

In a normal radio, ambient radio waves from different transmitting stations generate small currents at different frequencies in the antenna, while a tuner selects one of these frequencies to amplify. In the nanoradio, the nanotube, as the antenna, detects radio waves mechanically by vibrating at radio frequencies. The nanotube is placed in a vacuum and hooked to a battery, which covers its tip with negatively charged electrons, and the electric field of the radio wave pushes and pulls the tip thousands to millions of times per second.

While large objects, like a stiff wire or a wooden ruler pinned at one end, vibrate at low frequencies - between tens and hundreds of times per second - the tiny nanotubes vibrate at high frequencies ranging from kiloHertz (thousands of times per second) to hundreds of megaHertz (100 million times per second). Thus, a single nanotube naturally selects only one frequency.
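
The frequency selection follows textbook beam mechanics: a cantilever's fundamental resonance scales with its cross-section and the inverse square of its length. A rough Euler-Bernoulli sketch with ballpark nanotube properties, treating the tube as a solid rod for simplicity; none of these numbers come from the paper:

```python
import math

def cantilever_f1(length_m, diameter_m, youngs_pa, density_kg_m3):
    """Fundamental flexural resonance of a solid cylindrical cantilever.

    Euler-Bernoulli theory: f1 = (beta1^2 / (2*pi)) * sqrt(E*I / (rho*A)) / L^2,
    with beta1*L = 1.875 for the first cantilever mode and sqrt(I/A) = d/4
    for a solid rod.
    """
    beta1_sq = 1.875 ** 2
    gyration = diameter_m / 4.0
    sound_speed = math.sqrt(youngs_pa / density_kg_m3)
    return (beta1_sq / (2.0 * math.pi)) * gyration * sound_speed / length_m ** 2

# Ballpark values: 10 nm diameter, 300 nm long, E ~ 1 TPa, rho ~ 2000 kg/m^3
f1 = cantilever_f1(300e-9, 10e-9, 1e12, 2000.0)
print(f"~{f1 / 1e6:.0f} MHz")  # lands in the hundreds of megahertz, as quoted
```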

Although it might seem that the vibrating nanotube yields a "one station" radio, the tension on the nanotube also influences its natural vibration frequency, just as the tension on a guitar string fine tunes its pitch. As a result, the physicists can tune in a desired frequency or station by "pulling" on the free tip of the nanotube with a positively charged electrode. This electrode also turns the nanotube into an amplifier. The voltage is high enough to pull electrons off the tip of the nanotube and, because the nanotube is simultaneously vibrating, the electron current from the tip is an amplified version of the incoming radio signal. This is similar to the field-emission amplification of old vacuum tube amplifiers used in early radios and televisions, Zettl said. The amplified output of this simple nanotube device is enough to drive a very sensitive earphone.

Finally, the field-emission and vibration together also demodulate the signal.

"I hate to sound like I'm selling a Ginsu knife - But wait, there's more! It also slices and dices! - but this one nanotube does everything; it performs all radio functions simultaneously and extremely efficiently," Zettl said. "It's ridiculously simple - that's the beauty of it."

Zettl's team assembles the nanoradios very simply, too. From nanotubes copiously produced in a carbon arc, they glue several to a fixed electrode. In a vacuum, they bring the electrode within a few microns of a second electrode, close enough for electrons to jump to it from the closest nanotube and create an electrical circuit. To achieve the desired length of the active nanotube, the team first runs a large current through the nanotube to the second electrode, which makes carbon atoms jump off the tip of the nanotube, trimming it down to size for operation within a particular frequency band. Connect a battery and earphones, and voila!

Reception by the initial radios is scratchy, which Zettl attributes in part to insufficient vacuum. In future nanoradios, a better vacuum can be obtained by ensuring a cleaner environment, or perhaps by encasing the single nanotube inside a second, larger non-conducting nanotube, thereby retaining the nanoscale.

Zettl won't only be tuning in to oldies stations with his nanoradio. Because the radio static is actually the sound of atoms jumping on and off the tip of the nanotube, he hopes to use the nanoradio to sense the identity of atoms or even measure their masses, which is done today by cumbersome large mass spectrometers.

###

Coauthors with Jensen and Zettl are UC Berkeley post-doctoral fellow Jeff Weldon and physics graduate student Henry Garcia. The work was supported by NSF and the U.S. Department of Energy.

Wednesday, October 24, 2007

Nuclear power worldwide: status and outlook

Nuclear power capacity is expected to rise from the present 370 GW(e) to between 447 GW(e) (low projection) and 679 GW(e) (high projection) by 2030. Nuclear power's share of worldwide electricity production rose from less than 1 percent in 1960 to 16 percent in 1986, and that percentage has held essentially constant in the 21 years since 1986. The IAEA report released on October 23, 2007 reviews the status of nuclear power worldwide.

K.S.Parthasarathy


Public release date: 23-Oct-2007

Contact: Press Office
press@iaea.org
0043-126-002-1273
International Atomic Energy Agency
Nuclear power worldwide: status and outlook
A report from the IAEA

The IAEA makes two annual projections concerning the growth of nuclear power, a low and a high. The low projection assumes that all nuclear capacity that is currently under construction or firmly in the development pipeline gets completed and attached to the grid, but no other capacity is added. In this low projection, there would be growth in capacity from 370 GW(e) at the end of 2006 to 447 GW(e) in 2030. (A gigawatt = 1000 megawatts = 1 billion watts)

In the IAEA's high projection -- which adds in additional reasonable and promising projects and plans -- global nuclear capacity is estimated to rise to 679 GW(e) in 2030. That would be an average growth rate of about 2.5%/yr.
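
That growth rate is simply the compound annual rate implied by the two capacity figures, which a few lines confirm:

```python
# Compound annual growth rate from 370 GW(e) at end-2006 to 679 GW(e) in 2030:
start, end, years = 370.0, 679.0, 24
cagr = (end / start) ** (1.0 / years) - 1.0
print(f"{cagr:.1%}")  # ~2.6%/yr, matching the IAEA's "about 2.5%/yr"
```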

"Our job is not so much to predict the future but to prepare for it, " explains the IAEA's Alan McDonald, Nuclear Energy Analyst. "To that end we update each year a high and low projection to establish the range of uncertainty we ought to be prepared for."

Nuclear power's share of worldwide electricity production rose from less than 1 percent in 1960 to 16 percent in 1986, and that percentage has held essentially constant in the 21 years since 1986. Nuclear electricity generation has grown steadily at the same pace as overall global electricity generation. At the close of 2006, nuclear provided about 15 percent of total electricity worldwide.

The IAEA's other key findings as of the end of 2006 are elaborated below.

There were 435 operating nuclear reactors around the world, and 29 more were under construction. The US had the most with 103 operating units. France was next with 59. Japan followed with 55, plus one more under construction, and Russia had 31 operating, and seven more under construction.

Of the 30 countries with nuclear power, the percentage of electricity supplied by nuclear ranged widely: from a high of 78 percent in France; to 54 percent in Belgium; 39 percent in Republic of Korea; 37 percent in Switzerland; 30 percent in Japan; 19 percent in the USA; 16 percent in Russia; 4 percent in South Africa; and 2 percent in China.

Present nuclear power plant expansion is centred in Asia: 15 of the 29 units under construction at the end of 2006 were in Asia. And 26 of the last 36 reactors to have been connected to the grid were in Asia. India currently gets less than 3% of its electricity from nuclear, but at the end of 2006 it had one-quarter of the nuclear construction - 7 of the world's 29 reactors that were under construction. India's plans are even more impressive: an 8-fold increase by 2022 to 10 percent of the electricity supply and a 75-fold increase by 2052 to reach 26 percent of the electricity supply. A 75-fold increase works out to an average of 9.4 percent/yr, about the same as average global nuclear growth from 1970 through 2004. So it's hardly unprecedented.

China is experiencing huge energy growth and is trying to expand every source it can, including nuclear power. It has four reactors under construction and plans a nearly five-fold expansion by just 2020. Because China is growing so fast this would still amount to only 4 percent of total electricity.

Russia had 31 operating reactors, five under construction and significant expansion plans. There's a lot of discussion in Russia of becoming a full fuel-service provider, including services like leasing fuel, reprocessing spent fuel for countries that are interested, and even leasing reactors.

Japan had 55 reactors in operation, one under construction, and plans to increase nuclear power's share of electricity from 30 percent in 2006 to more than 40 percent within the next decade.

South Korea connected its 20th reactor just last year, has another under construction and has broken ground to start building two more. Nuclear power already supplies 39 percent of its electricity.

Europe is a good example of "one size does not fit all." Altogether it had 166 reactors in operation and six under construction. But there are several nuclear prohibition countries like Austria, Italy, Denmark and Ireland. And there are nuclear phase-out countries like Germany and Belgium.

There are also nuclear expansion programmes in Finland, France, Bulgaria and Ukraine. Finland started construction in 2005 on Olkiluoto-3, which is the first new Western European construction since 1991. France plans to start its next plant in 2007.

Several countries with nuclear power are still pondering future plans. The UK, with 19 operating plants, many of which are relatively old, had been the most uncertain until recently. Although a final policy decision on nuclear power will await the results of a public consultation now underway, a White Paper on energy published in May 2007 [1] concluded that "…having reviewed the evidence and information available we believe that the advantages [of new nuclear power] outweigh the disadvantages and that the disadvantages can be effectively managed. On this basis, the Government's preliminary view is that it is in the public's interest to give the private sector the option of investing in new nuclear power stations."

###

[1] http://www.dti.gov.uk/energy/whitepaper/page39534.html

The US had 103 reactors providing 19 percent of the country's electricity. For the last few decades the main developments have been improved capacity factors, power increases at existing plants and license renewals. Currently 48 reactors have already received 20-year renewals, so their licensed lifetimes are 60 years. Altogether three-quarters of the US reactors either already have license renewals, have applied for them, or have stated their intention to apply. There have been a lot of announced intentions (about 30 new reactors' worth) and the Nuclear Regulatory Commission is now reviewing four Early Site Permit applications.

For further information, please contact: IAEA Division of Public Information, Media & Outreach Section, 43-1-2600-21273.

For further details on the current status of the nuclear industry, go to the IAEA's "Power Reactor Information System" (PRIS).

Video B-roll is available on request.

Audio Q & A with IAEA Nuclear Energy Analyst, Alan McDonald and UN language editions of this press release are available under the following link http://www.iaea.org/NewsCenter/PressReleases/2007/prn200719.html

Quantitative PET imaging finds early determination of effectiveness of cancer treatment

PET imaging can demonstrate the effectiveness of cancer treatment. This imaging modality reveals the reduction in metabolism of cells killed by chemotherapeutic agents.

K.S.Parthasarathy



Public release date: 23-Oct-2007

Contact: Maryann Verrillo
mverrillo@snm.org
703-652-6773
Society of Nuclear Medicine
Quantitative PET imaging finds early determination of effectiveness of cancer treatment
Visual analysis of PET scans for non-Hodgkin lymphoma may be improved by using the standardized uptake value in monitoring response to treatment, say researchers in the October Journal of Nuclear Medicine

RESTON, Va.—With positron emission tomography (PET) imaging, seeing is believing: Evaluating a patient’s response to chemotherapy for non-Hodgkin lymphoma (NHL) typically involves visual interpretation of scans of cancer tumors. Researchers have found that measuring a quantitative index—one that reflects the reduction of metabolic activity after chemotherapy first begins—adds accurate information about patients’ responses to first-line chemotherapy, according to a study in the October issue of the Journal of Nuclear Medicine.

“In our study, we demonstrated that a quantitative assessment of therapeutic response for patients with diffuse large B-cell lymphoma (DLBCL) is more accurate than visual analysis alone when using the radiotracer FDG (fluorodeoxyglucose) with PET scans,” said Michel Meignan, professor of nuclear medicine at Henri Mondor Hospital in Creteil, France. “The ability to predict tumor response early in the course of treatment is very valuable clinically, allowing intensification of treatment in those patients who are unlikely to respond to first-line chemotherapy,” he added. “Similarly, treatment could possibly be shortened in those patients who show a favorable response after one or two cycles of chemotherapy, and quantification also may help identify the disease’s transformation from low-grade to aggressive stage,” he explained. “However, visual interpretation of PET scans will always be the first step of analysis and will prevail in cases where images are difficult to quantify,” added Meignan.

Diffuse large B-cell lymphoma is a fast-growing, aggressive form of non-Hodgkin lymphoma, a cancer of the body’s lymphatic system. Although there are more than 20 types of NHL, DLBCL is the most common type, making up about 30 percent of all lymphomas. In the United States, about 63,190 people are expected to be diagnosed with non-Hodgkin lymphoma in 2007, according to recent statistics.

Ninety-two patients with DLBCL were studied before and after two cycles of chemotherapy, and tumor response was assessed visually and by various quantitative parameters, explained the co-author of “Early 18F-FDG PET for Prediction of Prognosis in Patients With Diffuse Large B-Cell Lymphoma: SUV-Based Assessment Versus Visual Analysis.” Meignan said, “We found that quantification of tumor FDG uptake (the ratio of tissue radioactivity concentration) can markedly improve the accuracy of FDG PET for prediction of patient outcome.” Additional studies need to be done, said Meignan, reiterating that the future monitoring of cancer tumor response will probably include a combination of quantitative analysis and visual assessment.
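
The quantitative index referred to in the subhead is the standardized uptake value (SUV): tissue activity normalized to the injected dose per unit of body weight. A minimal sketch of the standard definition, with invented numbers rather than the study's data:

```python
def suv(tissue_kbq_per_ml, injected_dose_mbq, body_weight_kg):
    """Standardized uptake value: tissue activity concentration divided by
    injected dose per gram of body weight (dimensionless, taking 1 g ~ 1 mL).
    """
    dose_kbq = injected_dose_mbq * 1000.0
    return tissue_kbq_per_ml / (dose_kbq / (body_weight_kg * 1000.0))

baseline = suv(25.0, 370.0, 70.0)  # hypothetical uptake before chemotherapy
followup = suv(8.0, 370.0, 70.0)   # hypothetical uptake after two cycles
drop = 100.0 * (1.0 - followup / baseline)
print(f"SUV {baseline:.1f} -> {followup:.1f}: ~{drop:.0f}% metabolic reduction")
```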

PET is a powerful molecular imaging procedure that uses very small amounts of radioactive materials that are targeted to specific organs, bones or tissues. When PET is used to image cancer, a radiopharmaceutical (such as FDG, which includes both a sugar and a radionuclide) is injected into a patient. Cancer cells metabolize sugar at higher rates than normal cells, and the radiopharmaceutical is drawn in higher amounts to cancerous areas. PET scans show where FDG is by tracking the gamma rays given off by the radionuclide tagging the drug and producing three-dimensional images of their distribution within the body. PET scanning provides information about the body’s chemistry, metabolic activity and body function.

###

“Early 18F-FDG PET for Prediction of Prognosis in Patients With Diffuse Large B-Cell Lymphoma: SUV-Based Assessment Versus Visual Analysis” appears in the October issue of the Journal of Nuclear Medicine, which is published by SNM, the world’s largest molecular imaging and nuclear medicine society. Additional co-authors include Chieh Lin, Alain Luciani and Alain Rahmouni, Department of Radiology; Emmanuel Itti and Gaetano Paone, Department of Nuclear Medicine; and Corinne Haioun and Jehan Dupuis, Department of Hematology, all at Henri Mondor Hospital in Créteil, France; and Yolande Petegnief and Jean-Noël Talbot, Department of Nuclear Medicine, Tenon Hospital in Paris, France.

Credentialed press: To obtain a copy of this article—and online access to the Journal of Nuclear Medicine— please contact Maryann Verrillo by phone at (703) 652-6773 or send an e-mail to mverrillo@snm.org. Current and past issues of the Journal of Nuclear Medicine can be found online at http://jnm.snmjournals.org. Print copies can be obtained by contacting the SNM Service Center, 1850 Samuel Morse Drive, Reston, VA 20190-5316; phone (800) 513-6853; e-mail servicecenter@snm.org; fax (703) 708-9015. A subscription to the journal is an SNM member benefit.

About SNM—Advancing Molecular Imaging and Therapy

SNM is an international scientific and professional organization of more than 16,000 members dedicated to promoting the science, technology and practical applications of molecular and nuclear imaging to diagnose, manage and treat diseases in women, men and children. Founded more than 50 years ago, SNM continues to provide essential resources for health care practitioners and patients; publish the most prominent peer-reviewed journal in the field (Journal of Nuclear Medicine); host the premier annual meeting for medical imaging; sponsor research grants, fellowships and awards; and train physicians, technologists, scientists, physicists, chemists and radiopharmacists in state-of-the-art imaging procedures and advances. SNM members have introduced—and continue to explore—biological and technological innovations in medicine that noninvasively investigate the molecular basis of diseases, benefiting countless generations of patients. SNM is based in Reston, Va.; additional information can be found online at http://www.snm.org.

Monday, October 22, 2007

UT rheumatologists discover 2 genes related to disabling form of arthritis


[Image: John D. Reveille, M.D., University of Texas Medical School at Houston. Genetics of arthritis.]



Public release date: 21-Oct-2007

Contact: Meredith Raine
Meredith.Raine@uth.tmc.edu
713-500-3030
University of Texas Health Science Center at Houston
UT rheumatologists discover 2 genes related to disabling form of arthritis

HOUSTON – (Oct. 22, 2007)—Work done in part by researchers at The University of Texas Medical School at Houston has led to the discovery of two genes that cause ankylosing spondylitis, an inflammatory and potentially disabling disease. The findings are published in the Oct. 21 online edition of Nature Genetics, a journal that emphasizes research on the genetic basis for common and complex diseases.

John D. Reveille, M.D., professor and director of the Division of Rheumatology and Clinical Immunogenetics, in conjunction with Matthew A. Brown, M.D., professor of immunogenetics at Australia’s University of Queensland, led research done by the Triple “A” Spondylitis Consortium Genetic Study (i.e. the TASC or Australo-Anglo-American Spondylitis Consortium).

The international team of researchers worked with investigators from the British Wellcome Trust Case Control Consortium, and together they made the genetic discovery.

Reveille, chief of rheumatology at Memorial Hermann – Texas Medical Center, said the discovery of genes ARTS1 and IL23R brings the scientific community two steps closer to fully understanding ankylosing spondylitis or AS, a chronic form of arthritis that attacks the spine and also can target other joints and organs in the body.

“We’ve long known that the HLA-B27 gene accounts for 40 percent of the overall cause of AS,” said Reveille, the principal investigator of TASC. “Now we have found two new genes. Together with HLA-B27, these genes account for roughly 70 percent of the overall cause. That means we’ve almost nailed this disease. Within the next year, I predict we will have identified all the genes that play a role in this insidious disease. There is more exciting news to come.”

The recent discovery is based on work from the largest and most comprehensive genome-wide association scan conducted to date. In this part of the research project, investigators were searching for genetic information related to AS, as well as autoimmune thyroid disease/Graves’ Disease, breast cancer and multiple sclerosis. Reveille, the George S. Bruce, Jr. Professor in Arthritis and Other Rheumatic Diseases, said the most significant findings were in AS, a disease that generally strikes patients in their teens, 20s or 30s.

ARTS1 and IL23R show a new pathway of causation, Reveille said, and this could lead to new therapies for the arthritic condition, which can cause a complete fusion of the spine, leaving patients unable to straighten and bend.

The identification of the two new genes also could help physicians identify patients who are at the highest risk for developing AS.

“For example, if you have a family member with AS, a simple blood test would be able to tell us if you are also at risk,” Reveille said. “We could offer screenings for people with back pain. In the past, the HLA-B27 test was all we had. Now we potentially have more tests.”

Steve Haskew, who has lived with AS for thirty years, said the genetic discovery offers hope to patients – especially those who are newly diagnosed.

“When I first started experiencing problems – lower back pain, the aching joints – no one could tell me what was wrong,” said Haskew, 59, co-leader of an AS support group that meets every other month at the UT Medical School at Houston. “It took 10 years before a rheumatologist diagnosed me with AS. Back then, there weren’t many options. I was told to take anti-inflammatories and stay as active as possible. It’s fascinating to see how far we’ve come and how much has been learned about the disease since then.”

The research done by Reveille and his colleague Xiaodong Zhou, M.D., associate professor of medicine in the Division of Rheumatology and Clinical Immunogenetics, was supported in part by the Center for Clinical and Translational Sciences (CCTS) at The University of Texas Health Science Center at Houston.

“This is a success story for genetics work, and I think it will lead the way for other work to be done,” Reveille said.

The Spondylitis Association of America (SAA) oversaw the nationwide recruitment of patients and families for the study.

“This is the most significant breakthrough in AS genetic research since HLA-B27 was uncovered 34 years ago, and SAA played a significant role in making the study possible,” said SAA Associate Executive Director Laurie Savage, who is co-principal investigator for TASC’s administrative core.

###



Friday, September 28, 2007

Cockroaches are morons in the morning, geniuses in the evening

The fact that scientists can teach cockroaches lessons is itself very interesting.
K.S.Parthasarathy


Public release date: 27-Sep-2007


Contact: David F. Salisbury
david.salisbury@vanderbilt.edu
615-343-6803
Vanderbilt University
Cockroaches are morons in the morning, geniuses in the evening


In its ability to learn, the cockroach is a moron in the morning and a genius in the evening.

Dramatic daily variations in the cockroach’s learning ability were discovered by a new study performed by Vanderbilt University biologists and published online this week in the Proceedings of the National Academy of Sciences.

“This is the first example of an insect whose ability to learn is controlled by its biological clock,” says Terry L. Page, the professor of biological sciences who directed the project. Undergraduate students Susan Decker and Shannon McConnaughey also participated in the study.

The few studies that have been done with mammals suggest their ability to learn also varies with the time of day. For example, a recent experiment with humans found that people’s ability to acquire new information is reduced when their biological clocks are disrupted, particularly at certain times of day. Similarly, several learning and memory studies with rodents have found that these processes are modulated by their circadian clocks. One study in rats associated jet lag with retrograde amnesia.

In the current study, the researchers taught individual cockroaches to associate peppermint – a scent that they normally find slightly distasteful – with sugar water, causing them to favor it over vanilla, a scent they find universally appealing.

The researchers trained individual cockroaches at different times in the 24-hour day/night cycle and then tested them to see how long they remembered the association. They found that the individuals trained during the evening retained the memory for several days. Those trained at night also had good retention. During the morning, however, when the cockroaches are least active, they were totally incapable of forming a new memory, although they could recall memories learned at other times.

“It is very surprising that the deficit in the morning is so profound,” says Page. “An interesting question is why the animal would not want to learn at that particular time of day. We have no idea.”

Most previous studies of circadian rhythm have focused on the visual system. “The advantage of eyes becoming more sensitive at night is so obvious that people haven’t looked much at other sensory systems,” says Page. “The fact that our study involves the olfactory system suggests that the circadian cycle could be influencing a number of senses beyond vision.”

In the study, the researchers used cockroaches of the species Leucophaea maderae. It doesn’t have a common name but it is commonly used in scientific experiments because it was used extensively in early physiological and endocrinological studies.

The discovery that the cockroach’s memory is so strongly modulated by its circadian clock opens up new opportunities to learn more about the molecular basis of the interaction between biological clocks and memory and learning in general.

Much of the new information about the molecular basis of memory and learning has come from the study of other invertebrates (animals without backbones) such as the sea slug (Aplysia) and the fruit fly (Drosophila).

“Studies like this suggest that time of day can have a profound impact, at least in certain situations. By studying the way the biological clock modulates learning and memory we may learn more about how these processes take place and what can influence them,” Page says.

###

[Note: A multimedia version of this story is available on Exploration, Vanderbilt University’s online research magazine, at http://www.vanderbilt.edu/exploration/stories/cockroach.html.]

Monday, September 24, 2007

Nature explains the cell-making process




Published online: 21 September 2007 | doi:10.1038/news070917-11
Why a person doesn't evolve in one lifetime
The body's complicated cell-making process may help to avoid cancer.

Philip Ball


[Image: What if your skin cells evolved every time they grew to replace dead skin? Credit: Steve Gschmeissner / Science Photo Library]
It's not easy making a human. Getting from a fertilized egg to a full-grown adult involves a near-miracle of orchestration, with replicating cells acquiring specialized functions in just the right places at the right times. So you'd think that, having done the job once, our bodies would replace cells when required by the simplest means possible.

Oddly, they don't. Our tissues don't renew themselves by mere copying, with old skin cells dividing into new skin cells and so forth. Instead, they keep repeating the laborious process of starting each cell from scratch. Now scientists think they know why: it could be nature's way of making sure that we don't evolve as we grow older [1].

Evolution is usually thought of as something that happens to whole organisms. But there's no fundamental reason why, for multicelled organisms, it shouldn't happen within a single organism too.

In a colony of single-celled bacteria, researchers can watch evolution in action. As the cells divide, mutants appear; and under stress, there is a selective pressure that favours some mutants over others, spreading advantageous genetic changes through the population.

In principle, precisely the same thing could occur throughout our bodies. Our cells are constantly being replaced in vast numbers: the human body typically contains about a hundred trillion cells, and many billions are shed and replaced every day.

If this happened simply by replication of the various specialized cells in each tissue, our tissues would evolve: mutations would arise, and some would spread. In particular, mutant cells that don't do their specialized job so well tend to replicate more quickly than non-mutants, and so gain a competitive advantage, freeloading off the others. In such a case, our wonderfully wrought bodies could grind to a halt.

Avoiding fate

While working at the Santa Fe Institute in New Mexico, evolutionary biologist John Pepper of the University of Arizona in Tucson and his co-workers came up with a theory for how multicelled organisms avoid this fate. They say it explains why the epithelial tissue cells that line all parts of the body take such an apparently long-winded route to replication, rather than just copying themselves in their mature form.

To renew themselves, epithelial tissues retain a population of undifferentiated stem cells, like the unformed cells present in embryos, that have the ability to grow into different types of cells. When replacements are needed, some of these stem cells divide to make transient amplifying cells (TACs). The TACs then divide several times, and Pepper and his co-workers think that each division produces cells that are a little more developed into mature tissue cells.

All this costs a lot of metabolic energy, so it is not very efficient. But, the researchers say, it means that the functions of self-replication and proliferation are divided between separate groups of cells. The stem cells replicate, but only a little, and so there's not much chance for mutations to arise or for selective pressure to fix them in place. The proliferating TACs may mutate, but they aren't simply copying themselves, so there isn't any direct competition between the cells to create an evolutionary pressure. As a result, evolution can't get started.

Pepper and his colleagues have used computer modelling to show that this proposed mechanism can suppress evolution in a long-lived, multicelled organism.
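
Pepper's actual model is not reproduced in the story, but the intuition can be captured in a toy Moran-style simulation; every parameter below is invented for illustration. When differentiated cells copy themselves, a faster-replicating mutant can sweep the tissue; when each replacement descends from a stem cell via a few terminal TAC divisions, mutants appear but never get to compete:

```python
import random

N = 200        # differentiated cells in the simulated tissue
STEPS = 20000  # cell-replacement events
MU = 1e-3      # mutation probability per division
S = 0.5        # selective advantage of "freeloader" mutants

def self_renewing():
    """Scenario A: differentiated cells copy themselves (a Moran process).
    Mutants win disproportionately many births, so they can sweep."""
    cells = [0] * N  # 0 = normal, 1 = mutant
    for _ in range(STEPS):
        dying = random.randrange(N)
        weights = [1.0 + S * c for c in cells]
        parent = random.choices(range(N), weights=weights)[0]
        child = cells[parent]
        if random.random() < MU:
            child = 1
        cells[dying] = child
    return sum(cells) / N

def stem_plus_tacs():
    """Scenario B: every replacement descends from a stem cell (assumed
    mutation-free here, since stem cells divide rarely) via 4 transient
    amplifying divisions. Mutant TACs still terminally differentiate,
    so no lineage ever out-replicates another."""
    cells = [0] * N
    for _ in range(STEPS):
        dying = random.randrange(N)
        child = 0
        for _ in range(4):  # TAC divisions, each risking a mutation
            if random.random() < MU:
                child = 1
        cells[dying] = child
    return sum(cells) / N

random.seed(1)
print("self-renewing tissue: final mutant fraction =", self_renewing())
print("stem + TAC tissue:    final mutant fraction =", stem_plus_tacs())
```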

Inside job

One case in which this scheme might not operate, they say, is in the immune system. Here evolution is beneficial, as it introduces adaptations that fight previously encountered invaders.

One drawback of this, however, is that it would be expected to make the immune system more prone to cancers. And that seems to be so: leukaemia and lymphoma are cancers associated with the immune system, and they seem to be more common in younger people than many other cancers, suggesting that the failure to suppress evolution allows its problems to show up rather quickly.

The researchers think that their hypothesis could provide new insights into cancers more generally. Whereas conventional wisdom has it that cancer is caused by some genetic mutation that leads cells to proliferate uncontrollably, this new picture implies that the problem would lie with TAC mutations that interfere with differentiation — so that a TAC cell ends up just copying itself instead of producing cells on the next rung up on the way to mature tissue cells.

Carlo Maley, Pepper's colleague at the Wistar Institute, a biomedical research centre in Philadelphia, Pennsylvania, says that if their picture is right, incipient cancer formation might be detected very early by looking for biomolecules in body fluids that signal disruption of cell differentiation, even before there are any physical signs of tumour growth.

References

1. Pepper, J. W. et al. PLoS Comput. Biol. (in press).


Story from news@nature.com:
http://news.nature.com//news/2007/070917/070917-11.html


Vitamin C essential for plant growth

In a study published in the online edition of The Plant Journal, scientists from the University of Exeter and Shimane University in Japan have proved for the first time that vitamin C is essential for plant growth.

K.S.Parthasarathy






Public release date: 23-Sep-2007


Contact: Sarah Hoyle
s.hoyle@exeter.ac.uk
44-013-922-62062
University of Exeter
Study shows vitamin C is essential for plant growth

Scientists from the University of Exeter and Shimane University in Japan have proved for the first time that vitamin C is essential for plant growth. This discovery could have implications for agriculture and for the production of vitamin C dietary supplements.


The study, which is now published online in The Plant Journal, describes the newly-identified enzyme, GDP-L-galactose phosphorylase, which produces vitamin C, or ascorbate, in plants. Vitamin C is already known to be an antioxidant, which helps plants deal with stresses from drought to ozone and UV radiation, but until now it was not known that plants could not grow without it.

Professor Nicholas Smirnoff of the University of Exeter, lead author on the paper said: ‘Vitamin C is the most abundant antioxidant in plants and yet its functions are poorly understood. By discovering that the new enzyme is encoded by two genes, we were able to engineer vitamin C-free plants and found that they were unable to grow.’

The discovery also identifies the new enzyme as a key player in controlling vitamin C accumulation in response to light. Vitamin C provides protection against the harmful side-effects of light during photosynthesis, the process by which light energy is used to convert carbon dioxide into plant matter.

Professor Nicholas Smirnoff continued: ‘The discovery is exciting for me because it is the culmination of a long-term research programme on vitamin C in plants at the University of Exeter. It opens new opportunities to understand fundamental growth processes in plants and to improve plant resistance to stresses in a changing climate. In the longer term I hope that it will contribute to the efforts of plant scientists to improve crop yield in a sustainable manner.’

The findings could also pave the way for a new approach to producing vitamin C dietary supplements. In Britain we spend an estimated £20 million on vitamin C tablets each year, making this the most widely-used dietary supplement. Vitamin C is currently produced by mixed fermentation and chemical synthesis. The new enzyme provides the potential to engineer microbes to produce vitamin C by a simpler one-step process.

###

This research was funded by Bio-Technical Resources, Exeter University School of Biosciences, the Japan Society for the Promotion of Science and a Biotechnology and Biological Sciences Research Council (BBSRC) studentship.

Saturday, September 22, 2007

Proposal to define the kilogram as a precise number of carbon atoms

Ronald F. Fox, a Regents’ Professor Emeritus in the School of Physics at the Georgia Institute of Technology, and Theodore P. Hill, a Professor Emeritus in the Georgia Tech School of Mathematics, have proposed that the gram – 1/1000th of a kilogram – henceforth be defined as the mass of exactly 18 x 14074481 (cubed) carbon-12 atoms.

The new definition requires a precise value for Avogadro's constant. In the fall of 2006, Fox and Hill submitted a paper to Physics Archives in which they proposed assigning a specific number to the constant, one of about 10 possible values within the experimental range. The authors pointed out that a precise Avogadro's constant could also precisely redefine the measure of mass, the kilogram. The duo was spurred on further when the Associated Press (September 2007) reported that the official kilogram, cast 118 years ago, is losing mass; the loss was about 50 micrograms at the last check.

K.S.Parthasarathy




Public release date: 21-Sep-2007

Contact: John Toon
jtoon@gatech.edu
404-894-6986
Georgia Institute of Technology Research News
A better definition for the kilogram? Scientists propose a precise number of carbon atoms

How much is a kilogram?

It turns out that nobody can say for sure, at least not in a way that won’t change ever so slightly over time. The official kilogram – a cylinder cast 118 years ago from platinum and iridium and known as the International Prototype Kilogram or “Le Grand K” – has been losing mass, about 50 micrograms at last check. The change is occurring despite careful storage at a facility near Paris.

That’s not so good for a standard the world depends on to define mass.

Now, two U.S. professors – a physicist and a mathematician – say it’s time to define the kilogram in a new and more elegant way that will be the same today, tomorrow and 118 years from now. They’ve launched a campaign aimed at redefining the kilogram as the mass of a very large – but precisely specified – number of carbon-12 atoms.

“Our standard would eliminate the need for a physical artifact to define what a kilogram is,” said Ronald F. Fox, a Regents’ Professor Emeritus in the School of Physics at the Georgia Institute of Technology. “We want something that is logically very simple to understand.”

Their proposal is that the gram – 1/1000th of a kilogram – would henceforth be defined as the mass of exactly 18 x 14074481 (cubed) carbon-12 atoms.

The proposal, made by Fox and Theodore P. Hill – a Professor Emeritus in the Georgia Tech School of Mathematics – first assigns a specific value to Avogadro’s constant. Proposed in the 1800s by Italian scientist Amedeo Avogadro, the constant represents the number of atoms or molecules in one mole of a pure material – for instance, the number of carbon-12 atoms in 12 grams of the element. However, Avogadro’s constant isn’t a specific number; it’s a range of values that can be determined experimentally, but not with enough precision to be a single number.

Spurred by Hill’s half-serious question about whether Avogadro’s constant was an even or odd number, in the fall of 2006 Fox and Hill submitted a paper to Physics Archives in which they proposed assigning a specific number to the constant – one of about 10 possible values within the experimental range. The authors pointed out that a precise Avogadro’s constant could also precisely redefine the measure of mass, the kilogram.

Their proposal drew attention from the editors of American Scientist, who asked for a longer article published in March 2007. The proposal has so far drawn five letters, including one from Paul J. Karol, chair of the Committee on Nomenclature, Terminology and Symbols of the American Chemical Society. Karol added his endorsement to the proposal and suggested making the number divisible by 12 – which Fox and Hill did in an addendum by changing their number’s final digit from 8 to 6. So the new proposal for Avogadro’s constant became 84446886 (cubed), still within the range of accepted values.
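
The arithmetic behind these numbers is easy to verify. A few lines of Python (our illustration, not part of the proposal) confirm that 84446886 cubed is divisible by 12, lies near the accepted value of Avogadro's constant, and yields exactly the gram definition quoted above:

    # Checking the Fox-Hill numbers quoted in this article (our sketch).
    N_A = 84446886 ** 3      # proposed exact Avogadro's constant
    print(f"{N_A:.8e}")      # ~6.02214e+23, within the accepted range
    assert N_A % 12 == 0     # divisible by 12, per Karol's suggestion

    # Since 84446886 = 6 * 14074481, one gram of carbon-12 (N_A / 12
    # atoms) contains exactly 18 * 14074481**3 atoms, as quoted above.
    assert 84446886 == 6 * 14074481
    assert N_A // 12 == 18 * 14074481 ** 3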

Fast-forward to September 2007, when Fox read an Associated Press article on the CNN.com Web site about the mass disappearing from the International Prototype Kilogram. While the AP said the missing mass amounted to no more than “the weight of a fingerprint,” Fox argues that the amount could be significant in a world that is measuring time in ultra-sub-nanoseconds and length in ultra-sub-nanometers.

So Fox and Hill fired off another article to Physics Archives, this one proposing to redefine the gram as 1/12th the mass of a mole of carbon 12 – a mole long being defined as Avogadro’s number of atoms. They now hope to generate more interest in their idea for what may turn out to be a competition of standards proposals leading up to a 2011 meeting of the International Committee for Weights and Measures.

At least two other proposals for redefining the kilogram are under discussion. They include replacing the platinum-iridium cylinder with a sphere of pure silicon atoms, and using a device known as the “watt balance” to define the kilogram using electromagnetic energy. Both would offer an improvement over the existing standard – but not be as simple as what Fox and Hill have proposed, nor be exact, they say.

“Using a perfect numerical cube to define these constants yields the same level of significance – eight or nine digits – as in those integers that define the second and the speed of light,” Hill said. “A purely mathematical definition of the kilogram is experimentally neutral – researchers may then use any laboratory method they want to approximate exact masses.”

The kilogram is the last major standard defined by a physical artifact rather than a fundamental physical property. In 1983, for instance, the distance represented by a meter was redefined by how far light travels in 1/299,792,458 seconds – replacing a metal stick with two marks on it.

“We suspect that there will be some public debate about this issue,” Fox said. “We want scientists and science teachers and others to think about this problem because we think they can have an impact. Public discussion may play an important role in determining how one of the world’s basic physical constants is defined.”

How important is this issue to the world’s future technological development?

“When you make physical and chemical measurements, it’s important to have as high a precision as possible, and these standards really define the limits of precision,” Fox said. “The lack of an accurate standard leaves some inconsistency in how you state results. Having a unique standard could eliminate that.”

While the new definition would do away with the need for a physical representation of mass, Fox says people who want a physical artifact could still have one – though carbon can’t actually form a perfect cube with the right number of atoms. And building one might take some time.

“You could imagine having a lump of matter that actually had exactly the right number of atoms in it,” Fox noted. “If you could build it by some kind of self-assembly process – as opposed to building it atom-by-atom, which would take a few billion years – you could have a new kilogram artifact made of carbon. But there’s really no need for that. Even if you built a perfect kilogram, it would immediately be inaccurate as soon as a single atom was sloughed off or absorbed.”

###

Technical Contacts: Ron Fox (770-433-8950); E-mail: (ron.fox@physics.gatech.edu) or Ted Hill (805-528-1331); E-mail: (hilltp66@charter.net).


Thursday, September 20, 2007

Antibiotics overprescribed by GPs

Many physicians appear to prescribe antibiotics without due deliberation. Van Duijn suggests that the results of his study should be used to update quality assurance programs and postgraduate courses, so as to emphasise evidence-based prognostic criteria (e.g. chronic respiratory co-morbidity and old age), rather than single signs of inflammation or diagnostic labels, as the indication for prescribing antibiotics.

K.S.Parthasarathy


Public release date: 19-Sep-2007


Contact: Charlotte Webber
press@biomedcentral.com
44-020-763-19980
BioMed Central
Antibiotics overprescribed by GPs

GPs are unnecessarily giving patients antibiotics for respiratory tract (RT) infections which would clear up on their own. Doctors tend to over-emphasise symptoms such as white spots in the throat, rather than looking at factors such as old age and co-morbidity, which would affect a patient's recovery, according to an article published in the online open access journal, BMC Family Practice.

Huug J. van Duijn and his team at the Julius Center for Health Sciences and Primary Care from the University Medical Center Utrecht, The Netherlands, looked at the practice records of 163 GPs from 85 Dutch practices over a 12 month period, and carried out a survey of the doctors' attitudes to prescribing antibiotics for RT infections. Diagnostic labelling (the tendency to encode RT episodes as infections rather than as symptoms) seemed to be an arbitrary process, often used to justify antibiotic prescribing. GPs may give out antibiotics unnecessarily to defend themselves against unforeseen complications, even if these are unlikely to materialize.

Although Dutch GPs prescribe relatively small antibiotic volumes and international colleagues often envy the quality assurance system in Dutch primary care with guidelines and peer review groups, Van Duijn suggests that the results of his study should be used to update quality assurance programs and postgraduate courses, to emphasise the use of evidence-based prognostic criteria (e.g. chronic respiratory co-morbidity and old age) as an indication to prescribe antibiotics instead of single signs of inflammation or diagnostic labels. "Even in the Netherlands there is an over-prescribing of antibiotics; about 50% of the antibiotic prescriptions for acute RT episodes are not in accordance with Dutch national guidelines," says van Duijn. "Considering costs, side-effects and the growing resistance to pathogens, it is important to rationalise antibiotic prescribing as much as possible."

###

Article
Diagnostic labelling and other GP characteristics as determinants of antibiotic prescribing for acute respiratory tract episodes
Huug J. van Duijn, Marijke M. Kuyvenhoven, Hanneke M. Tiebosch, François G. Schellevis and Theo J.M. Verheij
BMC Family Practice (in press)

During the embargo, article available at: http://www.biomedcentral.com/imedia/5436403961360476_article.pdf?random=803709

After the embargo, article available from the journal website at: http://www.biomedcentral.com/bmcfampract/

Please quote the journal in any story you write. If you are writing for the web, please link to the article. All articles are available free of charge, according to BioMed Central's Open Access policy.

Article citation and URL available on request at press@biomedcentral.com on the day of publication

For author contact details, please contact Charlotte Webber on +44 (0)20 7631 9980 or press@biomedcentral.com

BioMed Central (http://www.biomedcentral.com) is an independent online publishing house committed to providing open access to peer-reviewed biological and medical research. This commitment is based on the view that immediate free access to research and the ability to freely archive and reuse published information is essential to the rapid and efficient communication of science.

BioMed Central currently publishes over 160 journals across biology and medicine. In addition to open-access original research, BioMed Central also publishes reviews, commentaries and other non-original-research content. Depending on the policies of the individual journal, this content may be open access or provided only to subscribers.

Friday, September 14, 2007

Academy releases emergency preparedness tools to enable millions more people to shelter in place

Despite the billions of dollars spent on drawing up emergency preparedness plans, millions of people in the USA remain at risk, primarily because the plans do not account for critical problems people face when they actually try to protect themselves. The New York Academy of Medicine today released a report and tools, available at www.redefiningreadiness.net.

K.S.Parthasarathy


Public release date: 13-Sep-2007


Contact: Kathryn Cervino
kcervino@nyam.org
212-822-7285
New York Academy of Medicine
Academy releases emergency preparedness tools to enable millions more people to shelter in place
Online 'Redefining Readiness' tools harness the public's knowledge to address a fundamental flaw in planning
New academy tools help people prepare to shelter in place in the face of emergencies.


Although the nation has invested billions of dollars preparing to respond to emergencies, current plans leave millions of Americans at risk because they do not account for critical problems people face when they actually try to protect themselves. To fix this fundamental flaw, The New York Academy of Medicine is today releasing a report and tools—available at www.redefiningreadiness.net—that will enable households, work places, schools and early childhood/youth programs, and governments to anticipate and address problems they would face in emergencies. The tools are released during National Preparedness Month, an initiative of the U.S. Department of Homeland Security.

The report, With the Public’s Knowledge, We Can Make Sheltering in Place Possible, is based on two years’ work gathering the insights and experiences of nearly 2,000 people who live and work in four communities around the country. It identifies serious and unanticipated problems that currently make it neither feasible nor safe for many people to shelter in place. In conjunction with that report, the Academy is releasing four Shelter-in-Place Issue Sets (in both Spanish and English) to help members of households and organizations recognize and address their own vulnerabilities in these kinds of emergencies. Sheltering in place means staying inside whatever building you happen to be in—a workplace, school, store, or at home—for a period of a few hours to several days in order to stay safe, even if that requires you to be separated from other family members.

“Sheltering in place is a very important protective strategy in situations ranging from dirty bombs, toxic explosions, and chemical spills to much more common emergencies, like electrical blackouts and snowstorms,” said Roz D. Lasker, MD, Director of the Academy’s Center for the Advancement of Collaborative Strategies in Health and Division of Public Health, and lead author of the report.

The Academy’s main report documents that the emergency preparedness instructions being given to people and organizations do not address many important sheltering-in-place issues and sometimes make matters worse. Among the many gaps it uncovered:

* The public is being instructed to keep a supply of food and water in their homes, and most keep their medications there as well. But in a shelter-in-place emergency, many people will not be at home and will need to take shelter in other buildings, so their home-supply of food, water, or medicines won’t be accessible.

* The public is being told to identify places for family members to reunite in the event of an emergency. But those instructions don’t address situations in which it might be unsafe to go to such a place, such as if you would have to go through a danger zone to get there.

* While instructions describe how to identify and seal “safe rooms” in homes, schools, and other buildings, they pay little attention to assuring that the rooms can accommodate the number of people who are likely to need shelter, provide them with breathable air and tolerable temperatures, or give them safe access to water, food, lavatories, telephones, and medical supplies.

* Schools have been preparing for emergencies that affect the school directly, but children are also at risk if their parents and other guardians need to shelter in place because of an emergency and no other adult is available to pick the children up or be at home with them after school.

“The disconnect between current instructions and the problems people face in shelter-in-place emergencies isn’t surprising, since the public never had an opportunity to think about these situations in such detail before,” Lasker said.

The Academy has been harnessing the public’s knowledge about emergencies for several years now, with generous support from the W. K. Kellogg Foundation. In 2004, the Academy’s research study, Redefining Readiness: Terrorism Planning Through the Eyes of the Public, predicted that large numbers of people would suffer or die unnecessarily in emergencies, because planners were developing instructions for the public to follow without finding out whether it is actually possible, or safe, for all groups to do so. The prediction was proven to be correct during Hurricane Katrina, when many people could not follow instructions to evacuate due to barriers that had not been identified or addressed beforehand.

Over the past two years, the Academy has been working to prevent such needless death and suffering with teams in four Redefining Readiness demonstration sites in Carlsbad, NM; Chicago, IL; Savannah, GA; and southeast Oklahoma. In more than 200 small group discussions, almost 2,000 residents from diverse backgrounds explored the particular problems they would face trying to protect themselves in shelter-in-place emergencies, and the actions that they and other people and organizations could take. “Because of these efforts, we now know how to protect many more Americans in shelter-in-place emergencies than is currently possible,” said Lasker.

The insights generated in the small group discussions provided the basis for the Academy’s four Shelter-in-Place Issue Sets, which are tailored specifically to people in households, work places, schools and early childhood/youth programs, and governments. These practical tools—which consist of sets of questions rather than instructions—are designed to help users become aware of critical protection problems that their own household or organization can address and to develop workable solutions. The four issue sets are available on-line in Spanish as well as English.

Nan D. Hunter, JD, Director of the Center for Health, Science, and Public Policy at Brooklyn Law School and a co-author of the Academy’s new report, highlighted the importance of these tools for schools and work places. “The issue sets can help these organizations avoid liability by clarifying what they might reasonably be expected to do in shelter-in-place emergencies,” Hunter said. “Government agencies and private philanthropies can go a long way toward helping schools and work places realize those expectations – protecting employees, students, and customers in the process – by integrating the use of the issue sets in their current grant programs and by providing schools and work places with other incentives and supports.”

“This work is an important example of ways in which the Academy can play a role in assuring that individuals and communities affected by policies and programs have a great voice in creating them, thereby making them more effective,” said Jo Ivey Boufford, MD, Academy President.

###

Founded in 1847, The New York Academy of Medicine is an independent, non-partisan, non-profit institution whose mission is to enhance the health of the public. Our research, education, community engagement, and evidence-based advocacy seek to improve the health of people living in cities, especially disadvantaged and vulnerable populations. The impact of these initiatives reaches into neighborhoods in New York City, across the country, and around the world. We work with community based organizations, academic institutions, corporations, the media, and government to catalyze and contribute to changes that promote health. Visit us online at www.nyam.org.

For more information about National Preparedness Month, an initiative each September of the U.S. Department of Homeland Security designed to encourage Americans to take simple steps to prepare for emergencies in their homes, businesses and schools, visit www.ready.gov.

Thursday, September 13, 2007

Mobile phone use not associated with any adverse health effects, new study finds

Mobile phone use is increasing at a fast pace, and researchers are engaged in intensive studies of its possible adverse health effects. On September 12, as part of its 2007 Report, the Mobile Telecommunications and Health Research (MTHR) Programme concluded that mobile phones have not been found to be associated with any biological or adverse health effects. The research programme included the largest and most robust studies of electrical hypersensitivity undertaken anywhere in the world.

K.S.Parthasarathy




Public release date: 12-Sep-2007

Contact: Science Media Centre
44-020-767-02980
University of Nottingham
New report on mobile phone research published

Mobile phones have not been found to be associated with any biological or adverse health effects, according to the UK’s largest investigation into the possible health risks from mobile telephone technology.

The Mobile Telecommunications and Health Research (MTHR) Programme published its conclusions on September 12 as part of its 2007 Report.

The six-year research programme, chaired by Professor Lawrie Challis, Emeritus Professor of Physics at The University of Nottingham, has found no association between short term mobile phone use and brain cancer. Studies on volunteers also showed no evidence that brain function was affected by mobile phone signals or the signals used by the emergency services (TETRA).

The MTHR programme management committee believes there is no need to support further work in this area.

The research programme also included the largest and most robust studies of electrical hypersensitivity undertaken anywhere in the world. These studies have found no evidence that the unpleasant symptoms experienced by sufferers are the result of exposure to signals from mobile phones or base stations.

The situation for longer-term exposure is less clear as studies have so far only included a limited number of participants who have used their phones for ten years or more. The committee recommends more research be conducted in this area.

The MTHR programme also investigated whether mobile phones might affect cells and tissue beyond simply heating them. The results so far show no evidence for this and the committee believes there is no need to support further work in this area.

Professor Lawrie Challis, Chairman of MTHR, said: “This is a very substantial report from a large research programme. The work reported today has all been published in respected peer-reviewed scientific or medical journals.

“The results are so far reassuring but there is still a need for more research, especially to check that no effects emerge from longer-term phone use from adults and from use by children.”

The research programme has also funded some basic measurements of radio signals from microcell and picocell base stations such as those found in airports, railway stations and shopping malls. These have shown that exposures are well below international guidelines.

Additional studies also confirmed that the use of a mobile phone while driving, whether hand-held or hands-free, causes impairment to performance comparable to that from other in-car distractions. There are, however, indications that the demand on cognitive resources from mobile phones may be greater.

###

Details of all the projects supported by the Programme are published on its web site: http://www.mthr.org.uk

Notes to editors:

The University of Nottingham is Britain's University of the Year (The Times Higher Awards 2006). It undertakes world-changing research, provides innovative teaching and a student experience of the highest quality. Ranked by Newsweek in the world's Top 75 universities, its academics have won two Nobel Prizes since 2003. The University is an international institution with campuses in the United Kingdom, Malaysia and China.

The MTHR Programme was set up in response to the research recommendations contained within the ‘Stewart Report’. The Programme received approximately £8.8 million of funding from a variety of government and industry sources.

To ensure the independence of the research, scientific management of the programme was entrusted to an independent Programme Management Committee made up of independent experts, mostly senior university academics. Funds contributed by the sponsors of the Programme are managed on behalf of the Committee by the Department of Health as Secretariat to the Programme.

The first Chairman of the Programme Management Committee was Sir William Stewart and he was succeeded in November 2002 by Professor Lawrie Challis, Emeritus Professor of Physics at the University of Nottingham and formerly Vice-chairman of the Stewart Committee.

The Programme was set up in 2001 and has supported 28 individual research projects, mostly undertaken in UK universities. Of these, 23 have now been completed and most have published results in peer-reviewed scientific and medical journals (23 papers to date, with more expected in the near future). The Report 2007 summarises the state of knowledge at the time of the Stewart Report and the current state of knowledge, taking account of both research supported by the Programme and that carried out elsewhere. It also provides an indication of future research priorities.

More information is available from the Science Media Centre on +44 (0)20 7670 2980 smc@sciencemediacentre.org; or Media Relations Manager Tim Utton in the University’s Media and Public Relations Office on +44 (0)115 846 8092, tim.utton@nottingham.ac.uk. For non-news media enquires call Nigel Cridland – MTHR Scientific Co-ordinator +44 (0)1235 822666 MTHRDL@hpa.org.uk

Wednesday, September 12, 2007

Aspartame, the non-nutritive sweetener, is safe, researchers say

An international expert panel from 10 universities and medical schools evaluated the safety of aspartame, a non-nutritive sweetener, for people of all ages and with a variety of health conditions, and concluded that there is no evidence that it causes cancer, neurological damage or other health problems in humans. The extensive review covered more than 500 reports, including toxicological, clinical and epidemiological studies dating from 1970s pre-clinical work to the latest studies on the high-intensity sweetener, along with use levels and regulations data (Critical Reviews in Toxicology, September 2007).

In 1965, scientists accidentally discovered aspartame, which is 200 times sweeter than sucrose. Although aspartame has the same number of calories as sugar on a weight-to-weight basis, it can be added to food or pharmaceuticals at a fraction of what would be needed with sucrose to achieve the same sweetness, with far fewer calories.

K.S.Parthasarathy



Public release date: 11-Sep-2007

Contact: Ellen Ternes
eternes@umd.edu
301-405-4627
Kellen Communications
Aspartame is safe, study says

A sweeping review of research studies of aspartame says there is no evidence that the non-nutritive sweetener causes cancer, neurological damage or other health problems in humans.

Looking at more than 500 reports, including toxicological, clinical and epidemiological studies dating from 1970s preclinical work to the latest studies on the high-intensity sweetener, along with use levels and regulations data, an international expert panel from 10 universities and medical schools evaluated the safety of aspartame for people of all ages and with a variety of health conditions. Their study is published in the September issue of Critical Reviews in Toxicology.

“There have been continued questions in the media and on the internet about the safety of aspartame,” said panel member and University of Maryland food and nutrition professor Bernadene Magnuson. “Our study is a very comprehensive review of all of the research that’s been done on aspartame. Never before has a group with the breadth of experience of this panel looked at this question.”

Aspartame

A non-nutritive sweetener, aspartame is approximately 200 times sweeter than sucrose, the accepted standard for sweetness. Though aspartame has the same number of calories as sugar on a weight-to-weight basis, it can be added to food or pharmaceuticals at a fraction of what would be needed with sucrose to achieve the same sweetness, with far fewer calories.

Aspartame was discovered by accident in 1965, and since then has become a popular sweetener in more than 6000 food and pharmaceutical products that range from soft drinks to ketchup.
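
The practical effect of that 200-fold sweetness factor is easy to see with a back-of-the-envelope calculation (our illustration; the 4 kcal per gram figure is the standard value for both sweeteners):

    # Rough calorie comparison (illustrative sketch; typical values assumed).
    KCAL_PER_GRAM = 4.0      # both sucrose and aspartame, per gram
    SWEETNESS_FACTOR = 200   # aspartame vs. sucrose, from the article

    sucrose_grams = 10.0     # e.g. the sugar in a sweetened drink
    aspartame_grams = sucrose_grams / SWEETNESS_FACTOR   # same sweetness
    print(sucrose_grams * KCAL_PER_GRAM)     # 40.0 kcal from sucrose
    print(aspartame_grams * KCAL_PER_GRAM)   # 0.2 kcal from aspartame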

Aspartame Consumption

The panel used the latest available data (2001-02) from the National Health and Nutrition Examination Surveys (NHANES) to determine the most current levels of aspartame consumption.

“Even the very highest consumers of aspartame are well below the acceptable daily intake (ADI) and well below the amounts used in animal testing,” said Magnuson.
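
To put that in perspective, here is a rough worked example (our own sketch; the ADI and per-can figures are commonly cited approximations, not numbers from the panel’s paper):

    # How much diet soft drink would it take to reach the ADI? (sketch;
    # assumed figures: US FDA ADI of 50 mg/kg/day, ~180 mg aspartame
    # per 12-oz can -- both commonly cited approximations)
    ADI_MG_PER_KG = 50
    MG_PER_CAN = 180
    body_mass_kg = 70

    daily_limit_mg = ADI_MG_PER_KG * body_mass_kg   # 3500 mg/day
    print(daily_limit_mg / MG_PER_CAN)              # ~19 cans per day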

Evaluation Findings

The team reviewed studies that tested a number of health effects of varying levels of aspartame, including amounts that far exceed the acceptable daily intake, on animals and humans. In addition to healthy adults and children, studies also looked at effects on adults and children with diabetes, hyperactive and sugar-sensitive children, and people with Parkinson’s disease and depression.

The Expert Panel’s evaluation concluded the following:

Overall:

Aspartame is safe at current levels of consumption, which remain well below established ADI levels, even among high user sub-populations. No credible evidence was found that aspartame is carcinogenic, neurotoxic or has any other adverse effects when consumed even at levels many times the established ADI levels.

Specifically:

* Based on results of several long term studies, aspartame does not have carcinogenic or cancer-promoting activity.

* Results of extensive investigation in studies that mimic human exposure show no evidence that aspartame consumption causes neurological effects, such as memory and learning problems.

* Overall the weight of the evidence indicates that aspartame has no effect on behavior, cognitive function, neural function or seizures in any of the groups studied.

* Aspartame has not been shown to have adverse effects on reproductive activity or lactation.

* Studies conclude that aspartame is safe for use by diabetics and may aid diabetics in adhering to a sugar-free diet.

* There is no evidence to support an association between aspartame consumption and obesity. On the contrary, when used in multidisciplinary weight control programs, aspartame may actually aid in long-term weight control.

* The studies provide no evidence to support an association between aspartame and brain or hematopoietic tumor development.

###

Expert Panel Members

In addition to Bernadene Magnuson, the Expert Panel included: George A. Burdock, the Burdock Group; John Doull, University of Kansas Medical School; Robert M. Kroes, Institute for Risk Assessment Sciences, Utrecht, The Netherlands; Gary M. Marsh, University of Pittsburgh; Michael W. Pariza, University of Wisconsin; Peter S. Spencer, Oregon Health and Science University; William J. Waddell, University of Louisville Medical School; Ronald Walker, University of Surrey, Great Britain; and Gary M. Williams, New York Medical College.

“Aspartame: A Safety Evaluation Based on Current Use Levels, Regulations and Toxicological and Epidemiological Studies” was funded by unrestricted support from Ajinomoto Company, Inc. The abstract may be accessed at http://informaworld.com/crtox. To talk with Bernadene Magnuson, contact Ellen Ternes, 301-405-4627, eternes@umd.edu.