Monday, March 21, 2011

Dark Fluid: Dark Matter And Dark Energy May Be Two Faces Of Same Coin



Feb. 1, 2008
Astronomers at the University of St Andrews believe they can "simplify the dark side of the universe" by shedding new light on two of its mysterious constituents.

Dr HongSheng Zhao, of the University's School of Physics and Astronomy, has shown that the puzzling dark matter and its counterpart dark energy may be more closely linked than was previously thought.
Only 4% of the universe is made of known material; the other 96% is traditionally divided into two sectors, dark matter and dark energy.
A British astrophysicist and Advanced Fellow of the UK's Science and Technology Facilities Council, Dr Zhao points out, "Both dark matter and dark energy could be two faces of the same coin.
"As astronomers gain understanding of the subtle effects of dark energy in galaxies in the future, we will solve the mystery of astronomical dark matter at the same time."
Astronomers believe that both the universe and galaxies are held together by the gravitational attraction of a huge amount of unseen material, first noted by the Swiss astronomer Fritz Zwicky in 1933, and now commonly referred to as dark matter.
Dr Zhao reports that, "Dark energy has already revealed its presence by masking as dark matter 60 years ago if we accept that dark matter and dark energy are linked phenomena that share a common origin."
In Dr Zhao's model, dark energy and dark matter are simply different manifestations of the same thing, which he has considered as a 'dark fluid'. On the scale of galaxies, this dark fluid behaves like matter and on the scale of the Universe overall as dark energy, driving the expansion of the Universe. Importantly, his model, unlike some similar work, is detailed enough to produce the same 3:1 ratio of dark energy to dark matter as is predicted by cosmologists.
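As a rough consistency check, using only the round figures quoted in this article (4% ordinary material, a 96% dark sector, and a 3:1 split between dark energy and dark matter), the implied budget is:

\[
\Omega_{\mathrm{dark\ energy}} \approx 0.72, \qquad
\Omega_{\mathrm{dark\ matter}} \approx 0.24, \qquad
\Omega_{\mathrm{ordinary}} \approx 0.04,
\]
\[
\frac{\Omega_{\mathrm{dark\ energy}}}{\Omega_{\mathrm{dark\ matter}}} \approx \frac{0.72}{0.24} = 3.
\]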
Efforts are currently underway to hunt for very massive dark-matter particles with a variety of experiments. The Large Hadron Collider (LHC) at the European Organization for Nuclear Research (CERN) in Geneva is a particle accelerator that amongst other objectives, could potentially detect dark matter particles.
According to Dr Zhao, these efforts could turn out to be fruitless. He said, "In this simpler picture of the universe, the dark matter would be at a surprisingly low energy scale, too low to be probed by the upcoming Large Hadron Collider.
"The search for dark-matter particles so far has concentrated on highly-energetic particles. If dark matter however is a twin phenomenon of dark energy, it will not show up at instruments like the LHC, but has been seen over and over again in galaxies by astronomers."
Alternatively, the Universe might contain no dark-matter particles at all: Dr Zhao's findings are also compatible with interpreting the dark component as a modification of the law of gravity rather than as particles or energy.
Dr Zhao concluded: "No matter what dark matter and dark energy are, these two phenomena are likely not independent of each other."
Background
Theories of the physics of gravity were first developed by Isaac Newton in 1687 and refined by Albert Einstein's theory of General Relativity in 1916, in which gravitational effects propagate at the speed of light. Einstein, however, never fully settled whether his equations should include an omnipresent constant source, now generally called dark energy.
Astronomers following Fritz Zwicky have also proposed additional sources in Einstein's equations in the form of non-luminous material, generally called dark matter. Apart from very light neutrinos, neither of these dark sources has been confirmed experimentally.
Dr Zhao and his collaborators' findings were published in Astrophysical Journal Letters in December 2007 and in Physical Review D in 2007.

Major Clue in Long-Term Memory-Making

(Mar. 20, 2011)
You may remember the color of your loved one's eyes for years. But how? Scientists believe that long-term potentiation (LTP) -- the long-lasting increase of signals across a connection between brain cells -- underlies our ability to remember over time and to learn, but how that happens is a central question in neuroscience.
Researchers at Duke University Medical Center have found a cascade of signaling molecules that allows a usually very brief signal to last for tens of minutes, providing the brain framework for stronger connections (synapses) that can summon a memory for a period of months or even years.
Their findings about how the synapses change the strength of connections could have a bearing on Alzheimer's disease, autism and mental retardation, said Ryohei Yasuda, Ph.D., assistant professor of neurobiology and senior author.
"We found that a biochemical process that lasts a long time is what causes memory storage," said Yasuda, who is a Howard Hughes Medical Institute Early Career Scientist.
This work was published in the March 20 issue of Nature.
The researchers were investigating the signaling molecules that regulate the actin cytoskeleton, which serves as the structural framework of synapses.
"The signaling molecules could help to rearrange the framework, and give more volume and strength to the synapses," Yasuda said. "We reasoned that a long-lasting memory could possibly come from changes in the building block assemblies."
The Duke researchers knew that long-term potentiation, a long-lasting set of electrical impulses in nerve cells, is triggered by a transient increase of calcium (Ca2+) ions in a synapse. They devised experiments to learn exactly how the short Ca2+ signal, which lasts only for ~0.1s, is translated into long-lasting (more than an hour) change in synaptic transmission.
The team used a 2-photon microscopy technique developed in the Yasuda lab to visualize molecular signaling within single synapses undergoing LTP. This method allowed the team to monitor molecular activity in single synapses while measuring increases in synapse volume and connection strength.
They found that signaling molecules Rho and Cdc42, regulators of the actin cytoskeleton, are activated by CaMKII, and relay a CaMKII signal into signals lasting many minutes. These long-lasting signals are important for maintaining long-lasting plasticity of synapses, the ability of the brain to change during learning or memorization.
Many mental diseases such as mental retardation and Alzheimer's disease are associated with abnormal Rho and Cdc42 signals, Yasuda said. "Thus, our finding will provide many insights into these diseases."
Other authors include lead author Hideji Murakoshi and Hong Wang of the Duke Department of Neurobiology.
This study was funded by the Howard Hughes Medical Institute, the National Institute of Mental Health, the National Institute of Neurological Disorders and Stroke, the National Institute on Drug Abuse, the Alzheimer's Association and the Japan Society for the Promotion of Science.

Two New SCAP Documents Help Improve Automation of Computer Security Management

It's increasingly difficult to keep up with all the vulnerabilities present in today's highly complex operating systems and applications. Attackers constantly search for and exploit these vulnerabilities to commit identity fraud, intellectual property theft and other attacks. The National Institute of Standards and Technology (NIST) has released two updated publications that help organizations to find and manage vulnerabilities more effectively, by standardizing the way vulnerabilities are identified, prioritized and reported.
Computer security departments work behind the scenes at government agencies and other organizations to keep computers and networks secure. A valuable tool for them is security automation software that uses NIST's Security Content Automation Protocol (SCAP). Software based on SCAP can be used to automatically check individual computers to see if they have any known vulnerabilities and if they have the appropriate security configuration settings and patches in place. Security problems can be identified quickly and accurately, allowing them to be resolved before hackers can exploit them.
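As a toy illustration of the idea only (the setting names, baseline values and host data below are invented for this example; a real SCAP scanner consumes standardized checklist content such as XCCDF and OVAL rather than a hand-written table), an automated configuration check boils down to comparing the observed state of a machine against a machine-readable baseline:

# Toy sketch of automated configuration checking (hypothetical settings, not real SCAP content).
EXPECTED_SETTINGS = {
    "password_min_length": 12,   # invented baseline values for illustration
    "firewall_enabled": True,
    "automatic_updates": True,
}

def check_host(observed_settings):
    """Compare a host's observed settings against the baseline; return the failures."""
    failures = []
    for name, expected in EXPECTED_SETTINGS.items():
        actual = observed_settings.get(name)
        if actual != expected:
            failures.append((name, expected, actual))
    return failures

# Example run for one hypothetical host with a weak password policy.
observed = {"password_min_length": 8, "firewall_enabled": True, "automatic_updates": True}
for name, expected, actual in check_host(observed):
    print(f"FAIL {name}: expected {expected!r}, found {actual!r}")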
The first publication, The Technical Specifications for the Security Content Automation Protocol (SCAP) Version 1.1 (NIST Special Publication (SP) 800-126 Revision 1), refines the protocol's requirements from the SCAP 1.0 version. SCAP itself is a suite of specifications for standardizing the format and nomenclature by which security software communicates to assess software flaws, security configurations and software inventories.
SP 800-126 Rev. 1 tightens the requirements of the individual specifications in the suite to support SCAP's functionality and ensure interoperability between SCAP tools. It also adds a new specification -- the Open Checklist Interactive Language (OCIL) -- that allows security experts to gather information that is not accessible by automated means. For example, OCIL could be used to ask users about their recent security awareness training or to prompt a system administrator to review security settings only available through a proprietary graphical user interface. Additionally, SCAP 1.1 calls for the use of the 5.8 version of the Open Vulnerability and Assessment Language (OVAL).
NIST and others provide publicly accessible repositories of security information and standard security configurations in SCAP formats, which can be downloaded and used by any tool that complies with the SCAP protocol. For example, the NIST-run National Vulnerability Database (NVD) provides a unique identifier for each reported software vulnerability, an analysis of its potential damage and a severity score. The NVD has grown from 6,000 listings in 2002 to about 46,000 in early 2011. It is updated daily.
The second document, Guide to Using Vulnerability Naming Schemes (Special Publication 800-51 Revision 1), provides recommendations for naming schemes used in SCAP. Before these schemes were standardized, different organizations referred to vulnerabilities in different ways, which created confusion. These naming schemes "enable better synthesis of information about software vulnerabilities and misconfigurations," explained co-author David Waltermire, which minimizes confusion and can lead to faster security fixes. The Common Vulnerabilities and Exposures (CVE) scheme identifies software flaws; the Common Configuration Enumeration (CCE) scheme classifies configuration issues.
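To illustrate why standardized names help automation, identifiers in both schemes follow fixed, machine-checkable patterns. The sketch below assumes the commonly used formats of the era (CVE-YYYY-NNNN and CCE-NNNN-N with a check digit) purely for illustration; the authoritative syntax is defined by the naming schemes themselves, and the identifiers shown are format examples only, not references to specific entries:

import re

# Simplified patterns for illustration only.
CVE_PATTERN = re.compile(r"^CVE-\d{4}-\d{4,}$")   # e.g. CVE-2010-1234
CCE_PATTERN = re.compile(r"^CCE-\d{4,5}-\d$")     # e.g. CCE-3416-5

def classify(identifier):
    """Guess which naming scheme an identifier belongs to based on its format."""
    if CVE_PATTERN.match(identifier):
        return "software flaw (CVE)"
    if CCE_PATTERN.match(identifier):
        return "configuration issue (CCE)"
    return "unrecognized"

for name in ("CVE-2010-1234", "CCE-3416-5", "not-an-identifier"):
    print(name, "->", classify(name))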
SP 800-51 Rev.1 provides an introduction to both naming schemes and makes recommendations for using them. It also suggests how software and service vendors should use the vulnerability names and naming schemes in their products and service offerings.
                    

 