Scientists in Portugal have found a new molecular mechanism behind colorectal cancer in which a mutated gene and a normal but over-expressed gene cooperate, and both are needed to create the disease. The research, published in the journal Gastroenterology, also reveals how a technique called RNA interference can, by inactivating both genes, kill as much as 80% of cancer cells in just 48 hours. These are extremely promising results if they can be translated into new therapies for humans against a disease that is still one of the most common cancers in the Western world.

Colorectal cancer affects the colon, rectum and appendix and is not only the third most common form of cancer but also the second most common cancer-related cause of death in the Western world, according to the World Health Organization. The disease kills about 655,000 people per year worldwide, 16,000 of them in the UK alone, even though it has a high cure rate when detected and treated early.

Mind readers have long been the domain of folklore and science fiction. But new findings demonstrate the power of computational modeling to improve our understanding of how the brain processes information and thoughts, and bring scientists closer to knowing how specific thoughts activate our brains.

In their most recent work, computer scientist Tom Mitchell and cognitive neuroscientist Marcel Just, both of Carnegie Mellon University, used fMRI data to develop a sophisticated computational model that can predict the brain activation patterns associated with concrete nouns (things we experience through our senses), even for nouns whose fMRI data the model has never seen.

The researchers first built a model that took the fMRI activation patterns for 60 concrete nouns broken down into 12 categories including animals, body parts, buildings, clothing, insects, vehicles and vegetables. The model also analyzed a text corpus, or a set of texts that contained more than a trillion words, noting how each noun was used in relation to a set of 25 verbs associated with sensory or motor functions. Combining the brain scan information with the analysis of the text corpus, the computer then predicted the brain activity pattern of thousands of other concrete nouns.

In cases where the actual activation patterns were known, the researchers found that the accuracy of the model's predictions was significantly better than chance. The model could effectively predict what each participant's brain activation pattern would look like when they thought about a given word, even without having seen the patterns associated with that word in advance.
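The approach described above — learn how each of 25 verb-derived semantic features contributes to brain activation, then predict the pattern for an unseen noun from its features — can be sketched as a simple linear model. The sketch below uses random stand-in data; the dimensions, the ridge regularizer, and the leave-two-out comparison are illustrative assumptions, not the study's actual parameters or data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions mirroring the setup: 60 concrete nouns,
# 25 sensory-motor verb features, and a small number of "voxels"
# standing in for the whole-brain fMRI activation pattern.
n_nouns, n_features, n_voxels = 60, 25, 500

# Stand-in data: each noun's co-occurrence features with the 25 verbs
# (from the text corpus), and its observed activation pattern.
features = rng.random((n_nouns, n_features))
activations = rng.random((n_nouns, n_voxels))

def fit_linear_model(X, Y, ridge=1e-3):
    """Learn per-voxel weights W so that X @ W approximates Y
    (regularized least squares)."""
    n = X.shape[1]
    return np.linalg.solve(X.T @ X + ridge * np.eye(n), X.T @ Y)

def predict(W, x):
    """Predict an activation pattern for a new noun from its
    verb co-occurrence feature vector."""
    return x @ W

def closer_to_own(i, j):
    """Leave-two-out check: train on 58 nouns, then ask whether the
    predicted pattern for held-out noun i correlates better with its
    own observed pattern than with the other held-out noun's."""
    mask = np.ones(n_nouns, dtype=bool)
    mask[[i, j]] = False
    W = fit_linear_model(features[mask], activations[mask])
    pred_i = predict(W, features[i])
    corr = lambda a, b: np.corrcoef(a, b)[0, 1]
    return corr(pred_i, activations[i]) > corr(pred_i, activations[j])
```

With random data the leave-two-out test succeeds only about half the time, which is exactly the chance baseline the researchers' real predictions beat.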

Much of the coverage of autism in the media focuses on the arguments of advocates, scientists, and government officials over the relationship between vaccines and autism. But out of the spotlight, a bigger story is brewing: the hunt for autism genes, a technically difficult hunt which is pressing forward using all of the tools modern genetics has to offer. If you are like me, news stories about autism have left you with only a vague impression of the current scientific state of understanding, the impression that researchers strongly deny any link between autism and vaccines, but have little else to say about what the real cause of autism might be.

If that is your impression, you'll perhaps be surprised to learn that roughly 20% of autism cases in the US are linked to known genetic changes, a minor fraction of autism cases to be sure, but much higher than I would have guessed. That autism has a genetic basis is a well-established finding, and while this by no means rules out environmental factors, genetics is at the core of the recent progress scientists have made in understanding autism. The genetics of autism, however, is not simple - no surprise, since autism involves our most complex organ, the brain, in one of its most complex functions, social interaction. Untangling the genetic and environmental factors that underlie autism will be tough, but in the process we will learn more about how many different genes work together in a child to control the developing brain.

As I wrote a few days ago, if we agree that the nature of science is along the lines I have described, next we need to ask why it is so. Platt, in his classic 1964 article on strong inference, briefly mentions a number of answers, which he dismisses without discussion, but that I think are actually a large part of the reason "hard" and "soft" sciences appear to be so different. These alternative hypotheses for why a given science may behave “softly” include, as Platt puts it, “the tractability of the subject, or the quality of education of the men [sic] drawn into it, or the size of research contracts.” In other words, particle physics, say, may be more successful than ecology because it is easier (more tractable), or because ecologists tend to be dumber than physicists, or because physicists get a lot more money for their research than ecologists do.

The second option is rather offensive (to the ecologists at least), but more importantly there are no data at all to back it up. And it is difficult to see how one could possibly measure the alleged differential “education” of people attracted to different scientific disciplines. Nearly all professional scientists nowadays have a Ph.D. in their discipline, as well as years of postdoctoral experience conducting research and publishing papers. It is hard to imagine a reliable quantitative measure of the relative difficulty of their respective academic curricula, and it is next to preposterous to argue that scientists attracted to certain disciplines are smarter than those who find a different area of research more appealing. It would be like attempting to explain the discrepancy between the dynamism of 20th century jazz music and the relative stillness of symphonic (“classical”) music by arguing that jazz musicians are better educated or more talented than classically trained ones.
An article published in the Journal of Forensic Sciences details the fruits of a collaboration between the University of Leicester and the Northamptonshire Police, which led to a “major breakthrough” in crime detection, perhaps leading to “hundreds of cold cases being reopened,” according to a press release. The University’s Forensic Research Centre has been working with Northamptonshire Police's scientific support unit to develop new ways of taking fingerprints from a crime scene. The collaboration between the boffins and bobbies – boffin being British slang for someone engaged in technical or scientific research, and bobby being slang for police – was formally launched May 14. (For those without an intimate knowledge of U.K. geography, Northamptonshire Police headquarters is located in Northampton, about 70 miles northwest of London. The University of Leicester is another 40 miles or so northwest of Northampton.) The newly developed method enables scientists to visualize fingerprints even after the print itself has been removed, the press release said.

It used to be that you went to see a doctor, the doctor gave expert advice, and you did what you were told.

In today's world, patients have the opportunity to become more knowledgeable. Sometimes this creates problems (self-diagnosis) and sometimes it causes friction with hurried doctors who don't want to argue, but most often a better understanding of the issues is good for everyone.

As a result, there is growing interest in shared decision-making (SDM), in which the clinician and patient go through all phases of the decision-making process together, share treatment preferences, and reach agreement on a treatment choice.

Brown dwarfs, "failed stars", are a class of objects that represent the missing link between the lowest-mass stars and the gas-giant planets, such as Jupiter and Saturn. Brown dwarfs are the faintest and coolest objects that can be directly observed outside the solar system, emitting as little as 1/300,000th of the energy of the sun and having surface temperatures around 800° F - that's the temperature of a pizza oven and more than 9,000° F cooler than the surface of the sun.

Astronomers have used ultrasharp images obtained with the Keck Telescope and Hubble Space Telescope to determine for the first time the masses of the coldest class of brown dwarfs. With masses as little as 3 percent of the mass of the sun, these are the lowest-mass free-floating objects ever weighed outside the solar system. The observations are a major step in testing the theoretical predictions for objects that cannot generate their own internal energy, both brown dwarfs and gas-giant planets. The new findings, which are being presented in a press conference today at the American Astronomical Society meeting in St. Louis, show that the predictions may have some problems.

Microscopic robots crafted to maneuver separately without any obvious guidance are now assembling into self-organized structures after years of continuing research led by Duke University computer scientist Bruce Donald.

Each microrobot is shaped something like a spatula but with dimensions measuring just microns, or millionths of a meter. They are almost 100 times smaller than any previous robotic designs of their kind and weigh even less, Donald added.

Formally known as microelectromechanical system (MEMS) microrobots, the devices are of suitable scale for Lilliputian tasks such as moving around the interiors of laboratories-on-a-chip.

Investigators at the University of Naples have explored alexithymia, the inability to identify and express emotions, in panic disorder in a paper published in the third 2008 issue of Psychotherapy and Psychosomatics.

In patients with panic disorder (PD), difficulty identifying and managing emotional experience might contribute to an enduring vulnerability to panic attacks. Such a difficulty might reflect a dysfunction of fronto-temporo-limbic circuits.

The present study was designed to test the hypothesis that drug-free patients with PD, as compared with healthy subjects (HS), show a higher prevalence of alexithymia, greater difficulty processing emotional stimuli, and poorer performance on neuropsychological tests exploring the activity of fronto-temporo-limbic circuits.

Flat screen displays currently used in computer monitors, television sets and numerous other electronic devices are all built on a glass base. Most use liquid crystal displays (LCDs), which filter light from behind to form an image.

But the glass substrate makes LCD displays rigid and fragile, limiting their use. Now display manufacturers are working to develop a new generation of robust, flexible displays that can be curved to fit the shape of a product or even rolled up like a magazine. The question is, which of the technologies under development is the best?

Big industrial names such as Nokia, Thales and Philips, as well as universities, research centres and many small and medium-sized businesses have pooled their skills and expertise to thoroughly test a large number of materials and techniques.