Tag: animals (page 3 of 9)

Aliens Might Weigh As Much As Polar Bears And Be Taller Than The Tallest Man Who Ever Lived





Excerpt from huffingtonpost.com

No one really knows whether we're alone or if the universe is brimming with brainy extraterrestrials. But that hasn't stopped scientists from trying to figure out what form intelligent aliens might take. 

And as University of Barcelona cosmologist Dr. Fergus Simpson argues in a new paper, most intelligent alien species would likely exceed 300 kilograms (661 pounds)--with the median body mass "similar to that of a polar bear."

If such a being had human proportions, Simpson told The Huffington Post in an email, it would be taller than Robert Wadlow, who at 8 feet, 11 inches is believed to have been the tallest human who ever lived.

Robert Wadlow (1918-1940), the tallest man who ever lived.


Simpson's paper, which is posted on the online research repository arXiv.org, is chockablock with formidable-looking mathematical equations. But as he explained in the email, his starting point was to consider the relationship between the number of individuals in a population on Earth and the body mass of those individuals:
"Ants easily outnumber us because they are small. Our larger bodies require a much greater energy supply from the local resources, so it would be impossible for us to match the ant population. Now apply this concept to intelligent life across the universe. On average, we should expect physically larger species to have fewer individuals than the smaller species. And, just like with countries, we should expect to be in one of the bigger populations. In other words, we are much more likely to find ourselves to be the ants among intelligent species."

Or, as Newsweek explained Simpson's argument, there are probably more planets with relatively small animals than planets with relatively large animals. It makes sense to assume that Earth is in the former category, which means humans are probably among the smaller intelligent beings.
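To make the sampling logic concrete, here is a minimal Monte Carlo sketch in Python. It is not Simpson's calculation: the population range, the shared per-species "energy budget," and the inverse relation between body mass and population size are illustrative assumptions, chosen only to show why a randomly chosen observer tends to belong to a populous, small-bodied species.

```python
# Toy illustration of the observer-sampling argument behind Simpson's claim.
# All numbers are made up for illustration; none come from the paper.
import random
import statistics

random.seed(0)

# Assume 1,000 intelligent species whose population sizes span several orders
# of magnitude, with typical body mass shrinking as population grows
# (a fixed, shared "energy budget" per species is the simplifying assumption).
species = []
for _ in range(1000):
    population = 10 ** random.uniform(6, 12)   # 1e6 to 1e12 individuals
    body_mass = 1e9 / population               # relative mass, arbitrary units
    species.append((population, body_mass))

# An observer is a random *individual*, so species are sampled in proportion
# to their population, not uniformly.
observers = random.choices(species, weights=[p for p, _ in species], k=100_000)

print("median body mass across species:      ",
      statistics.median(m for _, m in species))
print("median body mass of a random observer:",
      statistics.median(m for _, m in observers))
# The observer's median lands far below the species median: a randomly chosen
# individual almost certainly belongs to a populous, small-bodied species,
# which is the sense in which humans are probably among the smaller ones.
```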


What do other scientists make of Simpson's paper?

“I think the average size calculation is reasonable,” Dr. Duncan Forgan, an astrobiologist at the University of St. Andrews in Scotland who wasn't involved in the research, told Newsweek.
But to Dr. Seth Shostak, senior astronomer at the SETI Institute in Mountain View, Calif., the argument is suspect.

"There is an assumption here that intelligence can come in all (reasonable) sizes, and does so with more or less equal likelihood," Shostak told The Huffington Post in an email. "That may be true, but on Earth bigger has not always been better, at least in the brains department. Dolphins have higher IQs than whales, and crows are smarter than eagles. Octopuses are cleverer than giant squids, and obviously we’re smarter than polar bears."

Ultimately, Shostak said, we can’t know whether "little green men are actually big green men" before we actually make contact.
Until then!


Researchers Discover Fossils Of A New Species Of Terror Bird






Excerpt from huffingtonpost.com 

An army of huge carnivorous "terror birds" -- some as big as 10 feet tall -- ruled South America for tens of millions of years before going extinct some 2.5 million years ago.

Now, with the discovery of a new species of terror bird called Llallawavis scagliai, paleontologists are gaining fresh insight into this fearsome family of top predators.

More than 90 percent of the bird's fossilized skeleton was unearthed in northeastern Argentina in 2010, making it the most complete terror bird specimen ever found. 

“It’s rare to find such a complete fossil of anything, let alone a bird,” Dr. Lawrence Witmer, an Ohio University paleontologist who wasn’t involved in the new research, told Science magazine. “This is a very exciting find.”


Skeleton of Llallawavis scagliai on display at the Lorenzo Scaglia Municipal Museum of Natural Sciences in Mar del Plata, Argentina.
Preserved skeleton of Llallawavis scagliai. Bones colored in gray were missing in the specimen. Scale bar equals 0.1 m.

Llallawavis likely lived around 3.5 million years ago, near the end of terror birds' reign, according to the researchers. It stood about four feet tall and weighed about 40 pounds.

“The discovery of this species reveals that terror birds were more diverse in the Pliocene than previously thought," Dr. Federico Degrange, a researcher at the Center for Research in Earth Sciences in Argentina and the leader of the team that identified the new species, said in a written statement. "It will allow us to review the hypothesis about the decline and extinction of this fascinating group of birds.” 

CT scans of the bird's inner ear structures indicated that its hearing was tuned to low-pitched sounds, and that it likely produced similarly low, ostrich-like sounds itself.

"Low-frequency sounds are great for long-[distance] communication, or if you're a predator, for sensing the movements of prey animals," Witmer told Live Science.

The researchers hope further analyses will yield insights into the bird's vision and other senses.

An article describing the findings was published online March 20 in the Journal of Vertebrate Paleontology.


UV light reveals hidden colors in ancient shells



UV light revealed the way ancient shells looked millions of years ago.


Excerpt from perfscience.com


Using ultraviolet (UV) light, scientists have revealed the astonishing colors of about 30 ancient seashells. According to the study, published in PLOS ONE, the seashells, estimated to be between 6.6 million and 4.8 million years old, appear plain white under ordinary light; their true colors emerge only under UV light.




According to the researchers, "The biology of modern Conidae (cone snails) -- which includes the hyperdiverse genus Conus -- has been intensively studied, but the fossil record of the clade remains poorly understood, particularly within an evolutionary framework."

Under UV light, the organic matter remaining in the shells fluoresces, making them look much as they did when the animals that built them were alive. It is not yet clear which compounds in the shells emit the light when exposed to UV rays. Using the technique, the researchers documented the coloration patterns of 28 cone shell species found in the Dominican Republic; 13 of these turned out to be species not previously known, a finding that could help clarify their relationships to modern species.

San Jose State University geologist Jonathan Hendricks exposed over 350 fossil specimens to ultraviolet light. 

Comparing the coloration patterns of the ancient species with those of living animals, the researchers found many similarities, suggesting that some modern species descend from lineages that began in the Caribbean millions of years ago.

Among the newly distinguished species is Conus carlottae, which has a polka-dotted shell, a pattern not found in any modern cone snail. Researchers are now using UV light to draw out the colors of other porcelain-white seashell fossils.


10 Pictures of Europe’s Shameful “Human Zoos”

It was not too long ago that people from France, Belgium, Germany, and other countries came to visit humans who were locked up in cages. In these zoos, humans were on exhibit in front of a large audience, locked in with animals at a local zoo. Hundreds of thousands of people would visit these minorities who were on display like animals. The human zoos were a large attraction, as 18 million came to visit the World Fair in 1889, held in Paris. Over four hundred Aboriginals [...]


Top Secret Government Programs That You're Not Supposed To Know About

Originally Posted at in5d.com The following is the alleged result of the actions of one or more scientists creating a covert, unauthorized notebook documenting their involvement with an Above Top Secret government program. Government publications and information obtained by the use of public tax monies cannot be subject to copyright. This document is released into the public domain for all citizens of the United States of America. THE ‘MAJIC PROJECTS’ SIGMA is the project whic [...]


New Development in the Controversy of the ‘Yeti’ Hair Samples — Here’s the Latest



In this undated photo made available by Britain's Channel 4 television, Oxford University genetics professor Bryan Sykes poses with a prepared DNA sample taken from hair from a Himalayan animal. In 2012, researchers at Oxford University and the Lausanne Museum of Zoology issued an open call asking museums, scientists and Bigfoot aficionados to share any samples they thought were from the mythical ape-like creatures. (AP/Channel 4)



Excerpt from theblaze.com

A new study that re-analyzed so-called “yeti” hair samples from previous research that had identified them as belonging to an “anomalous ursid” might have disappointing news for those who thought the findings last year meant a “bigfoot” of sorts was still out there. Yet, the author of the original findings stands by his claims.

Research published in the journal ZooKeys found that the hair samples said to be from Central Asia and the Himalayas belong to a known species in those regions.

“We have concluded that there is no reason to believe that the two samples came from anything other than brown bears,” the authors wrote in the study abstract.


After scientists analyzed more than 30 hair samples reportedly left behind by Bigfoot and other related beasts like Yeti, they found all of them came from more mundane animals like bears, wolves, cows and raccoons. Two samples were said to have been from an “anomalous ursid,” but new analysis suggests that the samples were from brown bears. (AP/Channel 4)
These authors used mitochondrial 12S rRNA sequencing on the same samples that Oxford University's Bryan Sykes and his fellow authors used in their study published last year. The issue that Eliecer Gutierrez, a postdoctoral researcher at the Smithsonian's National Museum of Natural History, and his colleagues found with Sykes' research was that his team had relied on a single short fragment of DNA.

“We made this discovery that basically that fragment of DNA is not informative to tell apart two species of bears: the brown bear and [modern-day Alaskan] polar bear,” Gutierrez told Live Science.

At the time of his 2014 study, Sykes et al. wrote “[...] it is important to bear in mind that absence of evidence is not evidence of absence and this survey cannot refute the existence of anomalous primates, neither has it found any evidence in support. […] The techniques described here put an end to decades of ambiguity about species identification of anomalous primate samples and set a rigorous standard against which to judge any future claims.”

And Sykes still holds his ground, despite the more recent findings.
“What mattered most to us was that these two hairs were definitely not from unknown primates,” Sykes told Live Science in light of the recent research. “The explanation by Gutierrez and [Ronald] Pine might be right, or it might not be.”

To NBC News, Sykes said that Gutierrez’ findings are “entirely statistical.”

“The only way forward, as I have repeatedly said, is to find a living bear that matches the 12S RNA and study fresh material from it,” he continued. “Which involves getting off your butt, not an activity I usually associate with desk-bound molecular taxonomists.”

Daniel Loxton, an editor for Junior Skeptic, which is produced by the Skeptics Society, told Live Science that people will continue to believe in and seek out yetis, bigfoots and the like, because they are “fascinated by monsters, and they’re fascinated by mysteries in general.”

Blake Smith, in a blog post for the Skeptics Society, laid out the whole saga involving Sykes’ research and the more recent analysis by Gutierrez. Smith ultimately concluded that he’s “still convinced that Yeti and Bigfoot are not to be found in the forests and mountains of the Earth, but in the minds of people.”


What our ancient ancestors found beautiful 50,000 years ago






Excerpt from news.discovery.com

The geode (above), described in the latest issue of Comptes Rendus Palevol, was found in the Cioarei-Boroşteni Cave, Romania. A Neanderthal had painted it with ochre.

"The Neanderthal man must have certainly attached an aesthetic importance to it, while its having been painted with ochre was an addition meant to confer symbolic value," said Marin Cârciumaru of Valahia University and colleagues.

The researchers also noted that "the geode was undoubtedly introduced into the cave by the Neanderthal," since they ruled out that it could have originated in the cave itself.

Was the geode used in rituals, or was it just a treasured object of beauty? Its precise meaning to the Neanderthal remains a mystery for now.




Based on archaeological finds, necklaces made out of Spondylus (a spiky, colorful mollusk) were all the rage. (Above)

This specimen has more of a reddish hue, but Michel Louis Séfériadès of CNRS notes that most are "a highly colored, very attractive purplish crimson." Séfériadès added that the shells were valued, early trade items and that they are now "found in the archaeological remains of settlements and cemeteries, in graves, and as isolated finds."

Some of the shells were made into jewelry, including necklaces and bracelets.

 

We sing about "five gold rings," but the rings would more likely have been ivory back in the day -- as in around 50,000 years ago, before ivory-producing animals were mostly hunted to extinction.
Early humans in northern regions, for example, made rings out of mammoth ivory. A Neanderthal site at Grotte du Renne, France yielded a carefully crafted ivory ring (above), as well as grooved and perforated "personal ornaments," according to archaeologist Paul Mellars of Cambridge University.



Charcoal (shown above), ochre and other materials were applied to the face by early Homo sapiens as well as by other human subspecies.

The ochre used to paint the geode mentioned earlier was also used as makeup, hair dye and paint (to create rock and cave art), as well as to color garments.


Early humans used combs made out of shells and fish bones to both comb their hair and as personal decoration. (Above)

The shell from the Venus comb murex, a large predatory sea snail, is just one species that seems perfect for this purpose. Gibraltar Museum researchers Clive Finlayson and Kimberley Brown also found evidence that Neanderthals valued large, elaborate feathers, which the scientists suspect were worn by the individuals. 

Nearly all early cultures had coveted figurines holding probable symbolic value. Some of the earliest carved objects are known as "Venus" figurines. They present women with exaggerated sexual features. Their exact meaning remains unclear. (Above)

Pendants made of animal teeth were common and probably served many different functions, such as showing the hunter's success, offering symbolic protection, or simply as fashion.

Some of the funkiest-looking teeth were made into worn objects.
Animal teeth could be on a gift list dated to 540,000 years ago, and possibly earlier, as a recent study in the journal Nature found that a population of Homo erectus at Java, Indonesia, was collecting shark teeth and using them as tools and possibly as ornamentation.

 

The world's oldest known musical instrument is a bone flute (Above). While the earliest excavated flute dates to about 42,000 years ago, comparable flutes were probably made much earlier.

Flutes, like most of the items on this list, were not essential to survival, yet they somehow contributed to prehistoric peoples' quality of life.


Bird Thought To Be Extinct Resurfaces In Myanmar

Jerdon's Babbler

Excerpt from techtimes.com

Jerdon's Babbler is a species of bird that was believed to be extinct until this species unexpectedly resurfaced in Myanmar. This brown and white bird is roughly the size of a house sparrow. The bird was last ...


Archaeologists find two lost cities deep in Honduras jungle


Archaeologists in Honduras have found dozens of artifacts at a site where they believe twin cities stood. Photograph: Dave Yoder/National Geographic
Excerpt from theguardian.com


Archaeological team say they have set foot in a place untouched by humans for at least 600 years in a site that may be the ‘lost city of the monkey god’

Archaeologists have discovered two lost cities in the deep jungle of Honduras, emerging from the forest with evidence of a pyramid, plazas and artifacts that include the effigy of a half-human, half-jaguar spirit.
The team of specialists in archaeology and other fields, escorted by three British bushwhacking guides and a detail of Honduran special forces, explored on foot a remote valley of La Mosquitia where an aerial survey had found signs of ruins in 2012.
Chris Fisher, the lead US archaeologist on the team, told the Guardian that the expedition – co-coordinated by the film-makers Bill Benenson and Steve Elkins, Honduras and National Geographic (which first reported the story on its site) – had by all appearances set foot in a place that had gone untouched by humans for at least 600 years.
“Even the animals acted as if they’ve never seen people,” Fisher said. “Spider monkeys are all over the place, and they’d follow us around and throw food at us and hoot and holler and do their thing.”
“To be treated not as a predator but as another primate in their space was for me the most amazing thing about this whole trip,” he said.
Fisher and the team arrived by helicopter to “groundtruth” the data revealed by surveying technology called Lidar, which projects a grid of infrared beams powerful enough to break through the dense forest canopy.
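For readers curious how such a survey becomes something an archaeologist can read, the sketch below shows the basic idea in deliberately simplified form: keep the lowest return in each grid cell as a crude bare-earth estimate, then look for regular, metre-scale relief. This is an illustration only, not the expedition's actual processing pipeline, and the sample returns are hypothetical.

```python
# Simplified sketch: reduce Lidar returns to a bare-earth grid.
# Real pipelines use proper ground-point classification; this is illustrative.
import math

def bare_earth_grid(points, cell_size=1.0):
    """points: iterable of (x, y, z) laser returns in metres.
    Keeps the lowest return per grid cell as a crude ground estimate."""
    ground = {}
    for x, y, z in points:
        cell = (math.floor(x / cell_size), math.floor(y / cell_size))
        if cell not in ground or z < ground[cell]:
            ground[cell] = z
    return ground

# Hypothetical returns: ground near 100 m, a ~4 m mound, canopy hits above both.
returns = [
    (5.2, 3.1, 100.1), (5.7, 3.4, 131.0),    # ground hit + canopy hit, one cell
    (20.3, 8.9, 104.0), (20.6, 8.2, 128.5),  # mound hit + canopy hit, another cell
]

for cell, z in sorted(bare_earth_grid(returns).items()):
    print(cell, round(z, 1))
# Subtracting a smoothed ground surface from these per-cell minima is what makes
# metre-scale rectilinear features (house platforms, plazas, causeways) stand out.
```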
The dense jungle of Honduras. Photograph: Dave Yoder/National Geographic
That data showed a human-created landscape, Fisher said of sister cities not only with houses, plazas and structures, but also features “much like an English garden, with orchards and house gardens, fields of crops, and roads and paths.”
In the rainforest valley, they said they found stone structural foundations of two cities that recalled those of the Maya region, though these were not Mayan people. The area dates to between AD 1000 and 1400, and while very little is known without excavation of the site and surrounding region, Fisher said it was likely that European diseases had at least in part contributed to the culture's disappearance.
The expedition also found and documented 52 artifacts that Virgilio Paredes, head of Honduras’s national anthropology and history institute, said indicated a civilisation distinct from the Mayans. Those artifacts included a bowl with intricate carvings and semi-buried stone sculptures, including several that merged human and animal characteristics.
The cache of artifacts – “very beautiful, very fantastic,” in Fisher’s words – may have been a burial offering, he said, noting the effigies of spirit animals such as vultures and serpents.
Fisher said that while an archaeologist would likely not call these cities evidence of a lost civilisation, he would call it evidence of a culture or society. “Is it lost? Well, we don’t know anything about it,” he said.
The exploratory team did not have a permit to excavate and hopes to do so on a future expedition. “That’s the problem with archaeology is it takes a long time to get things done, another decade if we work intensively there, but then we’ll know a little more,” Fisher said.
“This wasn’t like some crazy colonial expedition of the last century,” he added.
Despite the abundance of monkeys, far too little is known of the site still to tie it to the “lost city of the monkey god” that one such expedition claimed to have discovered. In about 1940, the eccentric journalist Theodore Morde set off into the Honduran jungle in search of the legendary “white city” that Spanish conquistadors had heard tales of in the centuries before.
He broke out of the brush months later with hundreds of artifacts and extravagant stories of how ancient people worshipped their simian deity. According to Douglas Preston, the writer National Geographic sent along with its own expedition: “He refused to divulge the location out of fear, he said, that the site would be looted. He later committed suicide and his site – if it existed at all – was never identified.”
Fisher emphasised that archaeologists know extraordinarily little about the region’s ancient societies relative to the Maya civilisation, and that it would take more research and excavation. He said that although some academics might find it distasteful, expeditions financed through private means – in this case the film-makers Benenson and Elkins – would become increasingly commonplace as funding from universities and grants lessened.
Fisher also suggested that the Lidar infrared technology used to find the site would soon be as commonplace as radiocarbon dating: “People just have to get through this ‘gee-whiz’ phase and start thinking about what we can do with it.”
Paredes and Fisher also said that the pristine, densely-wooded site was dangerously close to land being deforested for beef farms that sell to fast-food chains. Global demand has driven Honduras’s beef industry, Fisher said, something that he found worrying.
“I keep thinking of those monkeys looking at me not having seen people before. To lose all this over a burger, it’s a really hard pill to swallow.”


Fresh fossil studies push the dawn of man back to 2.8 million years

(Reuters) - A 2.8-million-year-old jawbone fossil with five intact teeth unearthed in an Ethiopian desert is pushing back the dawn of humankind by about half a million years. Scientists said on Wednesday the fossil represents the oldest known repres...


What happens to your body when you give up sugar?





Excerpt from independent.co.uk
By Jordan Gaines Lewis


In neuroscience, food is something we call a “natural reward.” In order for us to survive as a species, things like eating, having sex and nurturing others must be pleasurable to the brain so that these behaviours are reinforced and repeated.
Evolution has resulted in the mesolimbic pathway, a brain system that deciphers these natural rewards for us. When we do something pleasurable, a bundle of neurons called the ventral tegmental area uses the neurotransmitter dopamine to signal to a part of the brain called the nucleus accumbens. The connection between the nucleus accumbens and our prefrontal cortex dictates our motor movement, such as deciding whether or not to take another bite of that delicious chocolate cake. The prefrontal cortex also activates hormones that tell our body: “Hey, this cake is really good. And I’m going to remember that for the future.”
Not all foods are equally rewarding, of course. Most of us prefer sweets over sour and bitter foods because, evolutionarily, our mesolimbic pathway reinforces that sweet things provide a healthy source of carbohydrates for our bodies. When our ancestors went scavenging for berries, for example, sour meant “not yet ripe,” while bitter meant “alert – poison!”
Fruit is one thing, but modern diets have taken on a life of their own. A decade ago, it was estimated that the average American consumed 22 teaspoons of added sugar per day, amounting to an extra 350 calories; it may well have risen since then. A few months ago, one expert suggested that the average Briton consumes 238 teaspoons of sugar each week.
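Those figures are easy to sanity-check. A rough conversion, assuming about 4 grams of sugar per teaspoon and 4 kilocalories per gram (standard approximations, not values stated in the article), reproduces them:

```python
# Back-of-envelope check of the quoted sugar figures.
# Assumes ~4 g of sugar per teaspoon and ~4 kcal per gram (approximations).
GRAMS_PER_TSP = 4.0
KCAL_PER_GRAM = 4.0

us_tsp_per_day = 22
uk_tsp_per_week = 238

us_kcal_per_day = us_tsp_per_day * GRAMS_PER_TSP * KCAL_PER_GRAM
uk_tsp_per_day = uk_tsp_per_week / 7

print(f"US estimate: {us_kcal_per_day:.0f} kcal/day from added sugar")  # ~352, close to the quoted 350
print(f"UK estimate: {uk_tsp_per_day:.0f} teaspoons per day")           # ~34 teaspoons a day
```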
Today, with convenience more important than ever in our food selections, it’s almost impossible to come across processed and prepared foods that don’t have added sugars for flavour, preservation, or both.
These added sugars are sneaky – and unbeknown to many of us, we’ve become hooked. In ways that drugs of abuse – such as nicotine, cocaine and heroin – hijack the brain’s reward pathway and make users dependent, increasing neuro-chemical and behavioural evidence suggests that sugar is addictive in the same way, too.

Sugar addiction is real

Anyone who knows me also knows that I have a huge sweet tooth. I always have. My friend and fellow graduate student Andrew is equally afflicted, and living in Hershey, Pennsylvania – the “Chocolate Capital of the World” – doesn’t help either of us. But Andrew is braver than I am. Last year, he gave up sweets for Lent. “The first few days are a little rough,” Andrew told me. “It almost feels like you’re detoxing from drugs. I found myself eating a lot of carbs to compensate for the lack of sugar.”
There are four major components of addiction: bingeing, withdrawal, craving, and cross-sensitisation (the notion that one addictive substance predisposes someone to becoming addicted to another). All of these components have been observed in animal models of addiction – for sugar, as well as drugs of abuse.
A typical experiment goes like this: rats are deprived of food for 12 hours each day, then given 12 hours of access to a sugary solution and regular chow. After a month of following this daily pattern, rats display behaviours similar to those on drugs of abuse. They’ll binge on the sugar solution in a short period of time, much more than their regular food. They also show signs of anxiety and depression during the food deprivation period. Many sugar-treated rats who are later exposed to drugs, such as cocaine and opiates, demonstrate dependent behaviours towards the drugs compared to rats who did not consume sugar beforehand.
Like drugs, sugar spikes dopamine release in the nucleus accumbens. Over the long term, regular sugar consumption actually changes the gene expression and availability of dopamine receptors in both the midbrain and frontal cortex. Specifically, sugar increases the concentration of a type of excitatory receptor called D1, but decreases another receptor type called D2, which is inhibitory. Regular sugar consumption also inhibits the action of the dopamine transporter, a protein which pumps dopamine out of the synapse and back into the neuron after firing.
In short, this means that repeated access to sugar over time leads to prolonged dopamine signalling, greater excitation of the brain’s reward pathways and a need for even more sugar to activate all of the midbrain dopamine receptors like before. The brain becomes tolerant to sugar – and more is needed to attain the same “sugar high.”

Sugar withdrawal is also real

Although these studies were conducted in rodents, it’s not far-fetched to say that the same primitive processes are occurring in the human brain, too. “The cravings never stopped, [but that was] probably psychological,” Andrew told me. “But it got easier after the first week or so.”
In a 2002 study by Carlo Colantuoni and colleagues of Princeton University, rats who had undergone a typical sugar dependence protocol then underwent “sugar withdrawal.” This was facilitated by either food deprivation or treatment with naloxone, a drug used for treating opiate addiction which binds to receptors in the brain’s reward system. Both withdrawal methods led to physical problems, including teeth chattering, paw tremors, and head shaking. Naloxone treatment also appeared to make the rats more anxious, as they spent less time on an elevated apparatus that lacked walls on either side.
Similar withdrawal experiments by others also report behaviour similar to depression in tasks such as the forced swim test. Rats in sugar withdrawal are more likely to show passive behaviours (like floating) than active behaviours (like trying to escape) when placed in water, suggesting feelings of helplessness.
A new study published by Victor Mangabeira and colleagues in this month’s Physiology & Behavior reports that sugar withdrawal is also linked to impulsive behaviour. Initially, rats were trained to receive water by pushing a lever. After training, the animals returned to their home cages and had access to a sugar solution and water, or just water alone. After 30 days, when rats were again given the opportunity to press a lever for water, those who had become dependent on sugar pressed the lever significantly more times than control animals, suggesting impulsive behaviour.
These are extreme experiments, of course. We humans aren’t depriving ourselves of food for 12 hours and then allowing ourselves to binge on soda and doughnuts at the end of the day. But these rodent studies certainly give us insight into the neuro-chemical underpinnings of sugar dependence, withdrawal, and behaviour.
Through decades of diet programmes and best-selling books, we’ve toyed with the notion of “sugar addiction” for a long time. There are accounts of those in “sugar withdrawal” describing food cravings, which can trigger relapse and impulsive eating. There are also countless articles and books about the boundless energy and new-found happiness in those who have sworn off sugar for good. But despite the ubiquity of sugar in our diets, the notion of sugar addiction is still a rather taboo topic.
Are you still motivated to give up sugar? You might wonder how long it will take until you’re free of cravings and side-effects, but there’s no answer – everyone is different and no human studies have been done on this. But after 40 days, it’s clear that Andrew had overcome the worst, likely even reversing some of his altered dopamine signalling. “I remember eating my first sweet and thinking it was too sweet,” he said. “I had to rebuild my tolerance.”
And as regulars of a local bakery in Hershey – I can assure you, readers, that he has done just that.
Jordan Gaines Lewis is a Neuroscience Doctoral Candidate at Penn State College of Medicine


Bees Do It, Humans Do It ~ Bees can experience false memories, scientists say



Excerpt from csmonitor.com


Researchers at Queen Mary University of London have found the first evidence of false memories in non-human animals.

It has long been known that humans – even those of us who aren't famous news anchors – tend to recall events that did not actually occur. The same is likely true for mice: In 2013, scientists at MIT induced false memories of trauma in mice, and the following year, they used light to manipulate mice brains to turn painful memories into pleasant ones.

Now, researchers at Queen Mary University of London have shown for the first time that insects, too, can create false memories. Using a classic Pavlovian experiment, co-authors Kathryn Hunt and Lars Chittka determined that bumblebees sometimes combine the details of past memories to form new ones. Their findings were published today in Current Biology.
“I suspect the phenomenon may be widespread in the animal kingdom," Dr. Chittka said in a written statement to the Monitor.
First, Chittka and Dr. Hunt trained their buzzing subjects to expect a reward if they visited two artificial flowers – one solid yellow, the other with black-and-white rings. The order didn’t matter, so long as the bee visited both flowers. In later tests, they would present a choice of the original two flower types, plus one new one. The third type was a combination of the first two, featuring yellow-and-white rings. At first, the bees consistently selected the original two flowers, the ones that offered a reward.

But a good night’s sleep seemed to change all that. One to three days after training, the bees became confused and started incorrectly choosing the yellow-and-white flower (up to fifty percent of the time). They seemed to associate that pattern with a reward, despite having never actually seen it before. In other words, the bumblebees combined the memories of two previous stimuli to generate a new, false memory.
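One intuitive way to picture the result is a feature-pooling account: if the bee stores the rewarded flowers as bundles of features rather than as distinct episodes, a never-seen combination of those features can look just as familiar as the originals. The sketch below is a hypothetical toy model to illustrate that idea; it is not the model used by Hunt and Chittka.

```python
# Hypothetical feature-overlap sketch of the 'merged memory' effect.
# Illustrative only; not the authors' analysis.
rewarded_memories = [
    {"colour": "yellow",      "pattern": "solid"},  # trained flower 1 (rewarded)
    {"colour": "black-white", "pattern": "rings"},  # trained flower 2 (rewarded)
]

test_flowers = {
    "solid yellow":      {"colour": "yellow",      "pattern": "solid"},
    "black-white rings": {"colour": "black-white", "pattern": "rings"},
    "yellow rings":      {"colour": "yellow",      "pattern": "rings"},  # novel
}

def familiarity(flower):
    # Count feature matches against *all* stored memories, as if details from
    # separate rewarded episodes were pooled rather than kept distinct.
    return sum(flower[k] == memory[k] for memory in rewarded_memories for k in flower)

for name, flower in test_flowers.items():
    print(f"{name:18s} familiarity = {familiarity(flower)}")
# The never-seen yellow-ringed flower scores as high as the trained flowers, so a
# bee that pools features across memories has no basis for rejecting it.
```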

“Bees might, on occasion, form merged memories of flower patterns visited in the past,” Chittka said. “Should a bee unexpectedly encounter real flowers that match these false memories, they might experience a kind of deja-vu and visit these flowers expecting a rich reward.”

Bees have a rather limited brain capacity, Chittka says, so it’s probably useful for them to “economize” by storing generalized memories instead of minute details.

“In bees, for example, the ability to learn more than one flower type is certainly useful,” Chittka said, “as is the ability to extract commonalities of multiple flower patterns. But this very ability might come at the cost of bees merging memories from multiple sequential experiences.”

Chittka has studied memory in bumblebees for two decades. Bees can be raised and kept in a lab setting, so they make excellent long-term test subjects.

“They are [also] exceptionally clever animals that can memorize the colors, patterns, and scents of multiple flower species – as well as navigate efficiently over long distances,” Chittka said.

In past studies, it was assumed that animals that failed to perform learned tasks had either forgotten them or hadn’t really learned them in the first place. Chittka’s research seems to show that animal memory mechanisms are much more elaborate – at least when it comes to bumblebees.

“I think we need to move beyond understanding animal memory as either storing or not storing stimuli or episodes,” Chittka said. “The contents of memory are dynamic. It is clear from studies on human memory that they do not just fade over time, but can also change and integrate with other memories to form new information. The same is likely to be the case in many animals.”

Chittka hopes this study will lead to a greater biological understanding of false memories – in animals and humans alike. He says that false memories aren’t really a “bug in the system,” but a side effect of complex brains that strive to learn the big picture and to prepare for new experiences.

“Errors in human memory range from misremembering minor details of events to generating illusory memories of entire episodes,” Chittka said. “These inaccuracies have wide-ranging implications in crime witness accounts and in the courtroom, but I believe that – like the quirks of information processing that occur in well known optical illusions – they really are the byproduct of otherwise adaptive processes.”

“The ability to memorize the overarching principles of a number of different events might help us respond in previously un-encountered situations,” Chittka added. “But these abilities might come at the expense of remembering every detail correctly.”
So, if generating false memories goes hand in hand with having a nervous system, does all this leave Brian Williams off the hook?

“It is possible that he conflated the memories,” Chittka said, “depending on his individual vulnerability to witnessing a traumatic event, plus a possible susceptibility to false memories – there is substantial inter-person variation with respect to this. It is equally possible that he was just ‘showing off’ when reporting the incident, and is now resorting to a simple lie to try to escape embarrassment. That is impossible for me to diagnose.”

But if Mr. Williams genuinely did misremember his would-be brush with death, Chittka says he shouldn’t be vilified.

“You cannot morally condemn someone for reporting something they think really did happen to them,” Chittka said. “You cannot blame an Alzheimer patient for forgetting to blow out the candle, even if they burn down the house as a result. In the same way, you can't blame someone who misremembers a crime as a result of false memory processes."


Earth’s Moon May Not Be Critical to Life After All




Excerpt from space.com

The moon has long been viewed as a crucial component in creating an environment suitable for the evolution of complex life on Earth, but a number of scientific results in recent years have shown that perhaps our planet doesn't need the moon as much as we have thought.

In 1993, French astronomer Jacques Laskar ran a series of calculations indicating that the gravity of the moon is vital to stabilizing the tilt of our planet. Earth's obliquity, as this tilt is technically known, has huge repercussions for climate. Laskar argued that should Earth's obliquity wander over hundreds of thousands of years, it would cause environmental chaos by creating a climate too variable for complex life to develop in relative peace.
So his argument goes, we should feel remarkably lucky to have such a large moon on our doorstep, as no other terrestrial planet in our solar system has such a moon. Mars' two satellites, Phobos and Deimos, are tiny, captured asteroids that have little known effect on the Red Planet. Consequently, Mars' tilt wobbles chaotically over timescales of millions of years, with evidence for swings in its rotational axis at least as large as 45 degrees. 


The stroke of good fortune that led to Earth possessing an unlikely moon, specifically the collision 4.5 billion years ago between Earth and a Mars-sized proto-planet that produced the debris from which our Moon formed, has become one of the central tenets of the 'Rare Earth' hypothesis. Famously promoted by Peter Ward and Don Brownlee, it argues that planets where everything is just right for complex life are exceedingly rare.

New findings, however, are tearing up the old rule book. In 2011, a trio of scientists — Jack Lissauer of NASA Ames Research Center, Jason Barnes of the University of Idaho and John Chambers of the Carnegie Institution for Science — published results from new simulations describing what Earth's obliquity would be like without the moon. What they found was surprising.

"We were looking into how obliquity might vary for all sorts of planetary systems," says Lissauer. "To test our code we began with integrations following the obliquity of Mars and found similar results to other people. But when we did the obliquity of Earth we found the variations were much smaller than expected — nowhere near as extreme as previous calculations suggested they would be."
Lissauer's team found that without the moon, Earth's rotational axis would wobble by only 10 degrees more than its present-day angle of 23.5 degrees. The reason for such vastly different results from those obtained by Jacques Laskar is pure computing power. Today's computers are much faster and capable of more accurate modeling with far more data than computers of the 1990s.

Lissauer and his colleagues also found that if Earth were spinning fast, with one day lasting less than 10 hours, or rotating retrograde (i.e. backwards so that the sun rose in the West and set in the East), then Earth stabilized itself thanks to the gravitational resonances with other planets, most notably giant Jupiter. There would be no need for a large moon. 

Earth's rotation has not always been as leisurely as the current 24-hour spin rate. Following the impact that formed the moon, Earth was spinning once every four or five hours, but it has since been gradually slowed by the moon's presence. As for the length of Earth's day prior to the moon-forming impact, nobody really knows, but some models of the impact developed by Robin Canup of the Southwest Research Institute, in Boulder, Colorado, suggest that Earth could have been rotating fast, or even retrograde, prior to the collision.

Tilted Orbits
Planets with inclined orbits could find that their increased obliquity is beneficial to their long-term climate – as long as they do not have a large moon.


"Collisions in the epoch during which Earth was formed determined its initial rotation," says Lissauer. "For rocky planets, some of the models say most of them will be prograde, but others say comparable numbers of planets will be prograde and retrograde. Certainly, retrograde worlds are not expected to be rare."

The upshot of Lissauer's findings is that the presence of a moon is not the be all and end all as once thought, and a terrestrial planet can exist without a large moon and still retain its habitability. Indeed, it is possible to imagine some circumstances where having a large moon would actually be pretty bad for life.

Rory Barnes, of the University of Washington, has also tackled the problem of obliquity, but from a different perspective. Planets on the edge of habitable zones exist in a precarious position, far enough away from their star that, without a thick, insulating atmosphere, they freeze over, just like Mars. Barnes and his colleagues, including John Armstrong of Weber State University, realized that torques from other nearby worlds could cause a planet's inclination to the ecliptic plane to vary. This in turn would result in a change of obliquity; the greater the inclination, the greater the obliquity to the Sun. Barnes and Armstrong saw that this could be a good thing for planets on the edges of habitable zones, allowing heat to be distributed evenly over geological timescales and preventing "Snowball Earth" scenarios. They called these worlds "tilt-a-worlds," but the presence of a large moon would counteract this beneficial obliquity change.

"I think one of the most important points from our tilt-a-world paper is that at the outer edge of the habitable zone, having a large moon is bad, there's no other way to look at it," says Barnes. "If you have a large moon that stabilizes the obliquity then you have a tendency to completely freeze over."

Barnes is impressed with the work of Lissauer's team.
"I think it is a well done study," he says. "It suggests that Earth does not need the moon to have a relatively stable climate. I don't think there would be any dire consequences to not having a moon."

Mars' Changing Tilt
The effects of changing obliquity on Mars’ climate. Mars’ current 25-degree tilt is seen at top left. At top right is a Mars with high obliquity, leading to ice gathering at its equator while the poles point sunward. At bottom is Mars with low obliquity, which sees its polar caps grow in size.


Of course, the moon does have a hand in other factors important to life besides planetary obliquity. Tidal pools may have been the point of origin of life on Earth. Although the moon produces the largest tides, the sun also influences tides, so the lack of a large moon is not necessarily a stumbling block. Some animals have also evolved a life cycle based on the cycle of the moon, but that's more happenstance than an essential component for life.

"Those are just minor things," says Lissauer.

Without the absolute need for a moon, astrobiologists seeking life and habitable worlds elsewhere face new opportunities. Maybe Earth, with its giant moon, is actually the oddball amongst habitable planets. Rory Barnes certainly doesn't think we need it.
"It will be a step forward to see the myth that a habitable planet needs a large moon dispelled," he says, to which Lissauer agrees.
Earth without its moon might therefore remain habitable, but we should still cherish its friendly presence. After all, would Beethoven have written the Moonlight Sonata without it?


This work is licensed under a Creative Commons Attribution 4.0 International License, unless otherwise marked.
