Tag: species

Are We An Alien Experiment?

Although it's possible those responsible for our Earthen experiment may possess a far different form than we, I feel it more probable we were created in our family's image. – Greg

Excerpt from rense.com

Even the most hardened skeptic mus...


Bird Thought To Be Extinct Resurfaces In Myanmar

Jerdon's Babbler

Excerpt from techtimes.com

Jerdon's Babbler is a species of bird that was believed to be extinct until it unexpectedly resurfaced in Myanmar. This brown and white bird is roughly the size of a house sparrow. The bird was last ...


Fresh fossil studies push the dawn of man back to 2.8 million years

(Reuters) - A 2.8-million-year-old jawbone fossil with five intact teeth unearthed in an Ethiopian desert is pushing back the dawn of humankind by about half a million years.Scientists said on Wednesday the fossil represents the oldest known repres...


What happens to your body when you give up sugar?





Excerpt from independent.co.uk
By Jordan Gaines Lewis


In neuroscience, food is something we call a “natural reward.” In order for us to survive as a species, things like eating, having sex and nurturing others must be pleasurable to the brain so that these behaviours are reinforced and repeated.
Evolution has resulted in the mesolimbic pathway, a brain system that deciphers these natural rewards for us. When we do something pleasurable, a bundle of neurons called the ventral tegmental area uses the neurotransmitter dopamine to signal to a part of the brain called the nucleus accumbens. The connection between the nucleus accumbens and our prefrontal cortex dictates our motor movement, such as deciding whether or not to take another bite of that delicious chocolate cake. The prefrontal cortex also activates hormones that tell our body: “Hey, this cake is really good. And I’m going to remember that for the future.”
Not all foods are equally rewarding, of course. Most of us prefer sweets over sour and bitter foods because, evolutionarily, our mesolimbic pathway reinforces that sweet things provide a healthy source of carbohydrates for our bodies. When our ancestors went scavenging for berries, for example, sour meant “not yet ripe,” while bitter meant “alert – poison!”
Fruit is one thing, but modern diets have taken on a life of their own. A decade ago, it was estimated that the average American consumed 22 teaspoons of added sugar per day, amounting to an extra 350 calories; it may well have risen since then. A few months ago, one expert suggested that the average Briton consumes 238 teaspoons of sugar each week.
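As a rough check – assuming about 4 g of sugar per teaspoon and 4 kcal per gram of carbohydrate, standard approximations rather than figures from the article – the American estimate is internally consistent, and the British one converts to about 34 teaspoons a day:

\[
22\ \tfrac{\text{tsp}}{\text{day}} \times 4\ \tfrac{\text{g}}{\text{tsp}} \times 4\ \tfrac{\text{kcal}}{\text{g}} = 352\ \tfrac{\text{kcal}}{\text{day}} \approx 350\ \text{calories};\qquad
\frac{238\ \text{tsp/week}}{7\ \text{days/week}} = 34\ \tfrac{\text{tsp}}{\text{day}}.
\]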
Today, with convenience more important than ever in our food selections, it’s almost impossible to come across processed and prepared foods that don’t have added sugars for flavour, preservation, or both.
These added sugars are sneaky – and unbeknown to many of us, we’ve become hooked. Drugs of abuse – such as nicotine, cocaine and heroin – hijack the brain’s reward pathway and make users dependent, and mounting neuro-chemical and behavioural evidence suggests that sugar is addictive in the same way, too.

Sugar addiction is real

Anyone who knows me also knows that I have a huge sweet tooth. I always have. My friend and fellow graduate student Andrew is equally afflicted, and living in Hershey, Pennsylvania – the “Chocolate Capital of the World” – doesn’t help either of us. But Andrew is braver than I am. Last year, he gave up sweets for Lent. “The first few days are a little rough,” Andrew told me. “It almost feels like you’re detoxing from drugs. I found myself eating a lot of carbs to compensate for the lack of sugar.”
There are four major components of addiction: bingeing, withdrawal, craving, and cross-sensitisation (the notion that one addictive substance predisposes someone to becoming addicted to another). All of these components have been observed in animal models of addiction – for sugar, as well as drugs of abuse.
A typical experiment goes like this: rats are deprived of food for 12 hours each day, then given 12 hours of access to a sugary solution and regular chow. After a month of following this daily pattern, rats display behaviours similar to those on drugs of abuse. They’ll binge on the sugar solution in a short period of time, consuming much more of it than of their regular food. They also show signs of anxiety and depression during the food-deprivation period. Many sugar-treated rats who are later exposed to drugs, such as cocaine and opiates, demonstrate dependent behaviours towards the drugs more readily than rats who did not consume sugar beforehand.
Like drugs, sugar spikes dopamine release in the nucleus accumbens. Over the long term, regular sugar consumption actually changes the gene expression and availability of dopamine receptors in both the midbrain and frontal cortex. Specifically, sugar increases the concentration of a type of excitatory receptor called D1, but decreases another receptor type called D2, which is inhibitory. Regular sugar consumption also inhibits the action of the dopamine transporter, a protein which pumps dopamine out of the synapse and back into the neuron after firing.
In short, this means that repeated access to sugar over time leads to prolonged dopamine signalling, greater excitation of the brain’s reward pathways and a need for even more sugar to activate all of the midbrain dopamine receptors like before. The brain becomes tolerant to sugar – and more is needed to attain the same “sugar high.”

Sugar withdrawal is also real

Although these studies were conducted in rodents, it’s not far-fetched to say that the same primitive processes are occurring in the human brain, too. “The cravings never stopped, [but that was] probably psychological,” Andrew told me. “But it got easier after the first week or so.”
In a 2002 study by Carlo Colantuoni and colleagues of Princeton University, rats who had undergone a typical sugar dependence protocol then underwent “sugar withdrawal.” This was facilitated by either food deprivation or treatment with naloxone, a drug used for treating opiate addiction which binds to receptors in the brain’s reward system. Both withdrawal methods led to physical problems, including teeth chattering, paw tremors, and head shaking. Naloxone treatment also appeared to make the rats more anxious, as they spent less time on an elevated apparatus that lacked walls on either side.
Similar withdrawal experiments by others also report behaviour similar to depression in tasks such as the forced swim test. Rats in sugar withdrawal are more likely to show passive behaviours (like floating) than active behaviours (like trying to escape) when placed in water, suggesting feelings of helplessness.
A new study published by Victor Mangabeira and colleagues in this month’s Physiology & Behavior reports that sugar withdrawal is also linked to impulsive behaviour. Initially, rats were trained to receive water by pushing a lever. After training, the animals returned to their home cages and had access to a sugar solution and water, or just water alone. After 30 days, when rats were again given the opportunity to press a lever for water, those who had become dependent on sugar pressed the lever significantly more times than control animals, suggesting impulsive behaviour.
These are extreme experiments, of course. We humans aren’t depriving ourselves of food for 12 hours and then allowing ourselves to binge on soda and doughnuts at the end of the day. But these rodent studies certainly give us insight into the neuro-chemical underpinnings of sugar dependence, withdrawal, and behaviour.
Through decades of diet programmes and best-selling books, we’ve toyed with the notion of “sugar addiction” for a long time. There are accounts of those in “sugar withdrawal” describing food cravings, which can trigger relapse and impulsive eating. There are also countless articles and books about the boundless energy and new-found happiness in those who have sworn off sugar for good. But despite the ubiquity of sugar in our diets, the notion of sugar addiction is still a rather taboo topic.
Are you still motivated to give up sugar? You might wonder how long it will take until you’re free of cravings and side-effects, but there’s no answer – everyone is different and no human studies have been done on this. But after 40 days, it’s clear that Andrew had overcome the worst, likely even reversing some of his altered dopamine signalling. “I remember eating my first sweet and thinking it was too sweet,” he said. “I had to rebuild my tolerance.”
And as a fellow regular of a local bakery in Hershey, I can assure you, readers, that he has done just that.
Jordan Gaines Lewis is a Neuroscience Doctoral Candidate at Penn State College of Medicine


Should Humanity Try to Contact Alien Civilizations?



Some researchers want to use big radio dishes like the 305-meter Arecibo Observatory in Puerto Rico to announce our presence to intelligent aliens.



Excerpt from space.com
by Mike Wall

Is it time to take the search for intelligent aliens to the next level?
For more than half a century, scientists have been scanning the heavens for signals generated by intelligent alien life. They haven't found anything conclusive yet, so some researchers are advocating adding an element called "active SETI" (search for extraterrestrial intelligence) — not just listening, but also beaming out transmissions of our own designed to catch aliens' eyes.

Active SETI "may just be the approach that lets us make contact with life beyond Earth," Douglas Vakoch, director of interstellar message composition at the SETI Institute in Mountain View, California, said earlier this month during a panel discussion at the annual meeting of the American Association for the Advancement of Science (AAAS) in San Jose.

Seeking contact


Vakoch envisions using big radio dishes such as the Arecibo Observatory in Puerto Rico to blast powerful, information-laden transmissions at nearby stars, in a series of relatively cheap, small-scale projects.

"Whenever any of the planetary radar folks are doing their asteroid studies, and they have an extra half an hour before or after, there's always a target star readily available that they can shift to without a lot of extra slew time," he said.

The content of any potential active SETI message is a subject of considerable debate. If it were up to astronomer Seth Shostak, Vakoch's SETI Institute colleague, we'd beam the entire Internet out into space.

"It's like sending a lot of hieroglyphics to the 19th century — they [aliens] can figure it out based on the redundancy," Shostak said during the AAAS discussion. "So, I think in terms of messages, we should send everything."

While active SETI could help make humanity's presence known to extrasolar civilizations, the strategy could also aid the more traditional "passive" search for alien intelligence, Shostak added.
"If you're going to run SETI experiments, where you're trying to listen for a putative alien broadcast, it may be very instructive to have to construct a transmitting project," he said. "Because now, you walk a mile in the Klingons' shoes, assuming they have them."

Cause for concern?

But active SETI is a controversial topic. Humanity has been a truly technological civilization for only a few generations; we're less than 60 years removed from launching our first satellite to Earth orbit, for example. So the chances are that any extraterrestrials who pick up our signals would be far more advanced than we are. 

This likelihood makes some researchers nervous, including famed theoretical physicist Stephen Hawking.

"Such advanced aliens would perhaps become nomads, looking to conquer and colonize whatever planets they could reach," Hawking said in 2010 on an episode of "Into the Universe with Stephen Hawking," a TV show that aired on the Discovery Channel. "If so, it makes sense for them to exploit each new planet for material to build more spaceships so they could move on. Who knows what the limits would be?"

Astrophysicist and science fiction author David Brin voiced similar concerns during the AAAS event, saying there's no reason to assume that intelligent aliens would be altruistic.

"This is an area in which discussion is called for," Brin said. "What are the motivations of species that they might carry with them into their advanced forms, that might color their cultures?"

Brin stressed that active SETI shouldn't be done in a piecemeal, ad hoc fashion by small groups of astronomers.

"This is something that should be discussed worldwide, and it should involve our peers in many other specialties, such as history," he said. "The historians would tell us, 'Well, gee, we have some examples of first-contact scenarios between advanced technological civilizations and not-so-advanced technological civilizations.' Gee, how did all of those turn out? Even when they were handled with goodwill, there was still pain."

Out there already

Vakoch and Shostak agreed that international discussion and cooperation are desirable. But Shostak said that achieving any kind of consensus on the topic of active SETI may be difficult. For example, what if polling reveals that 60 percent of people on Earth are in favor of the strategy, while 40 percent are opposed?

"Do we then have license to go ahead and transmit?" Shostak said. "That's the problem, I think, with this whole 'let's have some international discussion' [idea], because I don't know what the decision metric is."

Vakoch and Shostak also said that active SETI isn't as big a leap as it may seem at first glance: Our civilization has been beaming signals out into the universe unintentionally for a century, since the radio was invented.

"The reality is that any civilization that has the ability to travel between the stars can already pick up our accidental radio and TV leakage," Vakoch said. "A civilization just 200 to 300 years more advanced than we are could pick up our leakage radiation at a distance of several hundred light-years. So there are no increased dangers of an alien invasion through active SETI."

But Brin disputed this assertion, saying the so-called "barn door excuse" is a myth.

"It is very difficult for advanced civilizations to have picked us up at our noisiest in the 1980s, when we had all these military radars and these big television antennas," he said.

Shostak countered that a fear of alien invasion, if taken too far, could hamper humanity's expansion throughout the solar system, an effort that will probably require the use of high-powered transmissions between far-flung outposts.

"Do you want to hamstring all that activity — not for the weekend, not just shut down the radars next week, or active SETI this year, but shut down humanity forever?" Shostak said. "That's a price I'm not willing to pay."

So the discussion and debate continues — and may continue for quite some time.

"This is the only really important scientific field without any subject matter," Brin said. "It's an area in which opinion rules, and everybody has a very fierce opinion."


Bees Do It, Humans Do It ~ Bees can experience false memories, scientists say



Excerpt from csmonitor.com


Researchers at Queen Mary University of London have found the first evidence of false memories in non-human animals.

It has long been known that humans – even those of us who aren't famous news anchors – tend to recall events that did not actually occur. The same is likely true for mice: In 2013, scientists at MIT induced false memories of trauma in mice, and the following year, they used light to manipulate mouse brains to turn painful memories into pleasant ones.

Now, researchers at Queen Mary University of London have shown for the first time that insects, too, can create false memories. Using a classic Pavlovian experiment, co-authors Kathryn Hunt and Lars Chittka determined that bumblebees sometimes combine the details of past memories to form new ones. Their findings were published today in Current Biology.
“I suspect the phenomenon may be widespread in the animal kingdom," Dr. Chittka said in a written statement to the Monitor.
First, Chittka and Dr. Hunt trained their buzzing subjects to expect a reward if they visited two artificial flowers – one solid yellow, the other with black-and-white rings. The order didn’t matter, so long as the bee visited both flowers. In later tests, they would present a choice of the original two flower types, plus one new one. The third type was a combination of the first two, featuring yellow-and-white rings. At first, the bees consistently selected the original two flowers, the ones that offered a reward.

But a good night’s sleep seemed to change all that. One to three days after training, the bees became confused and started incorrectly choosing the yellow-and-white flower (up to fifty percent of the time). They seemed to associate that pattern with a reward, despite having never actually seen it before. In other words, the bumblebees combined the memories of two previous stimuli to generate a new, false memory.

“Bees might, on occasion, form merged memories of flower patterns visited in the past,” Chittka said. “Should a bee unexpectedly encounter real flowers that match these false memories, they might experience a kind of deja-vu and visit these flowers expecting a rich reward.”

Bees have a rather limited brain capacity, Chittka says, so it’s probably useful for them to “economize” by storing generalized memories instead of minute details.

“In bees, for example, the ability to learn more than one flower type is certainly useful,” Chittka said, “as is the ability to extract commonalities of multiple flower patterns. But this very ability might come at the cost of bees merging memories from multiple sequential experiences.”

Chittka has studied memory in bumblebees for two decades. Bees can be raised and kept in a lab setting, so they make excellent long-term test subjects.

“They are [also] exceptionally clever animals that can memorize the colors, patterns, and scents of multiple flower species – as well as navigate efficiently over long distances,” Chittka said.

In past studies, it was assumed that animals that failed to perform learned tasks had either forgotten them or hadn’t really learned them in the first place. Chittka’s research seems to show that animal memory mechanisms are much more elaborate – at least when it comes to bumblebees.

“I think we need to move beyond understanding animal memory as either storing or not storing stimuli or episodes,” Chittka said. “The contents of memory are dynamic. It is clear from studies on human memory that they do not just fade over time, but can also change and integrate with other memories to form new information. The same is likely to be the case in many animals.”

Chittka hopes this study will lead to a greater biological understanding of false memories – in animals and humans alike. He says that false memories aren’t really a “bug in the system,” but a side effect of complex brains that strive to learn the big picture and to prepare for new experiences.

“Errors in human memory range from misremembering minor details of events to generating illusory memories of entire episodes,” Chittka said. “These inaccuracies have wide-ranging implications in crime witness accounts and in the courtroom, but I believe that – like the quirks of information processing that occur in well known optical illusions – they really are the byproduct of otherwise adaptive processes.”

“The ability to memorize the overarching principles of a number of different events might help us respond in previously un-encountered situations,” Chittka added. “But these abilities might come at the expense of remembering every detail correctly.”
So, if generating false memories goes hand in hand with having a nervous system, does all this let Brian Williams off the hook?

“It is possible that he conflated the memories,” Chittka said, “depending on his individual vulnerability to witnessing a traumatic event, plus a possible susceptibility to false memories – there is substantial inter-person variation with respect to this. It is equally possible that he was just ‘showing off’ when reporting the incident, and is now resorting to a simple lie to try to escape embarrassment. That is impossible for me to diagnose.”

But if Mr. Williams genuinely did misremember his would-be brush with death, Chittka says he shouldn’t be vilified.

“You cannot morally condemn someone for reporting something they think really did happen to them,” Chittka said. “You cannot blame an Alzheimer’s patient for forgetting to blow out the candle, even if they burn down the house as a result. In the same way, you can't blame someone who misremembers a crime as a result of false memory processes."


Another Problem for Evolution Theory? ‘Big Brain’ Gene Found in Humans, But Not in Chimps



Image: Mouse brain
M. Florio and W. Huttner / Max Planck Institute
This embryonic mouse cerebral cortex was stained to identify cell nuclei (in blue) and a marker for deep-layer neurons (in red). The human-specific gene known as ARHGAP11B was selectively expressed in the right hemisphere: Note the folding of the neocortical surface.

Excerpt from nbcnews.com

By Tia Ghose

A single gene may have paved the way for the rise of human intelligence by dramatically increasing the number of neurons found in a key brain region. 

This gene seems to be uniquely human: It is found in modern-day humans, Neanderthals and another branch of extinct humans called Denisovans, but not in chimpanzees. 

By allowing the brain region called the neocortex to contain many more neurons, the tiny snippet of DNA may have laid the foundation for the human brain's massive expansion.
"It is so cool that one tiny gene alone may suffice to affect the phenotype of the stem cells, which contributed the most to the expansion of the neocortex," said study lead author Marta Florio, a doctoral candidate in molecular and cellular biology and genetics at the Max Planck Institute of Molecular Cell Biology and Genetics in Dresden, Germany. 

She and her colleagues found that the gene, called ARHGAP11B, is turned on and highly activated in the human neural progenitor cells, but isn't present at all in mouse cells. This tiny snippet of DNA, just 804 genetic bases long, was once part of a much longer gene. Somehow, this fragment was duplicated, and the duplicated fragment was inserted into the human genome. 

In follow-up experiments, the team inserted and turned on this DNA snippet in the brains of mice. The mice with the gene insertion grew what looked like larger neocortex regions. 

The researchers reviewed a wide variety of genomes from modern-day and extinct species — confirming that Neanderthals and Denisovans had this gene, while chimpanzees and mice do not. That suggests that the gene emerged soon after humans split off from chimpanzees, and that it helped pave the way for the rapid expansion of the human brain. 

Florio stressed that the gene is probably just one of many genetic changes that make human cognition special.

The gene was described in a paper published online Thursday by the journal Science.


Why science is so hard to believe

 


Excerpt from washingtonpost.com


There’s a scene in Stanley Kubrick’s comic masterpiece “Dr. Strangelove” in which Jack D. Ripper, an American general who’s gone rogue and ordered a nuclear attack on the Soviet Union, unspools his paranoid worldview — and the explanation for why he drinks “only distilled water, or rainwater, and only pure grain alcohol” — to Lionel Mandrake, a dizzy-with-anxiety group captain in the Royal Air Force.
Ripper: “Have you ever heard of a thing called fluoridation? Fluoridation of water?”
Mandrake: “Ah, yes, I have heard of that, Jack. Yes, yes.”
Ripper: “Well, do you know what it is?”
Mandrake: “No. No, I don’t know what it is, no.”
Ripper: “Do you realize that fluoridation is the most monstrously conceived and dangerous communist plot we have ever had to face?” 

The movie came out in 1964, by which time the health benefits of fluoridation had been thoroughly established and anti-fluoridation conspiracy theories could be the stuff of comedy. Yet half a century later, fluoridation continues to incite fear and paranoia. In 2013, citizens in Portland, Ore., one of only a few major American cities that don’t fluoridate, blocked a plan by local officials to do so. Opponents didn’t like the idea of the government adding “chemicals” to their water. They claimed that fluoride could be harmful to human health.

Actually fluoride is a natural mineral that, in the weak concentrations used in public drinking-water systems, hardens tooth enamel and prevents tooth decay — a cheap and safe way to improve dental health for everyone, rich or poor, conscientious brushers or not. That’s the scientific and medical consensus.
To which some people in Portland, echoing anti-fluoridation activists around the world, reply: We don’t believe you.
We live in an age when all manner of scientific knowledge — from the safety of fluoride and vaccines to the reality of climate change — faces organized and often furious opposition. Empowered by their own sources of information and their own interpretations of research, doubters have declared war on the consensus of experts. There are so many of these controversies these days, you’d think a diabolical agency had put something in the water to make people argumentative.
Science doubt has become a pop-culture meme. In the recent movie “Interstellar,” set in a futuristic, downtrodden America where NASA has been forced into hiding, school textbooks say the Apollo moon landings were faked.


In a sense this is not surprising. Our lives are permeated by science and technology as never before. For many of us this new world is wondrous, comfortable and rich in rewards — but also more complicated and sometimes unnerving. We now face risks we can’t easily analyze.
We’re asked to accept, for example, that it’s safe to eat food containing genetically modified organisms (GMOs) because, the experts point out, there’s no evidence that it isn’t and no reason to believe that altering genes precisely in a lab is more dangerous than altering them wholesale through traditional breeding. But to some people, the very idea of transferring genes between species conjures up mad scientists running amok — and so, two centuries after Mary Shelley wrote “Frankenstein,” they talk about Frankenfood.
The world crackles with real and imaginary hazards, and distinguishing the former from the latter isn’t easy. Should we be afraid that the Ebola virus, which is spread only by direct contact with bodily fluids, will mutate into an airborne super-plague? The scientific consensus says that’s extremely unlikely: No virus has ever been observed to completely change its mode of transmission in humans, and there’s zero evidence that the latest strain of Ebola is any different. But Google “airborne Ebola” and you’ll enter a dystopia where this virus has almost supernatural powers, including the power to kill us all.
In this bewildering world we have to decide what to believe and how to act on that. In principle, that’s what science is for. “Science is not a body of facts,” says geophysicist Marcia McNutt, who once headed the U.S. Geological Survey and is now editor of Science, the prestigious journal. “Science is a method for deciding whether what we choose to believe has a basis in the laws of nature or not.”
The scientific method leads us to truths that are less than self-evident, often mind-blowing and sometimes hard to swallow. In the early 17th century, when Galileo claimed that the Earth spins on its axis and orbits the sun, he wasn’t just rejecting church doctrine. He was asking people to believe something that defied common sense — because it sure looks like the sun’s going around the Earth, and you can’t feel the Earth spinning. Galileo was put on trial and forced to recant. Two centuries later, Charles Darwin escaped that fate. But his idea that all life on Earth evolved from a primordial ancestor and that we humans are distant cousins of apes, whales and even deep-sea mollusks is still a big ask for a lot of people.
Even when we intellectually accept these precepts of science, we subconsciously cling to our intuitions — what researchers call our naive beliefs. A study by Andrew Shtulman of Occidental College showed that even students with an advanced science education had a hitch in their mental gait when asked to affirm or deny that humans are descended from sea animals and that the Earth goes around the sun. Both truths are counterintuitive. The students, even those who correctly marked “true,” were slower to answer those questions than questions about whether humans are descended from tree-dwelling creatures (also true but easier to grasp) and whether the moon goes around the Earth (also true but intuitive).
Shtulman’s research indicates that as we become scientifically literate, we repress our naive beliefs but never eliminate them entirely. They nest in our brains, chirping at us as we try to make sense of the world.
Most of us do that by relying on personal experience and anecdotes, on stories rather than statistics. We might get a prostate-specific antigen test, even though it’s no longer generally recommended, because it caught a close friend’s cancer — and we pay less attention to statistical evidence, painstakingly compiled through multiple studies, showing that the test rarely saves lives but triggers many unnecessary surgeries. Or we hear about a cluster of cancer cases in a town with a hazardous-waste dump, and we assume that pollution caused the cancers. Of course, just because two things happened together doesn’t mean one caused the other, and just because events are clustered doesn’t mean they’re not random. Yet we have trouble digesting randomness; our brains crave pattern and meaning.
Even for scientists, the scientific method is a hard discipline. They, too, are vulnerable to confirmation bias — the tendency to look for and see only evidence that confirms what they already believe. But unlike the rest of us, they submit their ideas to formal peer review before publishing them. Once the results are published, if they’re important enough, other scientists will try to reproduce them — and, being congenitally skeptical and competitive, will be very happy to announce that they don’t hold up. Scientific results are always provisional, susceptible to being overturned by some future experiment or observation. Scientists rarely proclaim an absolute truth or an absolute certainty. Uncertainty is inevitable at the frontiers of knowledge.
That provisional quality of science is another thing a lot of people have trouble with. To some climate-change skeptics, for example, the fact that a few scientists in the 1970s were worried (quite reasonably, it seemed at the time) about the possibility of a coming ice age is enough to discredit what is now the consensus of the world’s scientists: The planet’s surface temperature has risen by about 1.5 degrees Fahrenheit in the past 130 years, and human actions, including the burning of fossil fuels, are extremely likely to have been the dominant cause since the mid-20th century.
It’s clear that organizations funded in part by the fossil-fuel industry have deliberately tried to undermine the public’s understanding of the scientific consensus by promoting a few skeptics. The news media gives abundant attention to such mavericks, naysayers, professional controversialists and table thumpers. The media would also have you believe that science is full of shocking discoveries made by lone geniuses. Not so. The (boring) truth is that science usually advances incrementally, through the steady accretion of data and insights gathered by many people over many years. So it has been with the consensus on climate change. That’s not about to go poof with the next thermometer reading.
But industry PR, however misleading, isn’t enough to explain why so many people reject the scientific consensus on global warming.
The “science communication problem,” as it’s blandly called by the scientists who study it, has yielded abundant new research into how people decide what to believe — and why they so often don’t accept the expert consensus. It’s not that they can’t grasp it, according to Dan Kahan of Yale University. In one study he asked 1,540 Americans, a representative sample, to rate the threat of climate change on a scale of zero to 10. Then he correlated that with the subjects’ science literacy. He found that higher literacy was associated with stronger views — at both ends of the spectrum. Science literacy promoted polarization on climate, not consensus. According to Kahan, that’s because people tend to use scientific knowledge to reinforce their worldviews.
Americans fall into two basic camps, Kahan says. Those with a more “egalitarian” and “communitarian” mind-set are generally suspicious of industry and apt to think it’s up to something dangerous that calls for government regulation; they’re likely to see the risks of climate change. In contrast, people with a “hierarchical” and “individualistic” mind-set respect leaders of industry and don’t like government interfering in their affairs; they’re apt to reject warnings about climate change, because they know what accepting them could lead to — some kind of tax or regulation to limit emissions.
In the United States, climate change has become a litmus test that identifies you as belonging to one or the other of these two antagonistic tribes. When we argue about it, Kahan says, we’re actually arguing about who we are, what our crowd is. We’re thinking: People like us believe this. People like that do not believe this.
Science appeals to our rational brain, but our beliefs are motivated largely by emotion, and the biggest motivation is remaining tight with our peers. “We’re all in high school. We’ve never left high school,” says Marcia McNutt. “People still have a need to fit in, and that need to fit in is so strong that local values and local opinions are always trumping science. And they will continue to trump science, especially when there is no clear downside to ignoring science.”
Meanwhile the Internet makes it easier than ever for science doubters to find their own information and experts. Gone are the days when a small number of powerful institutions — elite universities, encyclopedias and major news organizations — served as gatekeepers of scientific information. The Internet has democratized it, which is a good thing. But along with cable TV, the Web has also made it possible to live in a “filter bubble” that lets in only the information with which you already agree.
How to penetrate the bubble? How to convert science skeptics? Throwing more facts at them doesn’t help. Liz Neeley, who helps train scientists to be better communicators at an organization called Compass, says people need to hear from believers they can trust, who share their fundamental values. She has personal experience with this. Her father is a climate-change skeptic and gets most of his information on the issue from conservative media. In exasperation she finally confronted him: “Do you believe them or me?” She told him she believes the scientists who research climate change and knows many of them personally. “If you think I’m wrong,” she said, “then you’re telling me that you don’t trust me.” Her father’s stance on the issue softened. But it wasn’t the facts that did it.
If you’re a rationalist, there’s something a little dispiriting about all this. In Kahan’s descriptions of how we decide what to believe, what we decide sometimes sounds almost incidental. Those of us in the science-communication business are as tribal as anyone else, he told me. We believe in scientific ideas not because we have truly evaluated all the evidence but because we feel an affinity for the scientific community. When I mentioned to Kahan that I fully accept evolution, he said: “Believing in evolution is just a description about you. It’s not an account of how you reason.”
Maybe — except that evolution is real. Biology is incomprehensible without it. There aren’t really two sides to all these issues. Climate change is happening. Vaccines save lives. Being right does matter — and the science tribe has a long track record of getting things right in the end. Modern society is built on things it got right.
Doubting science also has consequences, as seen in recent weeks with the measles outbreak that began in California. The people who believe that vaccines cause autism — often well educated and affluent, by the way — are undermining “herd immunity” to such diseases as whooping cough and measles. The anti-vaccine movement has been going strong since a prestigious British medical journal, the Lancet, published a study in 1998 linking a common vaccine to autism. The journal later retracted the study, which was thoroughly discredited. But the notion of a vaccine-autism connection has been endorsed by celebrities and reinforced through the usual Internet filters. (Anti-vaccine activist and actress Jenny McCarthy famously said on “The Oprah Winfrey Show,” “The University of Google is where I got my degree from.”)
In the climate debate, the consequences of doubt are likely to be global and enduring. Climate-change skeptics in the United States have achieved their fundamental goal of halting legislative action to combat global warming. They haven’t had to win the debate on the merits; they’ve merely had to fog the room enough to keep laws governing greenhouse gas emissions from being enacted.
Some environmental activists want scientists to emerge from their ivory towers and get more involved in the policy battles. Any scientist going that route needs to do so carefully, says Liz Neeley. “That line between science communication and advocacy is very hard to step back from,” she says. In the debate over climate change, the central allegation of the skeptics is that the science saying it’s real and a serious threat is politically tinged, driven by environmental activism and not hard data. That’s not true, and it slanders honest scientists. But the claim becomes more likely to be seen as plausible if scientists go beyond their professional expertise and begin advocating specific policies.
It’s their very detachment, what you might call the cold-bloodedness of science, that makes science the killer app. It’s the way science tells us the truth rather than what we’d like the truth to be. Scientists can be as dogmatic as anyone else — but their dogma is always wilting in the hot glare of new research. In science it’s not a sin to change your mind when the evidence demands it. For some people, the tribe is more important than the truth; for the best scientists, the truth is more important than the tribe.


Will new ruling finally free Lolita after 40 years in captivity at Miami Seaquarium?



Excerpt from seattletimes.com

A decision to list the captive orca Lolita for federal protection is expected to set the stage for a lawsuit from advocates seeking the whale’s release.

Seattle Times staff reporter



A Puget Sound orca held for decades at Miami’s Seaquarium will gain the protection of the federal Endangered Species Act, a move expected to set the stage for a lawsuit from advocates seeking the whale’s release.

The National Oceanic and Atmospheric Administration (NOAA) announced Wednesday the decision to list Lolita as part of the southern resident killer whales of Puget Sound, which already are considered endangered under the federal act. 

Whale activists, who petitioned for this status, have long campaigned for Lolita’s return to Puget Sound. They hope the listing will provide a stronger legal case to release Lolita than did a previous lawsuit that centered on alleged violations of the federal Animal Welfare Act.

“This gives leverage under a much stronger law,” said Howard Garrett of the Whidbey Island-based Orca Network, which hopes a San Juan Island cove will one day serve as the site for Lolita to re-enter the wild.

NOAA Fisheries officials on Wednesday described their decision in narrow terms, which set no broader precedents. It does not address whether Lolita should be released from the Seaquarium.
“This is a listing decision,” said Will Stelle, the NOAA Fisheries regional administrator for the West Coast. “It is not a decision to free Lolita.” 

Aquarium officials have repeatedly said they have no intention of releasing the orca. 

“Lolita has been part of the Miami Seaquarium family for 44 years,” said Andrew Hertz, Seaquarium general manager, in a statement. 

“Lolita is healthy and thriving in her home where she shares habitat with Pacific white-sided dolphins. There is no scientific evidence that ... Lolita could survive in a sea pen or the open waters of the Pacific Northwest, and we are not willing to treat her life as an experiment.”

Orcas, also known as killer whales, are found in many of the world’s oceans. The southern resident population, which spends several months each year in Puget Sound, is the only group listed in the U.S. under the Endangered Species Act. 

The three pods in the population were reduced by marine-park captures between 1965 and 1975, NOAA says. Among them was a roundup in Penn Cove in which seven whales were captured, including Lolita. 

The southern resident pods now number fewer than 80. Possible causes for the decline are reduced prey, pollutants that could cause reproductive problems and oil spills, according to NOAA Fisheries.
Under the Endangered Species Act, it is illegal to cause a “take” of a protected orca, which includes harming or harassing them.
Wednesday, NOAA officials said holding an animal captive, in and of itself, does not constitute a take. 

Orca activists are expected to argue in their lawsuit that Lolita's cramped conditions result in a prohibited take.

There is “rising public scorn for the whole idea of performing orcas,” said Garrett, who hopes Seaquarium will decide to release Lolita without a court order. 

But NOAA officials still have concerns about releasing captive whales, and any plan to move or release Lolita would require “rigorous scientific review,” the agency said in a statement.
The concerns include the possibility of disease transmission, the ability of a newly released orca to find food and behavior patterns from captivity that could impact wild whales.

NOAA said previous attempts to release captive orcas and dolphins have often been unsuccessful and some have ended in death.

Garrett said the plan for Lolita calls for her to be taken to a netted area of the cove, which could be enlarged later. She would be accompanied by familiar trainers who could “trust and reassure her every bit of the way,” he said. 

The controversy over releasing captive whales has been heightened by the experience of Keiko, a captive orca that starred in the 1993 movie “Free Willy,” about a boy who pushed for the release of a whale.

In 1998, Keiko was brought back to his native waters off Iceland to reintroduce him to life in the wild. That effort ended in 2003 when he died in a Norwegian fjord. 

Garrett, who visited Keiko in Iceland in 1999, said he was impressed by the reintroduction effort, and that there was plenty of evidence that Keiko was able to catch fish on his own.

“The naysayers predicted that as soon as he got into the (Icelandic) waters he would die, and wild orcas would kill him,” Garrett said. “He proved that 180-degrees wrong. He loved it.”

Mark Simmons, who for two years served as director of animal husbandry for the Keiko-release effort, has a different view. He says Keiko never was able to forage for fish on his own, and that he continued to seek out human contact at every opportunity. 

Simmons wrote a book called “Killing Keiko” that accuses the release effort of leading to a long, slow death for the orca, which he says lacked food and then succumbed to an infection.

“It’s not really the fact that Keiko died, but how he died,” Garrett said Wednesday.


Scientists discover organism that hasn’t evolved in more than 2 billion years



Nonevolving bacteria
These sulfur bacteria haven't evolved for billions of years.
Credit: UCLA Center for the Study of Evolution and the Origin of Life

Excerpt from natmonitor.com
By Justin Beach

If there were a Guinness World Record for not evolving, it would be held by a sulfur-cycling microorganism found off the coast of Australia. According to research published in the Proceedings of the National Academy of Sciences, it has not evolved in any way in more than two billion years and has survived five mass extinction events.
According to the researchers behind the paper, the lack of evolution actually supports Charles Darwin’s theory of evolution by natural selection.
The researchers examined the microorganisms, which are too small to see with the naked eye, in samples of rocks from the coastal waters of Western Australia. Next they examined samples of the same bacteria from the same region in rocks 2.3 billion years old. Both sets of bacteria are indistinguishable from modern sulfur bacteria found off the coast of Chile.





“It seems astounding that life has not evolved for more than 2 billion years — nearly half the history of the Earth. Given that evolution is a fact, this lack of evolution needs to be explained,” said J. William Schopf, a UCLA professor of earth, planetary and space sciences in the UCLA College who was the study’s lead author in a statement.
Critics of Darwin’s theory of evolution might be tempted to jump on this discovery as proof that Darwin was wrong, but that would be a mistake.
Darwin’s work focused more on species that changed, rather than species that didn’t. However, there is nothing in Darwin’s work that states that a successful species that has found its niche in an ecosystem has to change. Unless there is change in the ecosystem or competition for resources, there would be no reason for change.
“The rule of biology is not to evolve unless the physical or biological environment changes, which is consistent with Darwin. These microorganisms are well-adapted to their simple, very stable physical and biological environment. If they were in an environment that did not change but they nevertheless evolved, that would have shown that our understanding of Darwinian evolution was seriously flawed.” said Schopf, who also is director of UCLA’s Center for the Study of Evolution and the Origin of Life.
It is likely that there were genetic mutations in the organisms. Mutations are fairly random and happen in all species, but unless those mutations are improvements that help the species function better in the environment, they usually do not get passed on.
Schopf said that the findings provide further proof that Darwin’s ideas were right.
The oldest fossils analyzed for the study date back to the Great Oxidation Event. This event, which occurred between 2.2 and 2.4 billion years ago, saw a substantial increase in Earth’s oxygen levels. That period also saw an increase in sulfates and nitrates, which is all that the microorganisms would have needed to survive and reproduce.
Schopf and his team used Raman spectroscopy, which allows scientists to examine the composition and chemistry of rocks, as well as confocal laser scanning microscopy to generate 3-D images of fossils embedded in rock.
The research was funded by the NASA Astrobiology Institute, in the hope that it will help the space agency find life elsewhere.


10 Mysterious Biblical Figures No One Can Explain






listverse.com

The canonical Bible is filled with mysterious characters, many of whom drop in for a cameo, do their thing, and then slide out, never to be heard from again. Some are merely extras, but some have a contextual presence that begs further examination. And some are, well, just weird.

10  Melchizedek


Probably the single most mysterious figure in the Bible, Melchizedek was a priest-king of Salem (later known as Jerusalem) in the time of Abram (Abraham), suggesting a religious organization, complete with ritual and hierarchy, that predated the Jewish nation and their priestly lineage from the tribe of Levi. He is only portrayed as active in one passage, although he is alluded to once in Psalms, and several times in the New Testament’s Epistle to the Hebrews.
Some Jewish traditions insist that Melchizedek was Shem, Noah’s son. He is thought of, in Christian circles, as a proto-messiah, embodying certain traits later given to Christ. New Testament writings assert that Christ was “a priest forever in the order of Melchizedek,” indicating an older and deeper covenant with God than the Abrahamic-Levite lineage.
Hebrews 7, though, presents him in a more unusual light. In verses 3 and 4:
“Without father, without mother, without descent, having neither beginning of days, nor end of life; but made like unto the Son of God; abideth a priest continually. Now consider how great this man was, unto whom even the patriarch Abraham gave the tenth of the spoils.”
Not only do these verses grant Melchizedek a hierarchical level above the most important Jewish patriarch, they assign him mystical qualities. Some take this to mean an earlier incarnation of Christ. Others see it as an ancient manifestation of the Holy Spirit. His identity, role, and theological function have long been debated.
The paucity of scriptural references has added to the mystery, making him a somewhat spectral figure. As such, newer spiritual traditions, as well as New Age quacks, have taken liberties with his persona. Gnostics insisted he became Jesus, and he is cited as a high-level priest in Masonic and Rosicrucian lore. Joseph Smith wrote that he was the greatest of all prophets, and Mormons still trace their priesthood back to him. The Urantia Book, a 20th-century pseudo-Bible that claims to merge religion, philosophy, and science, insists he’s the first in an evolutionary succession of deification manifestations, with Abraham being his first convert.
There is even a school of thought that Melchizedek is a title or assumed character name, sort of a theological 007, played by a series of Judeo-Christian James Bonds. 

The lore of Melchizedek is confusing but deep and fascinating. Apocryphal books give us more details, some cryptic, some relatively mundane. The Second Book of Enoch is particularly informative, insisting Melchizedek was born of a Virgin. When his mother Sophonim (the wife of Noah’s brother Nir) died in childbirth, he sat up, clothed himself, and sat beside her corpse, praying and preaching. After 40 days, he was taken by an archangel to the Garden of Eden, protected by angels and avoiding the Great Flood without passage on Uncle Noah’s ark.

9  Cain’s Wife


Cain was, according to Genesis, the first human ever born. He later killed his younger brother Abel in a hissy fit after Abel’s sacrifice of meat was more favored than Cain’s sacrificial fruit basket. God put a mark on Cain and cursed the ground he farmed, forcing him into a life as a wandering fugitive. 

That part of the story is fairly well known. Later, though, we read that he settled in the Land of Nod, and, all of a sudden, he has a wife. Absolutely nothing else is mentioned about her. We don’t even know where she came from. In fact, the question of where Cain got his wife, when his immediate family were apparently the only people in the world, has sent many a perceptive young Sunday schooler down the road of skepticism. 

Some have posited a mysterious other tribe of people, maybe created after Adam and Eve, maybe even another race or species. But the standard response is that Adam and Eve had many other sons and daughters to populate the Earth. The only way to keep the human race going would be to mate with siblings, nieces, nephews, and cousins. 

In fact, though the Holy Bible is silent on her identity, the apocryphal Book of Jubilees tells us exactly who Cain’s wife was: his sister Awan, who bore his son Enoch.

8  Joseph Barsabbas


After Judas Iscariot turned in his resignation by selling out his boss, Jesus’s disciples rushed to fill the open position and bring the number back up to a more theologically apt 12. The remaining disciples, including the newly convinced Thomas, looked over the candidates from the 120 or so adherents who followed Jesus. Then they cast lots to pick who would fill the position. 

It went to Matthias, a fairly mysterious character himself. We don’t know where he came from or his previous occupation. Some think he was actually the diminutive Zacchaeus, the tax collector who climbed a sycamore tree to get a better glimpse of Jesus’s ride on the donkey.
The man who lost out was Joseph Barsabbas, also known as Joseph Justus. We know nothing solid about him, even less than we know about Matthias.
There is, however, one bit of interesting speculation. A list of names presented in Mark 6:3 includes some of Christ’s earliest and most loyal adherents. One of these is a man named Joses, and another is James the Just. Biblical scholar Robert Eisenman suggests that James carried on Jesus’s work, and the writer of the Book of Acts assigned him an alias to minimize his importance.

7  The Beloved Disciple


In the Gospel of John, several references are made to “the disciple whom Jesus loved.” This particular favorite is present at the Last Supper, the crucifixion, and after the resurrection. The writer of the Gospel of John even states that the testimony of this disciple is the basis for the text. But there is considerable debate over the identity of this mystery figure.
The most obvious nominee is John the Apostle, one of Christ’s inner circle of 12 and the namesake of the Gospel. But none of the 12 apostles were present at the crucifixion, so that crosses him off the list. Lazarus, resurrected by Christ, is also considered. He seems to have been present at the cited events and is referred to specifically, in the story of His death and resurrection, as “he whom Thou lovest.”
Mary Magdalene, Judas, Jesus’s brother James, or an unnamed disciple, possibly even a Roman or governmental official, have all been considered. There is even a school of thought that John is an interactive gospel, with the reader being the beloved disciple.

6  Simon Magus


“Simony” is the selling of church position or privilege. It is named for Simon Magus, or Simon the Magician, who makes only a brief appearance in the Bible, in Acts 8:9–24. Simon has since become synonymous with heretical thought and religious exploitation.
He is presented as a powerful magician with a large following in Samaria, who converts to Christianity and wishes to learn from the apostles Peter and Philip. When he sees the gifts of the Holy Spirit, including speaking in tongues and an ecstatic spiritual state, he offers the men money if they will give him the secret to passing these gifts to others. They are not amused.
Apocryphal texts reveal quite a bit more, like his alleged ability to levitate and even fly, emphasizing that he was something akin to a cult leader in his hometown. It is suggested that his conversion is more for economic purposes than spiritual, and he set himself up as a messianic figure himself, competing for the Jesus dollar with his own homespun theology.
He is thought by some to be a founder of Gnosticism, a patchwork of various religious systems that relied heavily on Judaic and Christian symbolism.

5  Onan


Not unlike Simon Magus, Onan’s brief appearance inspired a name for a particular action.
He was the second son of Abraham’s great-grandson Judah, the patriarch and namesake of one of the 12 tribes of Israel. His older brother, Er (yes, just “Er”), was “wicked in the sight of the Lord,” so God killed him. What he did to deserve such an execution remains a mystery.
Tradition at the time dictated that Er’s widow, Tamar, become Onan’s wife. Onan had to impregnate her to keep the lineage alive, but he was not wild about the idea. Maybe it was the thought of impending fatherhood, or maybe Tamar just wasn’t his type. So, taking matters into his own hands, he committed the first recorded act of coitus interruptus. Or, as Genesis 38:9 so poetically put it: “And Onan knew that the seed should not be his; and it came to pass, when he went in unto his brother’s wife, that he spilled it on the ground, lest that he should give seed to his brother.” God was displeased and slew Onan.
The whole tale gets even more sordid. Onan had a younger brother, Shelah. Customarily, he would have been next in line to impregnate Tamar, but Judah forbade it. Tamar, rather than graciously accepting forced spinsterhood, seduced Judah and became pregnant by the old man. Judah fathered twins Zerah and Perez, the latter of whom was listed by Matthew as an ancestor of Jesus’s earthly father Joseph...
Some have even suggested that Onan’s death warns that sex is meant only for purposes of reproduction, and not for pleasure.

4  Nicodemus


Nicodemus was a member of the Sanhedrin, a council of men who ruled on Jewish law and governance. He became a friend, follower, and intellectual foil for Jesus, whose egalitarian teachings often ran counter to the Sanhedrin’s rigid decrees. He was also a Pharisee, part of a class of Jewish leaders who toadied to the Roman government at the time of Christ’s arrest and subsequent crucifixion.
He is mentioned three times in the New Testament, all in the Gospel of John. He subtly defends Jesus as the Pharisees discuss His impending arrest. Later, he helps prepare Jesus’s body for burial, indicating he had become an adherent to Christ and His teachings.
The first time he is mentioned, however, is in dialogue with Jesus, and these conversations reveal some of the most important aspects of Christian theology, such as the notion of being “born again” and the most famous reference to the divinity of Christ, John 3:16: “For God so loved the world, that he gave his only begotten Son, that whosoever believeth in him should not perish, but have everlasting life.”
This detailed conversation explores the divide between the Old Covenant’s dogmatic and exclusive Jewish Law and the New Covenant’s spiritually inclusive concepts. But for a vital contributor to such an important passage of the New Testament, Nicodemus remains a mysterious figure. Some scholars have suggested he may be Nicodemus ben Gurion, a Talmudic figure of wealth and mystical power. Christian tradition suggests he was martyred, and he is venerated as a saint. His name has come to be synonymous with seekers of the truth, and he appears as a character in many works of biblically inspired fiction.

3  James The Just


He is considered, next to Paul and Peter, the most important apostolic figure in the Church’s history. The Book of Acts specifically names him the head of the Christian church in Jerusalem, and he is frequently cited, both scripturally and apocryphally, as being consulted by both Paul and Peter. So who is he?
Traditionally, he is thought of as Jesus’s brother (or, more precisely, His half-brother). Jesus is listed, in the Gospels, as having siblings, some younger than Him. One was named James.
But James was a common name, and there are several mentioned in the Bible. Two of the 12 disciples were named James, but both are listed as having different fathers than Jesus, and neither went on to become James the Just. James the son of Zebedee went on to be known as James the Great, and James the son of Alphaeus was called James the Less.
It is known that he was a contemporary of Jesus, although he seems to have had no real inner-circle status during Christ’s ministry. The apocryphal Gospel of Thomas says Christ Himself designated James to lead the movement upon His death. The Apostle Paul initially seems respectful, even subservient, to “James the Lord’s brother,” calling him a “pillar” of the movement, even though Paul would later disagree with him on matters of doctrine.
Some, though, have suggested the “brother” designation was spiritual rather than physical. St. Jerome, among others, suggested that the doctrine of perpetual virginity indicated James could be a cousin, which, given the tribal associations and clannishness of the Jewish community of the time, seems plausible. Such a relationship would imply a certain social proximity without James being a true sibling.

2  Simon The Zealot


Of Christ’s 12 disciples, none is more mysterious than Simon the Zealot. His name was meant to differentiate him from Simon Peter, and it has come to suggest, for some, that he was a member of a similarly named political movement that advocated Jewish defiance of Roman rule. Some have speculated that he acted, within Christ’s inner circle, as a political adviser; his presence, in that reading, would indicate that Jesus had a revolutionary political agenda.
The truth is much less exciting. The “Zealot” movement did not arise until long after the time when Christ would have given Simon his sobriquet, and there has never been any serious evidence that Simon, despite the designation, was a political radical. The name, and the word upon which it is based, did not take on those aggressive undertones until the movement itself was in full swing. More than likely, Simon was given his name for intense spiritual devotion rather than any radical political stance.
Nothing else is known of him, at least not with any certainty. The Catholic Encyclopedia mentions him as possibly being a brother or cousin of Jesus, though with no real evidence. The Eastern Orthodox tradition says he developed his zeal when Jesus attended his wedding and changed water into wine. Some legends say he was martyred; the philosopher Justus Lipsius somehow got it into his head that he was sawn in half.

1  Og


Cited twice specifically, but alluded to frequently in general terms, the Nephilim were a race of violent giants who lived in the pre-Flood world alongside humanity. Were they, as some suggest, the offspring of demons and human women? Fallen angels themselves? Or simply the descendants of Seth mentioned in the Dead Sea Scrolls, a tribe of cranky cases cursed by God for their rebelliousness? Regardless, they endured and became known by other names, like the Rephaim, and frequently battled humans for land and power.
The most storied of them was Og, the King of Bashan. He was killed, along with his entire army, and his kingdom was ransacked. All of the survivors—men, women, and children—were put to death, and the strongest and most powerful line of Nephilim descendants was eliminated. Some Nephilim bloodlines continued to do battle with the Israelites, though they were becoming less powerful and dying out. One tribe, the Anakim, allied themselves with the human tribes in Philistia. Goliath was thought to have been one of the last few descendants of the Nephilim.
Goliath’s height is given in the earliest manuscripts as 275 centimeters (9′). That’s hardly as awe-inspiring as the creature lying in Og’s bed, which measured, according to Deuteronomy, nine cubits, or roughly 410 centimeters (13′6″). That’s basically Yao Ming sitting on Shaquille O’Neal’s shoulders.
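
For anyone checking the arithmetic, a quick worked conversion (assuming the common approximation of about 45.7 centimeters to the cubit, a figure the biblical text itself does not give):

$$
9 \text{ cubits} \times 45.7\ \mathrm{cm/cubit} \approx 411\ \mathrm{cm}, \qquad \frac{411\ \mathrm{cm}}{2.54\ \mathrm{cm/in}} \approx 162\ \mathrm{in} \approx 13'6''
$$

The same conversion puts Goliath at $275 / 2.54 \approx 108$ inches, almost exactly 9 feet.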
Biblically, descendants of the Nephilim could not have survived the Flood, yet Og and other giants appear as post-Flood figures. Some biblical literalists have attributed their later existence to the descendants of Noah’s family hooking up, once again, with demons. Or perhaps, being fallen angels and not human, the giants simply survived the Flood.
Jewish tradition goes deeper into the lore of the Nephilim and their descendants, departing from the biblical account. It tells of Og booking passage on the Ark by promising to act as a slave to Noah and his family. Other accounts have him hanging on to the side of the Ark and riding out the flood rodeo-style.

View Article Here Read More

New Religion and Science Study Reveals ‘Post-Seculars’ Reject Evolution





Excerpt from huffingtonpost.com

(RNS) Meet the “Post-Seculars” — the one in five Americans whom no one seems to have noticed in the endless rounds of debates pitting science vs. religion.

They’re more strongly religious than most “Traditionals” (43 percent of Americans) and more scientifically knowledgeable than “Moderns” (36 percent) who stand on science alone, according to two sociologists’ findings in a new study.

“We were surprised to find this pretty big group (21 percent) who are pretty knowledgeable and appreciative about science and technology but who are also very religious and who reject certain scientific theories,” said Timothy O’Brien, co-author of the research study, released Thursday (Jan. 29) in the American Sociological Review.

Put another way, there’s a sizable chunk of Americans out there who are both religious and scientifically minded but who break with both packs when faith and science collide.

Post-Seculars pick and choose among scientific and religious views to create their own “personally compelling way of understanding the world,” said O’Brien, an assistant professor at the University of Evansville in Indiana.

O’Brien and co-author Shiri Noy, an assistant professor of sociology at the University of Wyoming, examined responses from 2,901 people to 18 questions on knowledge of and attitudes toward science, plus four religion-related questions, in the General Social Surveys conducted in 2006, 2008 and 2010.
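
The article does not describe the authors’ statistical method, but the group-level numbers reported below are, in effect, agreement rates tabulated within each cluster of respondents. A minimal sketch of that kind of tabulation in Python, using toy data and entirely hypothetical column names (not the authors’ code or the actual GSS variable names):

    # Tabulate percent agreement with each science statement, by group.
    # Toy data; the column names are hypothetical, not real GSS variables.
    import pandas as pd

    df = pd.DataFrame({
        "group": ["Traditional", "Modern", "Post-Secular", "Modern"],
        "evolution_agree": [0, 1, 0, 1],  # 1 = agrees humans evolved
        "big_bang_agree": [0, 1, 0, 1],   # 1 = agrees universe began with explosion
        "drift_agree": [1, 1, 1, 1],      # 1 = agrees continents move
    })

    # The mean of a 0/1 flag is the share agreeing; multiply by 100 for percent.
    pct = df.groupby("group").mean(numeric_only=True) * 100
    print(pct.round(1))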

Many findings fit the usual way the science-religion divide is viewed:

— Moderns, who stand on reason, scored highest on scientific knowledge and lowest on religion questions regarding biblical authority and the strength of their religious ties.

— Traditionals, who lean toward religion, scored lower on science facts and were least likely to agree that “the benefits of scientific research outweigh the harmful results.”

However, the data turned up a third perspective – people who defied the familiar breakdown. The authors dubbed them “Post-Secular” to jump past a popular theory that Americans are moving away from religion to become more secular, O’Brien said.

Post-Seculars — about half of whom identify as conservative Protestants — know facts such as how lasers work, what antibiotics do and the way genetics affect inherited illnesses.

But when it comes to the three main areas where science and Christian-centric religious views conflict — human evolution, the Big Bang origin of the universe and the age of the Earth — Post-Seculars break from the pack, with views significantly different from those of Traditionals and Moderns.

Areas where the factions are clear:


The universe began with a huge explosion:
Traditional: 21 percent
Modern: 68 percent
Post-Secular: 6 percent

Human beings developed from earlier species of animals:
Traditional: 33 percent
Modern: 88 percent
Post-Secular: 3 percent

The continents have been moving for millions of years and will move in the future:
Traditional: 66 percent
Modern: 98 percent
Post-Secular: 80 percent

“Post-Seculars are smart. They know what scientists think. They just don’t agree on some key issues, and that has impact on their political views,” said O’Brien.

When the authors looked at views on the authority of the Bible and how strongly people said they were affiliated with their religion, Post-Seculars put the most faith in Scripture and were much more inclined to say they were strongly religious. And where science and faith conflict on hot-button issues, they side with the religious perspective.

For example, Moderns are the most supportive of embryonic stem cell research and abortion rights for women, but Post-Seculars, who are nonetheless largely positive about science and society, are more skeptical in both areas, O’Brien said.

Candidates running in the 2016 elections might take note.

Where people fall in these three groups can predict their attitudes on political issues where science and religion both have claims, O’Brien said, even after accounting for the usual suspects — social class, political ideology or church attendance.

View Article Here Read More

Elon Musk drops space plans into Seattle’s lap




Excerpt from seattletimes.com


Of all the newcomers we’ve seen here lately, one of the more interesting is Elon Musk.

The famous entrepreneur isn’t going to live here, at least not yet. But earlier this month he did announce plans to bulk up an engineering center near Seattle for his SpaceX venture. The invitation-only event was held in the shadow of the Space Needle.
If the plan happens, SpaceX would join Planetary Resources and Blue Origin in a budding Puget Sound space hub. With talent from Boeing, the aerospace cluster and the University of Washington, this offers fascinating potential for the region’s future.

Elon Musk sounds like the name of a character from a novel that would invariably include the sentence, “he had not yet decided whether to use his powers for good or for evil.”

He is said to have been the inspiration for the character Tony Stark, played by Robert Downey Jr. in the “Iron Man” movies. He’s also been compared to Steve Jobs and even Thomas Edison.

The real Musk seems like a nice-enough chap, at least based on his ubiquitous appearances in TED talks and other venues.

Even the semidishy essay in Marie Claire magazine by his first wife, Justine, is mostly about the challenge to the marriage as Musk became very rich, very young, started running with a celebrity crowd and exhibited the monomaniacal behavior common to the entrepreneurial tribe.

A native of South Africa, Musk emigrated to Canada and then to the United States, where he earned degrees from the University of Pennsylvania, including one from its prestigious Wharton School. He left Stanford’s Ph.D. program in applied physics after two days to start a business.
In 1995, he co-founded Zip2, an early Internet venture for newspapers. Four years later, he co-founded what would become PayPal. With money from eBay’s acquisition of PayPal, he started SpaceX. He also invested in Tesla Motors, the electric-car company, eventually becoming chief executive. Then there’s SolarCity, a major provider of solar-power systems.

Musk has said that early on he sensed three major trends would drive the future: the Internet, the quest for sustainable energy and space exploration. He’s got skin in all three games.

At age 43, Musk is seven years younger than Jeff Bezos and more than 15 years younger than Bill Gates.

His achievements haven’t come without controversy. Tesla played off several states against each other for a battery factory. Nevada, desperate to diversify its low-wage economy, won, if you can call it that.

The price tag was $1.4 billion in incentives, and whether it ever pays off for the state is a big question. A Fortune magazine investigation portrayed Musk not merely as a visionary but also as a master manipulator with a shaky deal. Musk, no shrinking violet, fired back on his blog.

SpaceX is a combination of the practical and the hyperambitious, some would say dreamy.

On the practical side, the company is one of those chosen by the U.S. government to resupply the International Space Station. Musk also hopes to put 4,000 satellites in low-Earth orbit to provide inexpensive Internet access worldwide.

The satellite venture will be based here, with no financial incentives from the state.

But he also wants to make space travel less expensive, generate “a lot of money” through SpaceX, and eventually establish a Mars colony.

“SpaceX, or some combination of companies and governments, needs to make progress in the direction of making life multiplanetary, of establishing a base on another planet, on Mars — being the only realistic option — and then building that base up until we’re a true multiplanet species,” he said during a TED presentation.

It’s heady stuff. And attractive enough to lead Google and Fidelity Investments to commit $1 billion to SpaceX.

Also, in contrast with the “rent-seeking” and financial plays of so many of the superwealthy, Musk actually wants to create jobs and solve practical problems.

If there’s a cautionary note, it is that market forces alone can’t address many of our most serious challenges. Indeed, in some cases they make them worse.

Worsening income inequality is the work of the hidden hand, unfettered by antitrust regulation, progressive taxation, unions and protections against race-to-the-bottom globalization.

If the hidden costs of spewing more carbon into the atmosphere are not priced in, we have today’s market failure exacerbating climate change. Electric cars won’t fix that as long as the distortions favoring fossil fuels remain.

So a broken, compromised government that’s cutting research dollars and failing to invest in education and forward-leaning infrastructure is a major impediment.

The United States did not reach the moon because of a clever billionaire, but through a national endeavor to serve the public good. I know, that’s “so 20th century.” 

Also, as Northwestern University economist Robert Gordon might argue, visionaries such as Thomas Edison grabbed relatively low-hanging fruit, with electrification creating huge numbers of jobs. 

Merely recovering the lost demand of the Great Recession has proved difficult. Another electrification-like revolution that lifts all boats seems improbable.

I’m not sure that’s true. But it will take more than Iron Man to rescue the many Americans still suffering.

View Article Here Read More