
Jupiter May Be Behind The Mysterious ‘Gaping Hole’ In Our Solar System

Excerpt from huffingtonpost.com
When astronomers began studying other solar systems in the Milky Way galaxy back in the 1990s, they noticed something peculiar: most of these systems have big planets that circle their host stars in tight orbits, a fin...


Hypatia, Ancient Alexandria’s Great Female Scholar

An avowed pagan in a time of religious strife, Hypatia was also one of the first women to study math, astronomy and philosophy. On the streets of Alexandria, Egypt, a mob led by Peter the Lector brutally murdered Hypatia, one of the last great thinkers of ancient Alexandria. (Mary Evans Picture Library / Alamy)
By Sarah Zielinski, smithsonian.com
One day on the streets of Alexandria, Egypt, in the year 415 or 416, a mob of Christian zealots led by Peter the Lector accosted a wom [...]


NASA video illustrates ‘X-ray wind’ blasting from a black hole

This artist's illustration shows interstellar gas, the raw material of star formation, being blown away.
Excerpt from cnet.com
It takes a mighty wind to keep stars from forming. Researchers have found one in a galaxy far, far away -- and NASA mad...


Circular thinking: Stonehenge’s origin is subject of new theory




Excerpt from theguardian.com

Whether it was a Druid temple, an astronomical calendar or a centre for healing, the mystery of Stonehenge has long been a source of speculation and debate. Now a dramatic new theory suggests that the prehistoric monument was in fact “an ancient Mecca on stilts”.

The megaliths would not have been used for ceremonies at ground level, but would instead have supported a circular wooden platform on which ceremonies were performed to the rotating heavens, the theory suggests.

Julian Spalding, an art critic and former director of some of the UK’s leading museums, argues that the stones were foundations for a vast platform, long since lost – “a great altar” raised up high towards the heavens and able to support the weight of hundreds of worshippers.

“It’s a totally different theory which has never been put forward before,” Spalding told the Guardian. “All the interpretations to date could be mistaken. We’ve been looking at Stonehenge the wrong way: from the earth, which is very much a 20th-century viewpoint. We haven’t been thinking about what they were thinking about.”

Since Geoffrey of Monmouth wrote in the 12th century that Merlin had flown the stones from Ireland, theories on Stonehenge, from plausible to absurd, have abounded. In the last decade alone, the monument has been interpreted as “the prehistoric Lourdes” where people brought the sick to be healed by the power of the magic bluestones from Wales and as a haunted place of the dead contrasting with seasonal feasts for the living at nearby Durrington Walls. 

The site, pored over by archaeologists for centuries, still produces surprises, including the outlines of now-missing stones, which appeared in the parched ground during last summer’s drought and showed that the monument was not left unfinished, as some had believed, but was once a perfect circle.

Spalding, who is not an archaeologist, believes that other Stonehenge theorists have fallen into error by looking down instead of up. His evidence, he believes, lies in ancient civilisations worldwide. As far afield as China, Peru and Turkey, such sacred monuments were built high up, whether on manmade or natural sites, and in circular patterns possibly linked to celestial movements.

He said: “In early times, no spiritual ceremonies would have been performed on the ground. The Pharaoh of Egypt and the Emperor of China were always carried – as the Pope used to be. The feet of holy people were not allowed to touch the ground. We’ve been looking at Stonehenge from a modern, earth-bound perspective.”
“All the great raised altars of the past suggest that the people who built Stonehenge would never have performed celestial ceremonies on the lowly earth,” he went on. “That would have been unimaginably insulting to the immortal beings, for it would have brought them down from heaven to bite the dust and tread in the dung.”

Spalding’s theory has not met with universal approval. Prof Vincent Gaffney, principal investigator on the Stonehenge Hidden Landscapes Project at Bradford University, said he held “a fair degree of scepticism” and Sir Barry Cunliffe, a prehistorian and emeritus professor of European archaeology at Oxford University, said: “He could be right, but I know of no evidence to support it”.
The archaeologist Aubrey Burl, an authority on prehistoric stone circles, said: “There could be something in it. There is a possibility, of course. Anything new and worthwhile about Stonehenge is well worth looking into, but with care and consideration.”

On Monday Spalding publishes his theories in a new book, titled Realisation: From Seeing to Understanding – The Origins of Art. It explores our ancestors’ understanding of the world, offering new explanations of iconic works of art and monuments.

Stonehenge, built between 3000 and 2000 BC, is England’s most famous prehistoric monument, a UNESCO World Heritage site on Salisbury Plain in Wiltshire that draws more than 1 million annual visitors. It began as a timber circle, later made permanent with massive blocks of stone, many somehow dragged from dolerite rock in the Welsh mountains. Spalding believes that ancient worshippers would have reached the giant altar by climbing curved wooden ramps or staircases.


How Quantum Physics will change your life and amaze the world!

Excerpt from educatinghumanity.com
"Anyone not shocked by quantum mechanics has not yet understood it." - Niels Bohr
10 Ways Quantum Physics Will Change the World
Ever want to have a "life do over", teleport, time travel, have your computer wor...


Rare & severe geomagnetic storm enables Aurora Borealis to be seen from U.S. tonight

Excerpt from mashable.com
Thanks to a rare, severe geomagnetic storm, the Northern Lights may be visible on Tuesday night in areas far to the south of their typical home in the Arctic. The northern tier of the U.S., from Washington State to Michiga...


Are We An Alien Experiment?

Although it's possible those responsible for our Earthen experiment may possess a form far different from our own, I feel it is more probable we were created in our family's image. - Greg
Excerpt from rense.com
Even the most hardened skeptic mus...


When did humans first begin to wear clothes?



Excerpt from todayifoundout.com

Determining exactly when humans began wearing clothes is a challenge, largely because early clothes would have been things like animal hides, which degrade rapidly. Therefore, there’s very little archaeological evidence that can be used to determine the date that clothing started being worn. 

There have been several different theories based on what archaeologists have been able to find. For instance, based on genetic skin-coloration research, humans lost body hair around one million years ago—an ideal time to start wearing clothes for warmth. The first tools used to scrape hides date back to 780,000 years ago, but animal hides served other uses, such as providing shelter, and it’s thought that those tools were used to prepare hides for that, rather than clothing. Eyed needles started appearing around 40,000 years ago, but those tools point to more complex clothing, meaning clothes had probably already been around for a while.
All that being said, scientists have started gathering alternative data that might help solve the mystery of when we humans started covering our bits.

A recent University of Florida study concluded that humans started wearing clothes some 170,000 years ago, lining up with the end of the second-to-last ice age. How did they figure that date out? By studying the evolution of lice.

Scientists observed that clothing lice are, well, extremely well-adapted to clothing. They hypothesized that body lice must have evolved to live in clothing, which meant that they weren’t around before humans started wearing clothes. The study used DNA sequencing of lice to calculate when clothing lice started to genetically split from head lice.
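To make that logic concrete, here is a rough back-of-the-envelope version of molecular-clock reasoning; the symbols and the simple form below are illustrative, not the study's actual statistical model. If the sequenced regions of clothing-louse and head-louse DNA differ at a fraction K of their sites, and substitutions accumulate at a rate \mu per site per year along each lineage, then the time since the two lineages split is roughly

\[ t \;\approx\; \frac{K}{2\mu}, \]

where the factor of two reflects that changes pile up independently on both branches after the split. Real analyses use far more sophisticated models and calibrations, but this is the core idea that turns lice genetics into a date.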

The findings of the study are significant because they show that clothes appeared some 70,000 years before humans started to migrate north from Africa into cooler climates. The invention of clothing was probably one factor that made migration possible.
This timing also makes sense given the known climate factors of that era. Ian Gilligan, a lecturer at the Australian National University, said the study gave “an unexpectedly early date for clothing, much earlier than the earliest solid archaeological evidence, but it makes sense. It means modern humans probably started wearing clothes on a regular basis to keep warm when they were first exposed to Ice Age conditions.”

As to when humans moved on from animal hides and into textiles, the first fabric is thought to have been an early ancestor of felt. From there, early humans took up weaving some 27,000 years ago, based on impressions of baskets and textiles on clay. Around 25,000 years ago, the first Venus figurines—little statues of women—appeared wearing a variety of different clothes that pointed to weaving technology being in place by this time.
From there, later ancient civilizations discovered many materials they could fashion into clothing. For instance, Ancient Egyptians produced linen around 5500 BC, while the Chinese likely started producing silk around 4000 BC.

As for clothing worn for fashion rather than just for warmth, this is thought to have emerged relatively early on. The earliest known examples of dyed flax fibers were found in a cave in the Republic of Georgia and date back some 36,000 years. That being said, while they may have added colour, early clothes seem to have been much simpler than the clothing we wear today—mostly cloth draped over the shoulder and pinned at the waist.

Around the mid-1300s in certain regions of the world, aided by technological advances of the previous century, clothing fashion began to change drastically. For instance, clothing started to be made to fit the form of the human body, with curved seams, laces, and buttons, and contrasting colours and fabrics became popular in England. From this time, fashion in the West began to change rapidly, largely driven by aesthetics, whereas in other cultures fashion typically changed only with great political upheaval and therefore shifted far more slowly.

The Industrial Revolution, of course, had a huge impact on the clothing industry. Clothes could now be made en masse in factories rather than just in the home, and could be transported from factory to market in record time. As a result, clothes became drastically cheaper, leading to people having significantly larger wardrobes and contributing to the constant change in fashion that we still see today.


Do we really want to know if we’re not alone in the universe?



Frank Drake, the founder of Search for Extraterrestrial Intelligence (SETI), at his home in Aptos, Calif. (Ramin Rahimian for The Washington Post)


Excerpt from washingtonpost.com

It was near Green Bank, W.Va., in 1960 that a young radio astronomer named Frank Drake conducted the first extensive search for alien civilizations in deep space. He aimed the 85-foot dish of a radio telescope at two nearby, sun-like stars, tuning to a frequency he thought an alien civilization might use for interstellar communication.

But the stars had nothing to say.

So began SETI, the Search for Extraterrestrial Intelligence, a form of astronomical inquiry that has captured the imaginations of people around the planet but has so far failed to detect a single “hello.” Pick your explanation: They’re not there; they’re too far away; they’re insular and aloof; they’re zoned out on computer games; they’re watching us in mild bemusement and wondering when we’ll grow up.

Now some SETI researchers are pushing a more aggressive agenda: Instead of just listening, we would transmit messages, targeting newly discovered planets orbiting distant stars. Through “active SETI,” we’d boldly announce our presence and try to get the conversation started.

Naturally, this is controversial, because of . . . well, the Klingons. The bad aliens.

 NASA discovers first Earth-size planet in habitable zone of another star

"NASA's Kepler Space Telescope has discovered the first validated Earth-size planet orbiting in the habitable zone of a distant star, an area where liquid water might exist on its surface. The planet, Kepler-186f, is ten percent larger in size than Earth and orbits its parent star, Kepler-186, every 130 days. The star, located about 500 light-years from Earth, is classified as an M1 dwarf and is half the size and mass of our sun." (NASA Ames Research Center)
“ETI’s reaction to a message from Earth cannot presently be known,” states a petition signed by 28 scientists, researchers and thought leaders, among them SpaceX founder Elon Musk. “We know nothing of ETI’s intentions and capabilities, and it is impossible to predict whether ETI will be benign or hostile.”

This objection is moot, however, according to the proponents of active SETI. They argue that even if there are unfriendlies out there, they already know about us. That’s because “I Love Lucy” and other TV and radio broadcasts are radiating from Earth at the speed of light. Aliens with advanced instruments could also detect our navigational radar beacons and would see that we’ve illuminated our cities.

“We have already sent signals into space that will alert the aliens to our presence with the transmissions and street lighting of the last 70 years,” Seth Shostak, an astronomer at the SETI Institute in California and a supporter of the more aggressive approach, has written. “These emissions cannot be recalled.”

That’s true only to a point, say the critics of active SETI. They argue that unintentional planetary leakage, such as “I Love Lucy,” is omnidirectional and faint, and much harder to detect than an intentional, narrowly focused signal transmitted at a known planet.

View Article Here Read More

Fresh fossil studies push the dawn of man back to 2.8 million years

(Reuters) - A 2.8-million-year-old jawbone fossil with five intact teeth unearthed in an Ethiopian desert is pushing back the dawn of humankind by about half a million years. Scientists said on Wednesday the fossil represents the oldest known repres...


Recent Disappearances & Strangeness in the Bermuda Triangle

Excerpt from paranormal.lovetoknow.com
By Michelle Radcliff
The Bermuda Triangle is an area of mostly open ocean located between Bermuda, Miami, Florida and San Juan, Puerto Rico. The unexplained disappearances of hundreds of ships and air...


Earth’s Moon May Not Be Critical to Life After All




Excerpt from space.com

The moon has long been viewed as a crucial component in creating an environment suitable for the evolution of complex life on Earth, but a number of scientific results in recent years have shown that perhaps our planet doesn't need the moon as much as we have thought.

In 1993, French astronomer Jacques Laskar ran a series of calculations indicating that the gravity of the moon is vital to stabilizing the tilt of our planet. Earth's obliquity, as this tilt is technically known, has huge repercussions for climate. Laskar argued that should Earth's obliquity wander over hundreds of thousands of years, it would cause environmental chaos by creating a climate too variable for complex life to develop in relative peace.
So, his argument goes, we should feel remarkably lucky to have such a large moon on our doorstep, as no other terrestrial planet in our solar system has such a moon. Mars' two satellites, Phobos and Deimos, are tiny, captured asteroids that have little known effect on the Red Planet. Consequently, Mars' tilt wobbles chaotically over timescales of millions of years, with evidence for swings in its rotational axis at least as large as 45 degrees.


The stroke of good fortune that led to Earth possessing an unlikely moon, specifically the collision 4.5 billion years ago between Earth and a Mars-sized proto-planet that produced the debris from which our Moon formed, has become one of the central tenets of the 'Rare Earth' hypothesis. Famously promoted by Peter Ward and Don Brownlee, it argues that planets where everything is just right for complex life are exceedingly rare.

New findings, however, are tearing up the old rule book. In 2011, a trio of scientists — Jack Lissauer of NASA Ames Research Center, Jason Barnes of the University of Idaho and John Chambers of the Carnegie Institution for Science — published results from new simulations describing what Earth's obliquity would be like without the moon. What they found was surprising.

"We were looking into how obliquity might vary for all sorts of planetary systems," says Lissauer. "To test our code we began with integrations following the obliquity of Mars and found similar results to other people. But when we did the obliquity of Earth we found the variations were much smaller than expected — nowhere near as extreme as previous calculations suggested they would be."
Lissauer's team found that without the moon, Earth's rotational axis would wobble by only 10 degrees more than its present-day angle of 23.5 degrees. The reason for such vastly different results from those obtained by Jacques Laskar is pure computing power. Today's computers are much faster and capable of more accurate modeling with far more data than computers of the 1990s.

Lissauer and his colleagues also found that if Earth were spinning fast, with one day lasting less than 10 hours, or rotating retrograde (i.e. backwards, so that the sun rose in the west and set in the east), then Earth would stabilize itself thanks to gravitational resonances with other planets, most notably giant Jupiter. There would be no need for a large moon.

Earth's rotation has not always been as leisurely as the current 24-hour spin rate. Following the impact that formed the moon, Earth was spinning once every four or five hours, but it has since been gradually slowed by the moon's presence. As for the length of Earth's day prior to the moon-forming impact, nobody really knows, but some models of the impact developed by Robin Canup of the Southwest Research Institute, in Boulder, Colorado, suggest that Earth could have been rotating fast, or even retrograde, prior to the collision.

Tilted Orbits
Planets with inclined orbits could find that their increased obliquity is beneficial to their long-term climate – as long as they do not have a large moon.


"Collisions in the epoch during which Earth was formed determined its initial rotation," says Lissauer. "For rocky planets, some of the models say most of them will be prograde, but others say comparable numbers of planets will be prograde and retrograde. Certainly, retrograde worlds are not expected to be rare."

The upshot of Lissauer's findings is that the presence of a moon is not the be all and end all as once thought, and a terrestrial planet can exist without a large moon and still retain its habitability. Indeed, it is possible to imagine some circumstances where having a large moon would actually be pretty bad for life.

Rory Barnes, of the University of Washington, has also tackled the problem of obliquity, but from a different perspective. Planets on the edge of habitable zones exist in a precarious position, far enough away from their star that, without a thick, insulating atmosphere, they freeze over, just like Mars. Barnes and his colleagues, including John Armstrong of Weber State University, realized that torques from other nearby worlds could cause a planet's inclination to the ecliptic plane to vary. This in turn would result in a change of obliquity; the greater the inclination, the greater the obliquity to the Sun. Barnes and Armstrong saw that this could be a good thing for planets on the edges of habitable zones, allowing heat to be distributed evenly over geological timescales and preventing "Snowball Earth" scenarios. They called these worlds "tilt-a-worlds," but the presence of a large moon would counteract this beneficial obliquity change.

"I think one of the most important points from our tilt-a-world paper is that at the outer edge of the habitable zone, having a large moon is bad, there's no other way to look at it," says Barnes. "If you have a large moon that stabilizes the obliquity then you have a tendency to completely freeze over."

Barnes is impressed with the work of Lissauer's team.
"I think it is a well done study," he says. "It suggests that Earth does not need the moon to have a relatively stable climate. I don't think there would be any dire consequences to not having a moon."

Mars' Changing Tilt
The effects of changing obliquity on Mars’ climate. Mars’ current 25-degree tilt is seen at top left. At top right is a Mars with high obliquity, leading to ice gathering at its equator while the poles point sunward. At bottom is Mars with low obliquity, which sees its polar caps grow in size.


Of course, the moon does have a hand in other factors important to life besides planetary obliquity. Tidal pools may have been the point of origin of life on Earth. Although the moon produces the largest tides, the sun also influences tides, so the lack of a large moon is not necessarily a stumbling block. Some animals have also evolved a life cycle based on the cycle of the moon, but that's more happenstance than an essential component for life.

"Those are just minor things," says Lissauer.

Without the absolute need for a moon, astrobiologists seeking life and habitable worlds elsewhere face new opportunities. Maybe Earth, with its giant moon, is actually the oddball amongst habitable planets. Rory Barnes certainly doesn't think we need it.
"It will be a step forward to see the myth that a habitable planet needs a large moon dispelled," he says, to which Lissauer agrees.
Earth without its moon might therefore remain habitable, but we should still cherish its friendly presence. After all, would Beethoven have written the Moonlight Sonata without it?


Why science is so hard to believe

 
In the recent movie “Interstellar,” set in a futuristic, downtrodden America where NASA has been forced into hiding, school textbooks say the Apollo moon landings were faked.


Excerpt from 


There’s a scene in Stanley Kubrick’s comic masterpiece “Dr. Strangelove” in which Jack D. Ripper, an American general who’s gone rogue and ordered a nuclear attack on the Soviet Union, unspools his paranoid worldview — and the explanation for why he drinks “only distilled water, or rainwater, and only pure grain alcohol” — to Lionel Mandrake, a dizzy-with-anxiety group captain in the Royal Air Force.
Ripper: “Have you ever heard of a thing called fluoridation? Fluoridation of water?”
Mandrake: “Ah, yes, I have heard of that, Jack. Yes, yes.”
Ripper: “Well, do you know what it is?”
Mandrake: “No. No, I don’t know what it is, no.”
Ripper: “Do you realize that fluoridation is the most monstrously conceived and dangerous communist plot we have ever had to face?” 

The movie came out in 1964, by which time the health benefits of fluoridation had been thoroughly established and anti-fluoridation conspiracy theories could be the stuff of comedy. Yet half a century later, fluoridation continues to incite fear and paranoia. In 2013, citizens in Portland, Ore., one of only a few major American cities that don’t fluoridate, blocked a plan by local officials to do so. Opponents didn’t like the idea of the government adding “chemicals” to their water. They claimed that fluoride could be harmful to human health.

Actually fluoride is a natural mineral that, in the weak concentrations used in public drinking-water systems, hardens tooth enamel and prevents tooth decay — a cheap and safe way to improve dental health for everyone, rich or poor, conscientious brushers or not. That’s the scientific and medical consensus.
To which some people in Portland, echoing anti-fluoridation activists around the world, reply: We don’t believe you.
We live in an age when all manner of scientific knowledge — from the safety of fluoride and vaccines to the reality of climate change — faces organized and often furious opposition. Empowered by their own sources of information and their own interpretations of research, doubters have declared war on the consensus of experts. There are so many of these controversies these days, you’d think a diabolical agency had put something in the water to make people argumentative.
Science doubt has become a pop-culture meme. In the recent movie “Interstellar,” set in a futuristic, downtrodden America where NASA has been forced into hiding, school textbooks say the Apollo moon landings were faked.


The debate about mandated vaccinations has the political world talking. A spike in measles cases nationwide has President Obama, lawmakers and even potential 2016 candidates weighing in on the vaccine controversy. (Pamela Kirkland/The Washington Post)
In a sense this is not surprising. Our lives are permeated by science and technology as never before. For many of us this new world is wondrous, comfortable and rich in rewards — but also more complicated and sometimes unnerving. We now face risks we can’t easily analyze.
We’re asked to accept, for example, that it’s safe to eat food containing genetically modified organisms (GMOs) because, the experts point out, there’s no evidence that it isn’t and no reason to believe that altering genes precisely in a lab is more dangerous than altering them wholesale through traditional breeding. But to some people, the very idea of transferring genes between species conjures up mad scientists running amok — and so, two centuries after Mary Shelley wrote “Frankenstein,” they talk about Frankenfood.
The world crackles with real and imaginary hazards, and distinguishing the former from the latter isn’t easy. Should we be afraid that the Ebola virus, which is spread only by direct contact with bodily fluids, will mutate into an airborne super-plague? The scientific consensus says that’s extremely unlikely: No virus has ever been observed to completely change its mode of transmission in humans, and there’s zero evidence that the latest strain of Ebola is any different. But Google “airborne Ebola” and you’ll enter a dystopia where this virus has almost supernatural powers, including the power to kill us all.
In this bewildering world we have to decide what to believe and how to act on that. In principle, that’s what science is for. “Science is not a body of facts,” says geophysicist Marcia McNutt, who once headed the U.S. Geological Survey and is now editor of Science, the prestigious journal. “Science is a method for deciding whether what we choose to believe has a basis in the laws of nature or not.”
The scientific method leads us to truths that are less than self-evident, often mind-blowing and sometimes hard to swallow. In the early 17th century, when Galileo claimed that the Earth spins on its axis and orbits the sun, he wasn’t just rejecting church doctrine. He was asking people to believe something that defied common sense — because it sure looks like the sun’s going around the Earth, and you can’t feel the Earth spinning. Galileo was put on trial and forced to recant. Two centuries later, Charles Darwin escaped that fate. But his idea that all life on Earth evolved from a primordial ancestor and that we humans are distant cousins of apes, whales and even deep-sea mollusks is still a big ask for a lot of people.
Even when we intellectually accept these precepts of science, we subconsciously cling to our intuitions — what researchers call our naive beliefs. A study by Andrew Shtulman of Occidental College showed that even students with an advanced science education had a hitch in their mental gait when asked to affirm or deny that humans are descended from sea animals and that the Earth goes around the sun. Both truths are counterintuitive. The students, even those who correctly marked “true,” were slower to answer those questions than questions about whether humans are descended from tree-dwelling creatures (also true but easier to grasp) and whether the moon goes around the Earth (also true but intuitive).
Shtulman’s research indicates that as we become scientifically literate, we repress our naive beliefs but never eliminate them entirely. They nest in our brains, chirping at us as we try to make sense of the world.
Most of us do that by relying on personal experience and anecdotes, on stories rather than statistics. We might get a prostate-specific antigen test, even though it’s no longer generally recommended, because it caught a close friend’s cancer — and we pay less attention to statistical evidence, painstakingly compiled through multiple studies, showing that the test rarely saves lives but triggers many unnecessary surgeries. Or we hear about a cluster of cancer cases in a town with a hazardous-waste dump, and we assume that pollution caused the cancers. Of course, just because two things happened together doesn’t mean one caused the other, and just because events are clustered doesn’t mean they’re not random. Yet we have trouble digesting randomness; our brains crave pattern and meaning.
Even for scientists, the scientific method is a hard discipline. They, too, are vulnerable to confirmation bias — the tendency to look for and see only evidence that confirms what they already believe. But unlike the rest of us, they submit their ideas to formal peer review before publishing them. Once the results are published, if they’re important enough, other scientists will try to reproduce them — and, being congenitally skeptical and competitive, will be very happy to announce that they don’t hold up. Scientific results are always provisional, susceptible to being overturned by some future experiment or observation. Scientists rarely proclaim an absolute truth or an absolute certainty. Uncertainty is inevitable at the frontiers of knowledge.
That provisional quality of science is another thing a lot of people have trouble with. To some climate-change skeptics, for example, the fact that a few scientists in the 1970s were worried (quite reasonably, it seemed at the time) about the possibility of a coming ice age is enough to discredit what is now the consensus of the world’s scientists: The planet’s surface temperature has risen by about 1.5 degrees Fahrenheit in the past 130 years, and human actions, including the burning of fossil fuels, are extremely likely to have been the dominant cause since the mid-20th century.
It’s clear that organizations funded in part by the fossil-fuel industry have deliberately tried to undermine the public’s understanding of the scientific consensus by promoting a few skeptics. The news media gives abundant attention to such mavericks, naysayers, professional controversialists and table thumpers. The media would also have you believe that science is full of shocking discoveries made by lone geniuses. Not so. The (boring) truth is that science usually advances incrementally, through the steady accretion of data and insights gathered by many people over many years. So it has been with the consensus on climate change. That’s not about to go poof with the next thermometer reading.
But industry PR, however misleading, isn’t enough to explain why so many people reject the scientific consensus on global warming.
The “science communication problem,” as it’s blandly called by the scientists who study it, has yielded abundant new research into how people decide what to believe — and why they so often don’t accept the expert consensus. It’s not that they can’t grasp it, according to Dan Kahan of Yale University. In one study he asked 1,540 Americans, a representative sample, to rate the threat of climate change on a scale of zero to 10. Then he correlated that with the subjects’ science literacy. He found that higher literacy was associated with stronger views — at both ends of the spectrum. Science literacy promoted polarization on climate, not consensus. According to Kahan, that’s because people tend to use scientific knowledge to reinforce their worldviews.
Americans fall into two basic camps, Kahan says. Those with a more “egalitarian” and “communitarian” mind-set are generally suspicious of industry and apt to think it’s up to something dangerous that calls for government regulation; they’re likely to see the risks of climate change. In contrast, people with a “hierarchical” and “individualistic” mind-set respect leaders of industry and don’t like government interfering in their affairs; they’re apt to reject warnings about climate change, because they know what accepting them could lead to — some kind of tax or regulation to limit emissions.
In the United States, climate change has become a litmus test that identifies you as belonging to one or the other of these two antagonistic tribes. When we argue about it, Kahan says, we’re actually arguing about who we are, what our crowd is. We’re thinking: People like us believe this. People like that do not believe this.
Science appeals to our rational brain, but our beliefs are motivated largely by emotion, and the biggest motivation is remaining tight with our peers. “We’re all in high school. We’ve never left high school,” says Marcia McNutt. “People still have a need to fit in, and that need to fit in is so strong that local values and local opinions are always trumping science. And they will continue to trump science, especially when there is no clear downside to ignoring science.”
Meanwhile the Internet makes it easier than ever for science doubters to find their own information and experts. Gone are the days when a small number of powerful institutions — elite universities, encyclopedias and major news organizations — served as gatekeepers of scientific information. The Internet has democratized it, which is a good thing. But along with cable TV, the Web has also made it possible to live in a “filter bubble” that lets in only the information with which you already agree.
How to penetrate the bubble? How to convert science skeptics? Throwing more facts at them doesn’t help. Liz Neeley, who helps train scientists to be better communicators at an organization called Compass, says people need to hear from believers they can trust, who share their fundamental values. She has personal experience with this. Her father is a climate-change skeptic and gets most of his information on the issue from conservative media. In exasperation she finally confronted him: “Do you believe them or me?” She told him she believes the scientists who research climate change and knows many of them personally. “If you think I’m wrong,” she said, “then you’re telling me that you don’t trust me.” Her father’s stance on the issue softened. But it wasn’t the facts that did it.
If you’re a rationalist, there’s something a little dispiriting about all this. In Kahan’s descriptions of how we decide what to believe, what we decide sometimes sounds almost incidental. Those of us in the science-communication business are as tribal as anyone else, he told me. We believe in scientific ideas not because we have truly evaluated all the evidence but because we feel an affinity for the scientific community. When I mentioned to Kahan that I fully accept evolution, he said: “Believing in evolution is just a description about you. It’s not an account of how you reason.”
Maybe — except that evolution is real. Biology is incomprehensible without it. There aren’t really two sides to all these issues. Climate change is happening. Vaccines save lives. Being right does matter — and the science tribe has a long track record of getting things right in the end. Modern society is built on things it got right.
Doubting science also has consequences, as seen in recent weeks with the measles outbreak that began in California. The people who believe that vaccines cause autism — often well educated and affluent, by the way — are undermining “herd immunity” to such diseases as whooping cough and measles. The anti-vaccine movement has been going strong since a prestigious British medical journal, the Lancet, published a study in 1998 linking a common vaccine to autism. The journal later retracted the study, which was thoroughly discredited. But the notion of a vaccine-autism connection has been endorsed by celebrities and reinforced through the usual Internet filters. (Anti-vaccine activist and actress Jenny McCarthy famously said on “The Oprah Winfrey Show,” “The University of Google is where I got my degree from.”)
In the climate debate, the consequences of doubt are likely to be global and enduring. Climate-change skeptics in the United States have achieved their fundamental goal of halting legislative action to combat global warming. They haven’t had to win the debate on the merits; they’ve merely had to fog the room enough to keep laws governing greenhouse gas emissions from being enacted.
Some environmental activists want scientists to emerge from their ivory towers and get more involved in the policy battles. Any scientist going that route needs to do so carefully, says Liz Neeley. “That line between science communication and advocacy is very hard to step back from,” she says. In the debate over climate change, the central allegation of the skeptics is that the science saying it’s real and a serious threat is politically tinged, driven by environmental activism and not hard data. That’s not true, and it slanders honest scientists. But the claim becomes more likely to be seen as plausible if scientists go beyond their professional expertise and begin advocating specific policies.
It’s their very detachment, what you might call the cold-bloodedness of science, that makes science the killer app. It’s the way science tells us the truth rather than what we’d like the truth to be. Scientists can be as dogmatic as anyone else — but their dogma is always wilting in the hot glare of new research. In science it’s not a sin to change your mind when the evidence demands it. For some people, the tribe is more important than the truth; for the best scientists, the truth is more important than the tribe.


This work is licensed under a Creative Commons Attribution 4.0 International License, unless otherwise marked.



