Via Flowing Data, here is a set of world, regional, and country maps made of beeswax by Chinese artist Ren Ri, who manipulates the movement of the hive’s queen, thus directing the hive’s workers to build up the wax according to the artist’s template. The result is a set of impressively faithful maps. You can see these maps and more in his Yuansu catalogue, linked from his PearlLam Galleries exhibition page.
Nick Land recently pointed to Robin Hanson’s 1998 essay, The Great Filter, which looks at possible explanations for Fermi’s Paradox by examining the critical steps necessary for galactic colonization and asking where the greatest barrier to further progression lies. Land sees this as implying that many civilizations are exterminated. Shortly thereafter, Jim posted a response arguing that the Great Filter lies not before us, but behind us.
While Jim might be right that intelligent life of similar caliber to humans is rare in the galaxy, he starts his post off with an assumption that seems to me to be unwarranted:
There seem to be no great obstacles to intelligent life devouring the galaxy.
In fact, obstacles are legion in a project of interstellar colonization. The first of these is the sheer distance between stars. The nearest star, Proxima Centauri, is over 2,000 times farther away than the Voyager 1 spacecraft, which, launched in 1977, is the farthest manmade object from Earth. To send a living crew that distance, a spacecraft would require speeds at least hundreds of times those of the Voyager probes, and the kinetic energy required rises with the square of the speed, so hundreds of times the speed requires myriads of times the energy.
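The square-law scaling above is easy to make concrete. A minimal sketch, assuming a purely illustrative speed multiple of 200× (the actual figure for a crewed mission is unknown) and classical kinetic energy:

```python
# Kinetic energy scales with the square of speed: E = (1/2) * m * v^2.
# So a craft traveling 200 times faster than Voyager 1 needs
# 200^2 = 40,000 times the energy per unit of mass.

def kinetic_energy(mass_kg: float, speed_m_s: float) -> float:
    """Classical kinetic energy in joules."""
    return 0.5 * mass_kg * speed_m_s ** 2

voyager_speed = 17_000                  # m/s, roughly Voyager 1's cruise speed
crewed_speed = 200 * voyager_speed      # hypothetical crewed-mission speed

ratio = kinetic_energy(1.0, crewed_speed) / kinetic_energy(1.0, voyager_speed)
print(ratio)  # 40000.0
```

Per kilogram of spacecraft, then, a mere two-hundredfold speed increase multiplies the energy bill tens of thousands of times over, before accounting for the mass of fuel, shielding, and life support a crew would add.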
Second, not all stars are going to have habitable way stations for a prospectively interstellar civilization to colonize and develop before sending out more colonists to the next waypoint. Proxima Centauri is again a good example: it is a red dwarf flare star whose stellar flares make any potentially habitable planets that might orbit it less accommodating, and possibly entirely useless, for the purpose of installing a colonial civilization capable of sending future generations of colonists to worlds further out. This challenge effectively increases both the necessary distance traveled per voyage between stars and the cost of the measures put in place to allow colonists to settle an alien world.
Earth has countless amenities that we don’t even notice by virtue of our having evolved in response to Earth’s environment. The amount of oxygen in the atmosphere closely matches what is ideal for us because we are sculpted around such parameters, not vice versa. As such, the likelihood of finding a planet with the right oxygen levels to support human life outside of a sealed environment with the gas balance controlled by humans is minimal and the probability only goes down when other parameters like temperature or surface pressure are added. This adds an additional challenge, because it limits the possible development of any potential daughter civilization, as it will likely be confined to an environment sealed off from the original atmosphere of the planet it inhabits. Since such civilizations serve as launching pads for a second wave of colonization to stars further on in any scenario by which a civilization successfully conquers an entire galaxy, this also introduces a barrier beyond the initial wave of colonization.
The third and most important problem is motivation. Humans have done a remarkable job of proliferating around the planet and we have the ability to venture into space, though we haven’t yet sent ourselves out of Earth’s gravity well. However, contrary to Robin Hanson’s suggestions, our ability to colonize and proliferate on Earth, combined with our ability to reach beyond the Earth’s atmosphere, does not imply that we will actively pursue a voyage to the stars. There is a fundamental difference in the magnitude of the challenges facing humans colonizing Earth versus colonizing a planet around another star, ultimately because the latter requires a capital investment that dwarfs that of any human project past or present, with little to no hope of a return on investment for those who stay on Earth.
While it is true, as Robin Hanson suggests, that natural selection rewards those who break with the established strategy in ways that exploit previously untapped resources, the means by which it does so tend not to involve an actual conscious desire to proliferate. Instead, strategies developed through natural selection involve the confluence of many instincts, instincts which, if anything, have proven liable to be thwarted by the mix of technology and intelligence, as the use of contraception shows. Yes, it is possible that a select few individuals could muster the necessary resources to back a project to send Earthly life to a planet orbiting another star, but the low chance of success, combined with the cost and sheer difficulty of such a mission, makes it unlikely that such individuals will arise.
Ultimately, the barriers to interstellar travel do represent a great filter preventing the colonization of the galaxy by an intelligent civilization. Whether they are the Great Filter is difficult to judge. However, it should be noted that as exterminators go, the lack of will to send colonists to other stars is one of the slowest-acting civilization killers in existence, and probably less consequential than more proximate causes.
I haven’t posted anything regarding the topic of climate change, but I did want to highlight what I thought was a very good article by John Derbyshire in Taki’s Magazine under the title “Al Gore’s Dream of Power”, which touches on some recent comments by former Vice President Al Gore comparing global warming skeptics to racists during the civil rights era. While many liberals, upon hearing the article’s title and that its author is a conservative, would assume that this was another piece aimed at discrediting scientific consensus by attacking Al Gore, Derbyshire states that science tells us that:
The Earth’s climate is variable. It is currently varying on an overall (several-year moving average) warming trend. Some part of this current trend is due to human activity.
Indeed, while many elements of the political Right have hewn to a line that anthropogenic global warming is a hoax, the fact is that the great majority of climate scientists accept some form of anthropogenic global warming, and the evidence against it is currently pretty weak. What really is at issue is the normative aspect of this fact and the politics surrounding it. Conservatives have done themselves some damage by choosing to dispute the science itself (though open dispute is healthy for science), rather than asking whether the effects of an average temperature rise would truly be dire and discussing the power, or lack thereof, that political actors have in shaping our climatic future.
Those two issues are ones of great uncertainty, but in both cases, the evidence is stacked against those seeking radical political approaches to climatic issues. Contrary to the claims of Bill Nye, there is nothing that suggests that Irene was “caused” by climate change. It was, in fact, a rather mundane hurricane whose most noteworthy feature was that its path led to America’s largest city and media center, New York. The repeated conflation of periodic weather events with rising global temperatures is one area of the climate debate where those on the political Left are frequently grasping at straws for evidence.
Further, while higher average temperatures may be detrimental to some regions, it is hard to argue that expanded temperate zones and longer growing seasons in higher latitudes, as well as the opening of new sea lanes during summer months, all of which are likely consequences of an upward creep in average global temperature, are bad outcomes. The extent to which this would be a dominant trend relative to predictions of sea level rise threatening low-lying populations such as Bangladesh’s is outside the scope of reliable climate modeling.
Finally, the constellation of interests around the globe makes it unlikely that any major global political action will succeed at preventing a significant amount of fossil fuel usage, which is suspected of being man’s largest contribution to the current climate trend. Furthermore, given fossil fuels’ status as the most common sources of energy, stemming from their reliability, versatility, and price, combined with their finite availability, any cuts in short-term usage are likely to be nearly balanced out by greater future usage, as easily accessed deposits would simply be depleted more slowly, barring civilizational collapse, an outcome far worse than what all but the most dire predictions regarding Earth’s climate countenance.
In any event, I would recommend John Derbyshire’s piece, as I think it is an accurate assessment of the landscape we face in terms of the science, the politics, and the uncertainty, and it advocates a sound approach to thinking about the topic, one that has been eschewed by most on both the Left and the Right.
In the wake of the 9.0 Mw earthquake that took place off the coast of Japan and its associated tsunami, an area that has caught the attention of the news media, particularly in the West, has been the condition of the reactors at the Fukushima I Nuclear Power Plant. Because of the public sensitivity and ignorance with regard to nuclear issues, a point well illustrated by the fact that MRI machines are so named despite employing essentially the same technology as nuclear magnetic resonance spectrometers, the reporting of the story has generated a sense of hysteria among the public, who are frightened by talk of explosions and meltdown and by pictures of technicians in protective suits scanning locals in Fukushima Prefecture for radiation exposure.
In the wake of this, I want to point out a few sources that take a sober look at the problem. The first is from Mutant Frog Travelogue, which presents the situation with the necessary caveats to those currently in Japan. The second is the MIT NSE Nuclear Information Hub, which provides periodic updates as well as background on the mechanics. The third is the following video conversation between science journalist John Horgan and nuclear engineer Rod Adams:
To be fair, Adams is an adamant pro-nuclear advocate, so his instinct to downplay the dangers of this incident may be overly strong, but his background gives him a sense of proportion that is sorely lacking in many of the reporters and consumers of information who have clamored around this story.
I do wish the best to the people of Japan as they begin the long process of rebuilding their country and reestablishing normality, and I do think that this is a grave and serious matter that deserves the attention of the relevant authorities and that necessary precautions should be, and are being, taken. However, it seems clear to me that most of the attitudes concerning the Fukushima I plant have bordered on hysteria, including ridiculous moves by some foreign governments. One of the responsibilities of both providers and consumers of information is to place what is happening in its proper context, and that is happening in too few places.
Update: Steve Hsu points to an interview with the UK government’s Chief Scientific Officer John Beddington on the website of the British embassy in Tokyo. Hsu highlights the following passage regarding what a meltdown would mean:
If the Japanese fail to keep the reactors cool and fail to keep the pressure in the containment vessels at an appropriate level, you can get this, you know, the dramatic word “meltdown.” But what does that actually mean? What a meltdown involves is the basic reactor core melts, and as it melts, nuclear material will fall through to the floor of the container. There it will react with concrete and other materials; that is likely.
Remember this is the reasonable worst case, we don’t think anything worse is going to happen. In this reasonable worst case you get an explosion. You get some radioactive material going up to about 500 meters up into the air. Now, that’s really serious, but it’s serious again for the local area. It’s not serious for elsewhere, even if you get a combination of that explosion it would only have nuclear material going in to the air up to about 500 meters.
If you then couple that with the worst possible weather situation, i.e. prevailing weather taking radioactive material in the direction of Greater Tokyo and you had maybe rainfall which would bring the radioactive material down, do we have a problem? The answer is unequivocally no. Absolutely no issue.
The problems are within 30 km of the reactor. And to give you a flavor for that, when Chernobyl had a massive fire at the graphite core, material was going up not just 500 meters but to 30,000 feet; it was lasting not for the odd hour or so but lasted months, and that was putting nuclear radioactive material up into the upper atmosphere for a very long period of time. But even in the case of Chernobyl, the exclusion zone that they had was about 30 kilometers. And in that exclusion zone, outside that, there is no evidence whatsoever to indicate people had problems from the radiation.
The problems with Chernobyl were people were continuing to drink the water, continuing to eat vegetables and so on and that was where the problems came from. That’s not going to be the case here. So what I would really reemphasize is that this is very problematic for the area and the immediate vicinity and one has to have concerns for the people working there. Beyond that 20 or 30 kilometers, it’s really not an issue for health.
Again, this is very serious, but a sense of proportion is warranted here.