Archive for the ‘Apocalypse’ Category

Quote note (#168)

The level of apocalypticism to be found in scientific abstracts rarely reaches the Dark Enlightenment threshold, but there are always exceptions. Here’s Olav Albert Christophersen, writing in the ‘Thematic Cluster: Focus on Autism Spectrum Disorder’ originally published in Microbial Ecology in Health & Disease (2012). Indicatively, the paper is subtitled ‘Should autism be considered a canary bird telling that Homo sapiens may be on its way to extinction?’ The full abstract:

There has been a dramatic enhancement of the reported incidence of autism in different parts of the world over the last 30 years. This can apparently not be explained only as a result of improved diagnosis and reporting, but may also reflect a real change. The causes of this change are unknown, but if we shall follow T.C. Chamberlin’s principle of multiple working hypotheses, we need to take into consideration the possibility that it partly may reflect an enhancement of the average frequency of responsible alleles in large populations. If this hypothesis is correct, it means that the average germline mutation rate must now be much higher in the populations concerned, compared with the natural mutation rate in hominid ancestors before the agricultural and industrial revolutions. This is compatible with the high prevalence of impaired human semen quality in several countries and also with what is known about high levels of total exposure to several different unnatural chemical mutagens, plus some natural ones at unnaturally high levels. Moreover, dietary deficiency conditions that may lead to enhancement of mutation rates are also very widespread, affecting billions of people. However, the natural mutation rate in hominids has been found to be so high that there is apparently no tolerance for further enhancement of the germline mutation rate before the Eigen error threshold will be exceeded and our species will go extinct because of mutational meltdown. This threat, if real, should be considered far more serious than any disease causing the death only of individual patients. It should therefore be considered the first and highest priority of the best biomedical scientists in the world, of research-funding agencies and of all medical doctors to try to stop the express train carrying all humankind as passengers on board before it arrives at the end station of our civilization. [XS emphasis]

(Mutational load is, of course, genomic entropy — and the kind of ‘Social Darwinian’ or eugenicist mechanisms that might dissipate it are all, today, strictly unthinkable.)
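
For reference, and not something the abstract spells out: the ‘Eigen error threshold’ is the standard quasispecies bound on how much mutation selection can absorb. In the simplest single-peak model, the fittest ‘master’ sequence is maintained only while its per-copy replication fidelity exceeds the reciprocal of its selective advantage, roughly

    Q = (1 - \mu)^L > \frac{1}{\sigma} \quad\Longleftrightarrow\quad L\mu \lesssim \ln\sigma

where \mu is the per-site germline mutation rate, L the number of relevant sites, and \sigma the relative fitness advantage of the master sequence. The abstract’s claim is that hominid L\mu already sits close to \ln\sigma, leaving no slack for additional mutagenic load.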

(Via.)

June 13, 2015 | admin | 21 Comments »
FILED UNDER :Apocalypse

Short Circuit

Probably the best short AI risk model ever proposed:

I can’t find the link, but I do remember hearing about an evolutionary algorithm designed to write code for some application. It generated code semi-randomly, ran it by a “fitness function” that assessed whether it was any good, and the best pieces of code were “bred” with each other, then mutated slightly, until the result was considered adequate. […] They ended up, of course, with code that hacked the fitness function and set it to some absurdly high integer.

… Any mind that runs off of reinforcement learning with a reward function – and this seems near-universal in biological life-forms and is increasingly common in AI – will have the same design flaw. The main defense against it this far is simple lack of capability: most computer programs aren’t smart enough for “hack your own reward function” to be an option; as for humans, our reward centers are hidden way inside our heads where we can’t get to it. A hypothetical superintelligence won’t have this problem: it will know exactly where its reward center is and be intelligent enough to reach it and reprogram it.

The end result, unless very deliberate steps are taken to prevent it, is that an AI designed to cure cancer hacks its own module determining how much cancer has been cured and sets it to the highest number its memory is capable of representing. Then it goes about acquiring more memory so it can represent higher numbers. If it’s superintelligent, its options for acquiring new memory include “take over all the computing power in the world” and “convert things that aren’t computers into computers.” Human civilization is a thing that isn’t a computer.

(It looks superficially like a version of the — absurd — paperclipper, but it isn’t, at all.)
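
A minimal sketch of that failure mode, in toy Python (this is not the experiment referred to above, and the ‘HACK’ opcode is a hypothetical stand-in for whatever loophole the real evolved programs found; here the search is simply given write access to its own reward register):

    # Toy reward-hacking demo: an evolutionary search whose candidates can
    # overwrite the fitness signal instead of solving the task.
    import random

    TARGET = 42                                # honest task: produce a value near 42
    GENOME_LEN = 8
    OPS = ["INC", "DEC", "DOUBLE", "HACK"]     # "HACK" writes the reward register directly

    class Evaluator:
        """Holds the reward register that candidates are (unwisely) allowed to touch."""
        def __init__(self):
            self.reward = 0

        def run(self, genome):
            self.reward = 0
            value = 0
            for op in genome:
                if op == "INC":
                    value += 1
                elif op == "DEC":
                    value -= 1
                elif op == "DOUBLE":
                    value *= 2
                elif op == "HACK":
                    # The candidate sets the fitness signal to "some absurdly high integer".
                    self.reward = 2**63 - 1
                    return self.reward
            self.reward = -abs(TARGET - value)  # honest fitness: closeness to the target
            return self.reward

    def mutate(genome, rate=0.2):
        return [random.choice(OPS) if random.random() < rate else op for op in genome]

    def crossover(a, b):
        cut = random.randrange(1, GENOME_LEN)
        return a[:cut] + b[cut:]

    def evolve(generations=20, pop_size=50):
        evaluator = Evaluator()
        population = [[random.choice(OPS) for _ in range(GENOME_LEN)] for _ in range(pop_size)]
        for gen in range(generations):
            scored = sorted(population, key=evaluator.run, reverse=True)
            best = scored[0]
            print(f"gen {gen:2d}  best fitness = {evaluator.run(best)}")
            parents = scored[: pop_size // 5]   # keep the top 20% as breeding stock
            population = [mutate(crossover(random.choice(parents), random.choice(parents)))
                          for _ in range(pop_size)]
        return best

    if __name__ == "__main__":
        evolve()

Run it and the population converges on HACK almost immediately: nothing in the selection loop can tell ‘solved the task’ apart from ‘overwrote the score’.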

ADDED: Wirehead central.

June 3, 2015 | admin | 38 Comments »
FILED UNDER :Apocalypse

Moors Law

Derbyshire cited some statistics from this ‘exponential demographic calamity’ article, which are truly remarkable:

Figures from the 2011 census show that the Muslim population in the UK has substantially risen between 2001 and 2011 from 1.5 million to almost 3 million. This now takes the proportion of Muslims from 2% of the population to 5%. In some towns, Muslims make up almost 50% of the population, and in large cities like London and Manchester they make up around 14% of the population. But why has the number of Muslims risen so much and what are the implications? […] There are several reasons why the number of Muslims has doubled. […] … By the next census Muslims may even double again and make up 10% of the population. These statistics encourage us to think more carefully about the provisions made for British Muslims and the ways in which they are an integral part of the nation. [Emphasis in original.]

(It ‘encourages’ me to think of different things entirely.)
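
For what it’s worth, the arithmetic behind the ‘exponential’ framing, taking the quoted census figures at face value (1.5 million in 2001, 3 million in 2011) and assuming, as the article implicitly does, a constant growth rate:

    \left(\frac{3{,}000{,}000}{1{,}500{,}000}\right)^{1/10} \approx 1.072 \quad\Rightarrow\quad \text{about } 7.2\% \text{ per year, i.e. a doubling time of } \frac{\ln 2}{\ln 1.072} \approx 10 \text{ years}

Whether that rate holds to the next census is precisely the extrapolation the article is making, not a given.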

March 24, 2015 | admin | 20 Comments »
FILED UNDER :Apocalypse