<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Outside in &#187; Futurism</title>
	<atom:link href="http://www.xenosystems.net/tag/futurism/feed/" rel="self" type="application/rss+xml" />
	<link>http://www.xenosystems.net</link>
	<description>Involvements with reality</description>
	<lastBuildDate>Thu, 05 Feb 2015 01:26:08 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=4.1</generator>
	<item>
		<title>Quote note (#125)</title>
		<link>http://www.xenosystems.net/quote-note-125/</link>
		<comments>http://www.xenosystems.net/quote-note-125/#comments</comments>
		<pubDate>Thu, 30 Oct 2014 05:10:55 +0000</pubDate>
		<dc:creator><![CDATA[admin]]></dc:creator>
		<category><![CDATA[Discriminations]]></category>
		<category><![CDATA[Acceleration]]></category>
		<category><![CDATA[Capitalism]]></category>
		<category><![CDATA[Futurism]]></category>
		<category><![CDATA[Neoreaction]]></category>
		<category><![CDATA[War]]></category>

		<guid isPermaLink="false">http://www.xenosystems.net/?p=3994</guid>
		<description><![CDATA[Another blog comment reproduction, this one from More Right, where Nyan Sandwich lays out the basic stress-lines of a potential tech-comm schism (of a kind initially &#8212; and cryptically &#8212; proposed in a tweet): There are definitely two opposing theories of a fast high-tech future. I call them “Accelerationism” and “Futurism” “Accelerationism” is the perspective [&#8230;]]]></description>
				<content:encoded><![CDATA[<p>Another blog comment reproduction, this <a href="http://www.moreright.net/open-thread-november-2014/#comment-6501">one</a> from <em>More Right</em>, where Nyan Sandwich lays out the basic stress-lines of a potential tech-comm schism (of a kind initially &#8212; and cryptically &#8212; proposed in a tweet):  </p>
<p><em>There are definitely two opposing theories of a fast high-tech future. I call them “Accelerationism” and “Futurism”</em></p>
<p><em>“Accelerationism” is the perspective that emphasizes Capital teleology, that someone is going to eat the stars (win), that humans have many inadequacies that hold us back from winning, that our machines, unbound from our sentimental conservatism could win, and advocates accelerating the arrival of the machine gods from Outside.</em></p>
<p><em>“Futurism” agrees that someone is going to win, and wants it to be *us*, that we can become God’s favored children by Nietz[schean] will to power, grit, and self improvement. That the path to the future is Man getting his shit together and improving himself, incorporating technology into himself. That Enhancement is preferable to Artifice.</em></p>
<p><em>Someone is going to win. Enhancement or Artifice? Us, or our machines?</em></p>
<p><em>I’m a futurist Techcom, Land is an accelerationist Techcom.</em></p>
<p>FWIW I think this is nicely done, but the complexities will explode when we get into the details. Fortunately, distinctions closely paralleling Nyan&#8217;s enhancement / artifice option have been quite carefully honed within certain parts of the Singularity literature. Hugo de <a href="http://turingchurch.com/2012/06/15/the-first-terran-shots-against-the-cosmists/">Garis</a>, in particular, does a lot with it &#8212; through the discrimination between &#8216;Cosmists&#8217; (artificers) and &#8216;Cyborgists&#8217; (enhancers) &#8212; although he thinks it is ultimately unstable, and a more sharply polarized species-conservative / techno-futurist conflict is bound to eventually absorb it. </p>
<p>It&#8217;s also interesting to see Nyan describe himself as a &#8220;futurist Techcom&#8221;. That&#8217;s new, isn&#8217;t it?</p>
]]></content:encoded>
			<wfw:commentRss>http://www.xenosystems.net/quote-note-125/feed/</wfw:commentRss>
		<slash:comments>66</slash:comments>
		</item>
		<item>
		<title>Gigadeath War</title>
		<link>http://www.xenosystems.net/gigadeath-war/</link>
		<comments>http://www.xenosystems.net/gigadeath-war/#comments</comments>
		<pubDate>Fri, 22 Aug 2014 11:12:17 +0000</pubDate>
		<dc:creator><![CDATA[admin]]></dc:creator>
		<category><![CDATA[Technology]]></category>
		<category><![CDATA[World]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Futurism]]></category>
		<category><![CDATA[Horror]]></category>
		<category><![CDATA[Politics]]></category>
		<category><![CDATA[War]]></category>

		<guid isPermaLink="false">http://www.xenosystems.net/?p=3370</guid>
		<description><![CDATA[Hugo de Garis argues (consistently) that controversy over permitted machine intelligence development will inevitably swamp all other political conflicts. (Here&#8217;s a video discussion of the thesis.) Given the epic quality of the scenario, and its basic plausibility, it has remained strangely marginalized up to this point. The component pieces seem to be falling into place. [&#8230;]]]></description>
				<content:encoded><![CDATA[<p>Hugo de Garis <a href="http://agi-conf.org/2008/artilectwar.pdf">argues</a> (consistently) that controversy over permitted machine intelligence development will inevitably swamp all other political conflicts. (<a href="https://www.youtube.com/watch?v=lEaAidCmxus">Here</a>&#8217;s a video discussion of the thesis.) Given the epic quality of the scenario, and its basic plausibility, it has remained strangely marginalized up to this point. The component pieces seem to be falling into place. The true element of genius in this futurist construction is <em>preemption</em>. The more one digs into that, the more twistedly dynamic it looks.</p>
<p>Among the many thought-provoking elements:</p>
<p>(1) Slow take-off is especially ominous for the de Garis model (in stark contrast to FAI arguments). The slower the process, the more time for ideological consolidation, incremental escalation, and preparation for violent confrontation.</p>
<p>(2) AI doesn&#8217;t even have to be possible for this scenario to unfold (it only has to be credible as a threat). </p>
<p>(3) De Garis&#8217; &#8216;Cosmist-Terran&#8217; division chops up familiar political spectra at strange angles. (Both NRx and the Ultra-Left contain the full C-T spectrum internally.)</p>
<p>(4) Terrans have to strike first, or lose. That asymmetry shapes everything.</p>
<p>(5) Impending Gigadeath War surely deserves a place on any filled-out horrorism list. </p>
<p><a href="http://www.xenosystems.net/wp-content/uploads/2014/08/nuclear-war-global-impacts_32431_600x450.jpg"><img src="http://www.xenosystems.net/wp-content/uploads/2014/08/nuclear-war-global-impacts_32431_600x450.jpg" alt="nuclear-war-global-impacts_32431_600x450" width="600" height="371" class="alignnone size-full wp-image-3373" /></a></p>
<p>De Garis&#8217; <a href="http://profhugodegaris.wordpress.com/">site</a>.</p>
<p>(Some topic preemption at <em>Outside in</em> <a href="http://www.xenosystems.net/the-way-of-the-worm/">here</a>.)</p>
]]></content:encoded>
			<wfw:commentRss>http://www.xenosystems.net/gigadeath-war/feed/</wfw:commentRss>
		<slash:comments>19</slash:comments>
		</item>
	</channel>
</rss>
