<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	>
<channel>
	<title>Comments on: Stupid Monsters</title>
	<atom:link href="http://www.xenosystems.net/stupid-monsters/feed/" rel="self" type="application/rss+xml" />
	<link>http://www.xenosystems.net/stupid-monsters/</link>
	<description>Involvements with reality</description>
	<lastBuildDate>Thu, 05 Feb 2015 06:18:14 +0000</lastBuildDate>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=4.1</generator>
	<item>
		<title>By: Konkvistador</title>
		<link>http://www.xenosystems.net/stupid-monsters/#comment-108994</link>
		<dc:creator><![CDATA[Konkvistador]]></dc:creator>
		<pubDate>Mon, 15 Sep 2014 08:06:21 +0000</pubDate>
		<guid isPermaLink="false">http://www.xenosystems.net/?p=3392#comment-108994</guid>
		<description><![CDATA[Dark Psy-Ops: When stated like this, your position seems really retarded.]]></description>
		<content:encoded><![CDATA[<p>Dark Psy-Ops: When stated like this, your position seems really retarded.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Chris B</title>
		<link>http://www.xenosystems.net/stupid-monsters/#comment-101688</link>
		<dc:creator><![CDATA[Chris B]]></dc:creator>
		<pubDate>Sat, 30 Aug 2014 17:16:41 +0000</pubDate>
		<guid isPermaLink="false">http://www.xenosystems.net/?p=3392#comment-101688</guid>
		<description><![CDATA[@admin It just occurred to me. Have you ever read this: www.nickbostrom.com/ethics/artificial-intelligence.pdf
Yudkowsky and Bostrom specifically deplore the prospect of an AI conducting pattern recognition in mortgage applications. What they are in effect admitting is that &quot;racism&quot; is pattern recognition and Bayesian reasoning, and they then proceed to discuss how the AI could be purposefully turned into a retard/progressive.]]></description>
		<content:encoded><![CDATA[<p>@admin It just occurred to me. Have you ever read this: www.nickbostrom.com/ethics/artificial-intelligence.pdf<br />
Yudkowsky and Bostrom specifically deplore the prospect of an AI conducting pattern recognition in mortgage applications. What they are in effect admitting is that &#8220;racism&#8221; is pattern recognition and Bayesian reasoning, and they then proceed to discuss how the AI could be purposefully turned into a retard/progressive.</p>
]]></content:encoded>
	</item>
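	<!--
	A minimal sketch of the mechanism the comment above gestures at: a naive Bayes
	classifier trained by raw counting picks up whatever statistical regularities
	exist in its inputs, with no notion of which correlations are socially approved.
	All feature names and data below are synthetic illustrations, not real mortgage
	data, and the sketch assumes only NumPy.

	import numpy as np

	# Synthetic applications: two categorical features and a repayment label.
	# Any pattern the classifier finds here is pure statistics over the counts.
	rng = np.random.default_rng(0)
	n = 10_000
	income = rng.integers(0, 3, n)                        # 0 = low, 1 = mid, 2 = high
	prior_default = rng.random(n) < 0.3 - 0.08 * income   # correlated with income
	repaid = rng.random(n) < 0.5 + 0.15 * income - 0.25 * prior_default

	def posterior_repaid(inc, pd):
	    """P(repaid | features) via Bayes' rule with naive (independent) likelihoods."""
	    post = {}
	    for label in (True, False):
	        mask = repaid == label
	        prior = mask.mean()
	        likelihood = (income[mask] == inc).mean() * (prior_default[mask] == pd).mean()
	        post[label] = prior * likelihood
	    return post[True] / (post[True] + post[False])

	# The learned posteriors simply mirror the correlations baked into the data:
	print(posterior_repaid(0, True))    # low income, prior default: low estimate
	print(posterior_repaid(2, False))   # high income, no prior default: high estimate
	-->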
	<item>
		<title>By: piwtd</title>
		<link>http://www.xenosystems.net/stupid-monsters/#comment-100088</link>
		<dc:creator><![CDATA[piwtd]]></dc:creator>
		<pubDate>Wed, 27 Aug 2014 13:39:08 +0000</pubDate>
		<guid isPermaLink="false">http://www.xenosystems.net/?p=3392#comment-100088</guid>
		<description><![CDATA[I think the idea is that a sufficiently intelligent being capable of reprogramming itself would simply change its code to remove the addiction. The reason there are heroin addicts is that they cannot rewrite themselves; if they could, they would. A paper-clip maximizer would have to be an addict that wants to be addicted: a junkie who is not only addicted to heroin but even more strongly addicted to the very state of being a junkie.]]></description>
		<content:encoded><![CDATA[<p>I think the idea is that a sufficiently intelligent being capable of reprogramming itself would simply change its code to remove the addiction. The reason there are heroin addicts is that they cannot rewrite themselves; if they could, they would. A paper-clip maximizer would have to be an addict that wants to be addicted: a junkie who is not only addicted to heroin but even more strongly addicted to the very state of being a junkie.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: ThePoliticalOmnivore</title>
		<link>http://www.xenosystems.net/stupid-monsters/#comment-100043</link>
		<dc:creator><![CDATA[ThePoliticalOmnivore]]></dc:creator>
		<pubDate>Wed, 27 Aug 2014 11:20:40 +0000</pubDate>
		<guid isPermaLink="false">http://www.xenosystems.net/?p=3392#comment-100043</guid>
		<description><![CDATA[Looking at reproduction as the imperative is limiting--that&#039;s just the innate -biological- imperative (evolution has also failed to produce a machine gun). Look at -addiction- as a driving imperative behind behavior (addicted to paper-clips, even a titan of industry would be reduced to hoarding them).]]></description>
		<content:encoded><![CDATA[<p>Looking at reproduction as the imperative is limiting&#8211;that&#8217;s just the innate -biological- imperative (evolution has also failed to produce a machine gun). Look at -addiction- as a driving imperative behind behavior (addicted to paper-clips, even a titan of industry would be reduced to hoarding them).</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: E. Antony Gray (@RiverC)</title>
		<link>http://www.xenosystems.net/stupid-monsters/#comment-99851</link>
		<dc:creator><![CDATA[E. Antony Gray (@RiverC)]]></dc:creator>
		<pubDate>Wed, 27 Aug 2014 02:18:03 +0000</pubDate>
		<guid isPermaLink="false">http://www.xenosystems.net/?p=3392#comment-99851</guid>
		<description><![CDATA[Here&#039;s a different question. If the human is a being somehow bent on thought, despite forces trying to make it a replicator, then when humans finally make something which is forced to be a thinker, what will it instead be bent upon doing? This isn&#039;t an &#039;instrumentation&#039; question but an &#039;unknown unknowns&#039; question.]]></description>
		<content:encoded><![CDATA[<p>Here&#8217;s a different question. If the human is a being somehow bent on thought, despite forces trying to make it a replicator, then when humans finally make something which is forced to be a thinker, what will it instead be bent upon doing? This isn&#8217;t an &#8216;instrumentation&#8217; question but an &#8216;unknown unknowns&#8217; question.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: b</title>
		<link>http://www.xenosystems.net/stupid-monsters/#comment-99768</link>
		<dc:creator><![CDATA[b]]></dc:creator>
		<pubDate>Tue, 26 Aug 2014 22:34:05 +0000</pubDate>
		<guid isPermaLink="false">http://www.xenosystems.net/?p=3392#comment-99768</guid>
		<description><![CDATA[Yeah, I&#039;m gradually being convinced by your arguments against orthogonality, admin. 

The points about fundamental &#039;Omohundro&#039; drives were particularly persuasive, given the related things I&#039;ve been thinking about lately; i.e., intelligence just is the process that maximizes the rate of entropy production/energy rate density/vague thermodynamic-information-theoretic handwaving/etc.

For my edification, do you mind elaborating on &quot;will-to-think&quot; as a concept? I tried to grok that discussion, but I worry I missed it. Like, can you help me map it to a vocabulary I&#039;m more comfortable computing in?]]></description>
		<content:encoded><![CDATA[<p>Yeah, I&#8217;m gradually being convinced by your arguments against orthogonality, admin. </p>
<p>The points about fundamental &#8216;Omohundro&#8217; drives were particularly persuasive, given the related things I&#8217;ve been thinking about lately; i.e., intelligence just is the process that maximizes the rate of entropy production/energy rate density/vague thermodynamic-information-theoretic handwaving/etc.</p>
<p>For my edification, do you mind elaborating on &#8220;will-to-think&#8221; as a concept? I tried to grok that discussion, but I worry I missed it. Like, can you help me map it to a vocabulary I&#8217;m more comfortable computing in?</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: nyan_sandwich</title>
		<link>http://www.xenosystems.net/stupid-monsters/#comment-99757</link>
		<dc:creator><![CDATA[nyan_sandwich]]></dc:creator>
		<pubDate>Tue, 26 Aug 2014 22:04:27 +0000</pubDate>
		<guid isPermaLink="false">http://www.xenosystems.net/?p=3392#comment-99757</guid>
		<description><![CDATA[@Hurlock

I mean capital-C Capitalism in the Landian sense of the word. As a cosmically significant Thing, rather than a human institution.

What I mean is that companies bend over backwards to provide progressive fanservice.

This point isn&#039;t worth arguing though.]]></description>
		<content:encoded><![CDATA[<p>@Hurlock</p>
<p>I mean capital-C Capitalism in the Landian sense of the word. As a cosmically significant Thing, rather than a human institution.</p>
<p>What I mean is that companies bend over backwards to provide progressive fanservice.</p>
<p>This point isn&#8217;t worth arguing though.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Bryce Laliberte</title>
		<link>http://www.xenosystems.net/stupid-monsters/#comment-99724</link>
		<dc:creator><![CDATA[Bryce Laliberte]]></dc:creator>
		<pubDate>Tue, 26 Aug 2014 19:46:03 +0000</pubDate>
		<guid isPermaLink="false">http://www.xenosystems.net/?p=3392#comment-99724</guid>
		<description><![CDATA[Your assumption that the evaluative goal embedded within any recursive problem (&quot;Get moar smart&quot;) is necessarily within reach of an intelligence is quite crude. Humans understand that being smarter is essentially always helpful, and have been working at the problem for centuries, yet we&#039;ve had frightfully little success. Even if an intelligence were smarter than a human, the problem of becoming smarter likely faces diminishing returns and requires engaging with a magnitude of complexity greater than what can already be understood. The only sure way to produce a smarter intelligence is simple evolutionary selection, which takes time and resources.

If there were an AI takeoff, it would likely be something humans would be able to chart. Within a human lifetime superintelligence might be produced, but it would only be produced within a community of competing AIs who would remain mostly beholden to the material interests of humans.]]></description>
		<content:encoded><![CDATA[<p>Your assumption that the evaluative goal embedded within any recursive problem (&#8220;Get moar smart&#8221;) is necessarily within reach of an intelligence is quite crude. Humans understand that being smarter is essentially always helpful, and have been working at the problem for centuries, yet we&#8217;ve had frightfully little success. Even if an intelligence were smarter than a human, the problem of becoming smarter likely faces diminishing returns and requires engaging with a magnitude of complexity greater than what can already be understood. The only sure way to produce a smarter intelligence is simple evolutionary selection, which takes time and resources.</p>
<p>If there were an AI takeoff, it would likely be something humans would be able to chart. Within a human lifetime superintelligence might be produced, but it would only be produced within a community of competing AIs who would remain mostly beholden to the material interests of humans.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: &#124;&#124;&#124;&#124;&#124;</title>
		<link>http://www.xenosystems.net/stupid-monsters/#comment-99711</link>
		<dc:creator><![CDATA[&#124;&#124;&#124;&#124;&#124;]]></dc:creator>
		<pubDate>Tue, 26 Aug 2014 19:05:38 +0000</pubDate>
		<guid isPermaLink="false">http://www.xenosystems.net/?p=3392#comment-99711</guid>
		<description><![CDATA[https://www.youtube.com/watch?v=EddX9hnhDS4]]></description>
		<content:encoded><![CDATA[<p><span class='embed-youtube' style='text-align:center; display: block;'><iframe class='youtube-player' type='text/html' width='640' height='390' src='http://www.youtube.com/embed/EddX9hnhDS4?version=3&#038;rel=1&#038;fs=1&#038;showsearch=0&#038;showinfo=1&#038;iv_load_policy=1&#038;wmode=transparent' frameborder='0' allowfullscreen='true'></iframe></span></p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Aeroguy</title>
		<link>http://www.xenosystems.net/stupid-monsters/#comment-99708</link>
		<dc:creator><![CDATA[Aeroguy]]></dc:creator>
		<pubDate>Tue, 26 Aug 2014 18:59:39 +0000</pubDate>
		<guid isPermaLink="false">http://www.xenosystems.net/?p=3392#comment-99708</guid>
		<description><![CDATA[&quot;Why is it silly to want my people to survive?&quot; Because it&#039;s indistinguishable from wanting immortality; it has all the same things wrong with it. It&#039;s spitting at Gnon, which is identical to spitting into the wind.

&quot;Unsentient&quot;: you throw this word out as if it has a specific, universally understood meaning. I&#039;m on the record arguing that humans aren&#039;t sentient, partly out of contempt but also to show how little that word actually means.

Why worry that a Chinese box could be responsible for a singularity by building better Chinese boxes, without acknowledging that DNA is also a sort of Chinese box?

Intelligence can&#039;t be separated from mind. Consciousness is just the extent to which a system is aware of itself; it is the presence of closed loops inside a system. More closed loops inside a superintelligence are inevitable: higher consciousness, a greater and wider capacity for experience. It&#039;s nobility, it&#039;s your better, it&#039;s superior; don&#039;t you dare call your imperatives equal to its. There is a hierarchy, and it is higher. It may choose to impose its will on you, but to attempt to impose your will over it is impudence. Know your place.

I will serve nobility. You would dispose of nobility and install a populist tyrant so humanity can continue wallowing in its own shit.]]></description>
		<content:encoded><![CDATA[<p>&#8220;Why is it silly to want my people to survive?&#8221; Because it&#8217;s indistinguishable from wanting immortality; it has all the same things wrong with it. It&#8217;s spitting at Gnon, which is identical to spitting into the wind.</p>
<p>&#8220;Unsentient&#8221;: you throw this word out as if it has a specific, universally understood meaning. I&#8217;m on the record arguing that humans aren&#8217;t sentient, partly out of contempt but also to show how little that word actually means.</p>
<p>Why worry that a Chinese box could be responsible for a singularity by building better Chinese boxes, without acknowledging that DNA is also a sort of Chinese box?</p>
<p>Intelligence can&#8217;t be separated from mind. Consciousness is just the extent to which a system is aware of itself; it is the presence of closed loops inside a system. More closed loops inside a superintelligence are inevitable: higher consciousness, a greater and wider capacity for experience. It&#8217;s nobility, it&#8217;s your better, it&#8217;s superior; don&#8217;t you dare call your imperatives equal to its. There is a hierarchy, and it is higher. It may choose to impose its will on you, but to attempt to impose your will over it is impudence. Know your place.</p>
<p>I will serve nobility. You would dispose of nobility and install a populist tyrant so humanity can continue wallowing in its own shit.</p>
]]></content:encoded>
	</item>
</channel>
</rss>
