# Technological singularity



## shadeguy (Mar 28, 2013)

Did anyone else come to have a daily, hidden hope that at some point in this lifetime technological advances would make their lives (well, everyone's) much better (cures for disease and depression, a much longer lifespan, and other improvements)?

To those who don't know, the idea of the technological singularity is that the level of technology (in terms of computational power, the rate of technological change and other factors) is rising at an exponential rate, and thus at an accelerating speed over time. The idea itself is not new, though it seems to me that interest has been increasing in recent years in claims that in the coming decades we are going to witness radical changes, with computational power far surpassing the human brain (the most famous of the 'futurists' making such claims is probably Ray Kurzweil).

I've got to admit that while some of these claims seem to me exaggerated or overoptimistic (a super-advanced technology may well destroy the earth, something nuclear weaponry has been capable of for decades now), I have been reading about this quite a lot in recent years, and I must say that the idea of far-reaching changes directly affecting our lives in the coming years seems far from obscure to me. I am writing about this because the truth is that the realization that such changes (hopefully good ones) may well happen, and may lead to things like a 'cure' for aging in our own lifetime (you can read about Aubrey de Grey on that), is quite honestly the most important realization I have had in my life. It was the moment I told myself that I should start to take care of my health and my life, that I should 'survive' because this life is worth living, because there is hope. Maybe even hope that it is more than being born, living some two-digit number of years or maybe even 100, and then dying, with everything you ever thought or felt crumbling along with your brain. It somehow broke a barrier in my mind and told me that getting out of my isolation (social, though I guess mental in a more general way) is not only something that I want but something that I must actively do, right now and at every moment after. It's probably something I should have realized much earlier and for many other reasons, but this is how it came to me.

I very rarely write anything as long as this other than for my studies, with the intention that other people will read it. I guess it would be interesting to hear from others who are as fascinated with this subject as I am.


----------



## NeuromorPhish (Oct 11, 2012)

These are some interesting thoughts you've got... 
When I first came across the ideas of transhumanism, I felt they weren't worth pursuing, especially the idea of human life extension. Why extend the misery? Why not spend those resources on curing diseases and mental illness instead? But then I realized that medicine and life extension aren't mutually exclusive, and that by finding cures for age-related degenerative diseases, lifespan increases automatically. "Age" in itself isn't a cause of death.

As for the technological singularity and the predictions made by Ray Kurzweil... I haven't read all of the claims, but I've heard software engineers tend to disagree with him.

I'm not sure what to make of these futurists and their techno-optimism. It's very interesting, yet I find myself skeptical of speculative claims about the future.


----------



## nullptr (Sep 21, 2012)

NeuromorPhish said:


> These are some interesting thoughts you've got...
> When I first came across the ideas of transhumanism, I felt they weren't worth pursuing, especially the idea of human life extension. Why extend the misery? Why not spend those resources on curing diseases and mental illness instead? But then I realized that medicine and life extension aren't mutually exclusive, and that by finding cures for age-related degenerative diseases, lifespan increases automatically. "Age" in itself isn't a cause of death.
> 
> As for the technological singularity and the predictions made by Ray Kurzweil... I haven't read all of the claims, but I've heard software engineers tend to disagree with him.
> ...


They do? Why?


----------



## NeuromorPhish (Oct 11, 2012)

nullptr said:


> They do? Why?


http://spectrum.ieee.org/computing/software/ray-kurzweils-slippery-futurism


> Kurzweil is confident, for instance, that by 2029 researchers, having reverse engineered the human brain, will build an AI that can pass as human. (He has a US $20 000 bet to that effect with computing pioneer Mitchell Kapor riding at the Long Bets Web site.) Neuroscientists, AI researchers, and others have objected that no one today has more than the faintest idea of how to accomplish these feats and that his time line is highly unrealistic. Kurzweil dismisses all such objections: The obstacles will undoubtedly melt away in the face of Moore's Law and the unstoppable acceleration of technology.


----------



## Arthur Pendragon (Mar 17, 2013)

shadeguy said:


> Maybe even hope that it is more than being born, living some two-digit number of years or maybe even 100, and then dying, with everything you ever thought or felt crumbling along with your brain.


Currently, I'd put your hopes at 120 years, as that is the consensus limit without radically reprogramming the structure of the neuron. I can tell you that you are not the only one who believes this won't remain science fiction.


----------



## huh (Mar 19, 2007)

Yeah, Ray Kurzweil is somewhat of an over-optimistic nutbag. I started reading his book "The Singularity Is Near" but I couldn't make it past the first 50 pages because so many of his predictions and claims seemed ridiculous. This is also a guy who takes hundreds (roughly 150-250?) of vitamin pills each day in hopes of living long enough that our sufficiently advanced technology will allow him to live for a huge stretch of time. Talk about some expensive urine... lulz.


----------



## shadeguy (Mar 28, 2013)

huh said:


> Yeah, Ray Kurzweil is somewhat of an over-optimistic nutbag. I started reading his book "The Singularity Is Near" but I couldn't make it past the first 50 pages because so many of his predictions and claims seemed ridiculous. This is also a guy who takes hundreds (roughly 150-250?) of vitamin pills each day in hopes of living long enough that our sufficiently advanced technology will allow him to live for a huge stretch of time. Talk about some expensive urine... lulz.


I have been wondering about his aggressive supplement routine. From what I see, it's not even clear that taking a few supplements is beneficial. There are still many things that we don't know about the human body and how it works; the simple logic of 'take vitamins & minerals because they are good for you' has been proven wrong many times. It's probably best for now to avoid such radical steps unless someone is sick and is told by experts that they should take them.


----------



## enfield (Sep 4, 2010)

there are different definitions of, or schools of thought on, the technological singularity. some may be more meaningful than others - in particular, the difference between stressing the idea of an intelligence explosion and the broader idea of accelerating change and technological progress, and what those may mean.
http://yudkowsky.net/singularity/schools



shadeguy said:


> Did anyone else come to have a daily, hidden hope that at some point in this lifetime technological advances would make their lives (well, everyone's) much better (cures for disease and depression, a much longer lifespan, and other improvements)?
> 
> To those who don't know, the idea of the technological singularity is that the level of technology (in terms of computational power, the rate of technological change and other factors) is rising at an exponential rate, and thus at an accelerating speed over time. The idea itself is not new, though it seems to me that interest has been increasing in recent years in claims that in the coming decades we are going to witness radical changes, with computational power far surpassing the human brain (the most famous of the 'futurists' making such claims is probably Ray Kurzweil).
> 
> ...


i have been reading about it for the last few years as well. it will be dismissed as the rapture of the nerds (lul) or for being ultra-geeky, but i don't know how those things could be further from the truth. it may be the single most meaningful thing i have ever come across - on the one hand the real threat of human extinction it presents, and on the other, the possibility of transcendence for all of us, freedom from so many of the evolutionary constraints imposed on us - both potentially set to happen in the near future, maybe even in our own lifetimes! if that is true, and the more you investigate and the more smart people you see migrate to the topic the more one feels it may be, then it must be paid attention to. it must get at least a fraction of the attention it deserves (so that the chances of things going wrong, like humans becoming an extinct species, are reduced).


----------



## ThePeon (Sep 13, 2012)

Ray Kurzweil is a guy who is _really_ afraid of death.

In many ways, his depiction of the singularity is almost religious in nature.


----------



## Arthur Pendragon (Mar 17, 2013)

shadeguy said:


> the simple logic of 'take vitamins & minerals because they are good for you' has been proven wrong many times. It's probably best for now to avoid such radical steps unless someone is sick and is told by experts that they should take them.


When looking at these proofs, it is important to understand the model organism and the quantities used. Anything in large amounts, including water, will KILL you. However, our body does have a flaw where oxidation can cause unpredictable reactions, while at the same time requiring oxidation to perform other reactions. Therefore antioxidants, like most beneficial things, are good in moderation.

This brings up another point, however. What if you just stay away from anything that brings oxidation or is cancer-related, resulting in a stress-free environment? The fact is that our body has regulation and repair mechanisms that actually BENEFIT from stress, and unless you plan on living a life with no stress at all (zero reactivity), even stress is good in moderation.


----------



## ugh1979 (Aug 27, 2010)

huh said:


> Yeah, Ray Kurzweil is somewhat of an over-optimistic nutbag. I started reading his book "The Singularity Is Near" but I couldn't make it past the first 50 pages because so many of his predictions and claims seemed ridiculous. This is also a guy who takes hundreds (roughly 150-250?) of vitamin pills each day in hopes of living long enough that our sufficiently advanced technology will allow him to live for a huge stretch of time. Talk about some expensive urine... lulz.


Some of his predictions may seem far-fetched but he's got a very good track record.

I wouldn't call him a nutbag, and neither does Google, who recently made him director of engineering with almost unlimited funds.

I think it's great there is someone with the confidence and ideas like Kurzweil at the helm of Google engineering. I couldn't think of a better place for him. We can only begin to imagine the products they will bring out in the future.


----------



## ugh1979 (Aug 27, 2010)

enfield said:


> there are different definitions of, or schools of thought on, the technological singularity. some may be more meaningful than others - in particular, the difference between stressing the idea of an intelligence explosion and the broader idea of accelerating change and technological progress, and what those may mean.
> http://yudkowsky.net/singularity/schools
> 
> i have been reading about it for the last few years as well. it will be dismissed as the rapture of the nerds (lul) or for being ultra-geeky, but i don't know how those things could be further from the truth. it may be the single most meaningful thing i have ever come across - on the one hand the real threat of human extinction it presents, and on the other, the possibility of transcendence for all of us, freedom from so many of the evolutionary constraints imposed on us - both potentially set to happen in the near future, maybe even in our own lifetimes! if that is true, and the more you investigate and the more smart people you see migrate to the topic the more one feels it may be, then it must be paid attention to. it must get at least a fraction of the attention it deserves (so that the chances of things going wrong, like humans becoming an extinct species, are reduced).


Likewise, I'm a big supporter of these ideas. I read about the developments on a near-daily basis, as I find it all so interesting and exciting.

I highly recommend this website for great infographics on what's on the horizon: http://envisioningtech.com/

Their Twitter feed is worth following as well: @envisioningtech

New Scientist is great for keeping up to date with the latest developments as well.


----------



## huh (Mar 19, 2007)

ugh1979 said:


> Some of his predictions may seem far-fetched but he's got a very good track record.
> 
> I wouldn't call him a nutbag, and neither does Google, who recently made him director of engineering with almost unlimited funds.
> 
> I think it's great there is someone with the confidence and ideas like Kurzweil at the helm of Google engineering. I couldn't think of a better place for him. We can only begin to imagine the products they will bring out in the future.


I have a hard time taking most futurists seriously. The fact that Google hired him really doesn't sway my opinion. He may be intelligent about some things, but I think he's overly optimistic and short-sighted in some respects, especially when it comes to biology and the human brain. He is not a biologist/neuroscientist, and it definitely shows. Perhaps if he were, he would be a bit more nuanced/hesitant/cautious with his predictions and claims.

His vitamin/supplement regimen certainly doesn't help bolster my confidence in his clear-thinking rationality either.


----------



## whattothink (Jun 2, 2005)

Kurzweil is a bit crazy and this particular idea is dubious. Credit to his genius, of course.


----------



## ugh1979 (Aug 27, 2010)

huh said:


> I have a hard time taking most futurists seriously. The fact that Google hired him really doesn't sway my opinion. He may be intelligent about some things, but I think he's overly optimistic and short-sighted in some respects, especially when it comes to biology and the human brain. He is not a biologist/neuroscientist, and it definitely shows. Perhaps if he were, he would be a bit more nuanced/hesitant/cautious with his predictions and claims.
> 
> His vitamin/supplement regimen certainly doesn't help bolster my confidence in his clear-thinking rationality either.


I'll give you possibly over-optimistic, but he's always changing his predictions in line with current technology so I have no problem with it.

In what respect would you say he is short-sighted though? Being short-sighted is not something i'd ever associate with futurists.

I understand what you are saying about his lack of knowledge in certain areas making some predictions seem dubious, but he obviously consults people who do know the relevant fields. It's not like either of us knows any better, so are we even qualified to be dubious?

I also understand why his confident predictions irk those who prefer to side with caution and not commit, but as I've said, I like that he is bold enough to make them. It does no harm to anyone and if anything helps put pressure on the development work he does.

I'd far rather the director of engineering at Google were someone like Kurzweil than some less ambitious, cautious person.


----------



## Arthur Pendragon (Mar 17, 2013)

huh said:


> I have a hard time taking most futurists seriously. The fact that Google hired him really doesn't sway my opinion. He may be intelligent about some things, but I think he's overly optimistic and short-sighted in some respects, especially when it comes to biology and the human brain. He is not a biologist/neuroscientist, and it definitely shows. Perhaps if he were, he would be a bit more nuanced/hesitant/cautious with his predictions and claims.
> 
> His vitamin/supplement regimen certainly doesn't help bolster my confidence in his clear-thinking rationality either.


This is exactly why you shouldn't trust pop science articles and experts. The only way to establish true information is to delve through scientific journals and look at the raw data and methodologies yourself.


----------



## davidc (Nov 20, 2008)

I don't like to think about it. If some sort of super AI is invented, then what is the point of all of us? The idea makes me feel like I should give up - just spend all my time watching TV, waiting for a technological apocalypse to arrive.


----------



## ugh1979 (Aug 27, 2010)

davidc said:


> I don't like to think about it. If some sort of super AI is invented, then what is the point of all of us? The idea makes me feel like I should give up - just spend all my time watching TV, waiting for a technological apocalypse to arrive.


How could the super AI be invented without us? The point of anything is often subjective, though. We make our own points and reasons for living.

Maybe it's the next stage of the evolution of our species/intelligence?

Maybe mankind as we know it is on the cusp of moving from very slow biological evolution, over hundreds of thousands of years, to technologically augmented evolution which takes thousands of years, or fewer, to show significant change? The super AI of tomorrow could be the future "us".

Let's hope, of course, that it all happens gradually, via peaceful means, rather than through any apocalypse.


----------



## whattothink (Jun 2, 2005)

ugh1979 said:


> How could the super AI be invented without us? The point of anything is often subjective, though. We make our own points and reasons for living.
> 
> Maybe it's the next stage of the evolution of our species/intelligence?
> 
> ...


I doubt these super-intelligent robots would account for the sentiment of humans in achieving this alleged goal of advancement (control of the universe?). They'd conceive of feeling (notably, for us, pain) as arbitrary and inefficient in the goal of, what - propagating? Controlling the physical? And what would be the fun in that? We'd be eradicated, and this blunder of ours, which would not have sentience, would conceivably expand into space for as long as time allows.

Something people like you fail to really acknowledge, but are nonetheless still governed by, is the truth that we are simple creatures who need petty struggles to flourish. We're not on some lofty mission towards conquering the universe; we're insignificant ants behind a pane of glass, and without that system that created us, we're hopelessly lost.


----------



## ugh1979 (Aug 27, 2010)

whattothink said:


> I doubt these super-intelligent robots would account for the sentiment of humans in achieving this alleged goal of advancement (control of the universe?). They'd conceive of feeling (notably, for us, pain) as arbitrary and inefficient in the goal of, what - propagating? Controlling the physical? And what would be the fun in that? We'd be eradicated, and this blunder of ours, which would not have sentience, would conceivably expand into space for as long as time allows.


You have taken a very us (as we are now) and them (of the distant future) approach to this, with dystopian pessimism and an anthropocentric view that no future intelligence could ever be as "good" as we are. Why wouldn't or shouldn't they have feelings? I'm inclined to think feelings are crucial to advanced, inevitably social, intelligence. Feelings evolved in us, so why not in any other higher intelligence? The very fact that we could be largely responsible for shaping this new intelligence, and potentially integrating with it, means it will surely be very human-like and capable of feelings just like our own. Maybe it will be different, but maybe it will be "better". I can think of feelings I wish humans didn't have, or at least not so intensely (jealousy, hate, etc.). Certain feelings often drive anti-social behaviour.



> Something people like you fail to really acknowledge, but are nonetheless still governed by, is the truth that we are simple creatures who need petty struggles to flourish.


The term "simple" in this context is highly relative, so not really worth dwelling on, but why do you think we need conflict to flourish? What a strange thing to say.

We've all been far better off in times of peace than war.



> We're not on some lofty mission towards conquering the universe; we're insignificant ants behind a pane of glass


I think conquering the universe is beyond even the biggest current optimists' dreams, but we are clearly wired to explore, colonise and expand our boundaries. However, we as our current species might not be able to do that very effectively in the next stage (space) due to the limitations and issues of our minds and bodies.

Maybe it's a job that a transhuman or post human species can better achieve.

There were many species of human that came before us, one of which in time evolved into Homo sapiens as we are now, and there may well be one or possibly many after. Our ancestor species paved the way for the advancements and proliferation we have achieved, so maybe we should be doing what we can to ensure our species' lineage doesn't become extinct, continues the advancement, and keeps the fire burning, even if that means evolving beyond our current species.

It would be a great legacy for us as we are now to leave.



> and without that system that created us, we're hopelessly lost.


What system?


----------



## thebadshepard (Oct 13, 2012)

So strange - I am staying alive with the vague hope of a decent life, with the wonders of technology and science in mind. Whenever I get suicidal I remind myself that although this species is generally disgusting, there is a select group of compassionate, intelligent scientists who make it worth sticking around. Technology could bring us to a new plane and erase most suffering if we stop and think before destroying ourselves. Despite the promise of the few scientists and intellectuals who truly advance the world, most nations are still run by despotic psychopaths and there is no indication that this may change. We are too stupid for our technology tbh.

Preferably, a benevolent and ultra-intelligent supercomputer could direct us to prosperity and peace (of course we'd maintain the ability to disagree with the computer and retain our capacity to run our own civilization).

peace


----------



## ugh1979 (Aug 27, 2010)

thebadshepard said:


> So strange - I am staying alive with the vague hope of a decent life, with the wonders of technology and science in mind. Whenever I get suicidal I remind myself that although this species is generally disgusting, there is a select group of compassionate, intelligent scientists who make it worth sticking around. Technology could bring us to a new plane and erase most suffering if we stop and think before destroying ourselves.


Indeed, back in the dark days when I was suicidal, I also found hope and a reason to live in the wonders of future technology and how it could potentially improve my life.



> Despite the promise of the few scientists and intellectuals who truly advance the world, most nations are still run by despotic psychopaths and there is no indication that this may change. We are too stupid for our technology tbh.


It's true most of the world is still "stupid", but give it time. It was only a generation ago that we moved into the digital age, and one generation on cultural-change timescales is nothing. If in a hundred years or so there are still countries which haven't developed, then we should worry; until then we just need to help them help themselves. We're always changing, but some countries had, and have, advantages that let them change much more quickly than others.



> Preferably, a benevolent and ultra-intelligent supercomputer could direct us to prosperity and peace (of course we'd maintain the ability to disagree with the computer and retain our capacity to run our own civilization).
> 
> peace


Rather than one central ultra-intelligent supercomputer, it's far more likely there will be a network of ultra-intelligent AIs woven into the fabric of our society, assisting us via our personal devices etc.


----------



## shadeguy (Mar 28, 2013)

thebadshepard said:


> So strange - I am staying alive with the vague hope of a decent life, with the wonders of technology and science in mind. Whenever I get suicidal I remind myself that although this species is generally disgusting, there is a select group of compassionate, intelligent scientists who make it worth sticking around. Technology could bring us to a new plane and erase most suffering if we stop and think before destroying ourselves. Despite the promise of the few scientists and intellectuals who truly advance the world, most nations are still run by despotic psychopaths and there is no indication that this may change. We are too stupid for our technology tbh.
> 
> Preferably, a benevolent and ultra-intelligent supercomputer could direct us to prosperity and peace (of course we'd maintain the ability to disagree with the computer and retain our capacity to run our own civilization).
> 
> peace


I remember hearing that Stephen Hawking claimed that in the long run humanity can only survive if at least some of it migrates to space. Staying on this planet while we acquire increasingly powerful ways to destroy it will make it impossible to avoid the destruction of the human race.

It can really strike you for a moment, wondering why humanity doesn't unite all its efforts toward this goal of improving its future, especially when it's clear that technology could achieve enormous things. But then, research and technology get more attention and effort today than at any other time in history. Ultimately human beings, like all living things, are themselves biological robots driven by cause and effect, so we can't expect their evolutionarily inherited programming, which includes aggression and other 'primitive' impulses, to just disappear. But I suppose society is in transition just as technology is. So we should hold on; there is room for cautious optimism about the future.


----------



## Arthur Pendragon (Mar 17, 2013)

thebadshepard said:


> so strange, I am staying alive with the vague hope of a decent life and seeing the wonders of technology and science in mind. Whenever I get suicidal I remind myself that although this species is generally disgusting, there are a select group of compassionate, intelligent scientists who make it worth sticking around.


Interesting that you say this. The problem arises when you are part of that select group of scientists, and everybody is dependent on you.



shadeguy said:


> I remember hearing that Stephen Hawking claimed that in the long run humanity can only survive if at least some of it migrates to space. Staying on this planet while we acquire increasingly powerful ways to destroy it will make it impossible to avoid the destruction of the human race.


While it is possible that an Earth-bound civilization may reach a Malthusian catastrophe, I wouldn't say that the destruction of the human race is the only outcome. It is commonly thought that in the event of an apocalypse, the near-destruction of humanity would be followed by the rebuilding of civilization, along with a reversion to political systems of the Dark Ages. However, as the cost of supporting an additional human rises above the cost of restructuring our current biological dependencies, I believe it would be more efficient and practical to revamp population control policies.


----------

