DividedQuantum
Outer Head


A dystopian future?
#26578124 - 04/04/20 09:15 PM (3 years, 10 months ago)
It seems to me that in the future – certainly by thirty years from now – we will indeed have nationalized healthcare and a universal basic income system, but that this will have come at a rather sobering cost. People’s immediate needs will be covered by social programs, at no cost to them, but it strikes me that there will be an extremely powerful elite who own the fully automated industries and basically run the government. This system will compensate the population through a universal basic income – but I imagine it will not be very generous. So we will have a population that is free to do as it wishes, within reason, but one that will not have much spending power and that will be totally subservient to a system representing the grossest division of class and wealth in human history. This status quo will doubtless be very difficult to change, should anyone want to. And after a certain point – after A.I. is able to replace doctors, lawyers, judges, pilots, and what have you – everyone outside of the elite will be in a powerless position, and the elite, whoever they are, will be untouchable. It is difficult to imagine that this would not grow into a dark dystopia. But then, that raises the question of how long A.I. will need humans around…
How accurate or inaccurate do you feel this prediction to be? If the latter, in what direction do you see civilization going? Can we avoid an undesirable situation for society, or is there even any reason to worry?
-------------------- Vi Veri Universum Vivus Vici
Grapefruit
Freak in the forest


Likely for the US, because of the inflexibility of the political system and the extent to which money already constitutes political currency. Quite possible for Europe too, although, since it has a more authoritarian system with stronger political idealism, I think politicians there are less likely to sell out to capital.
Don't think it will happen to China, as politicians there seem pretty keen to keep hold of their authoritarian/totalitarian control and ideals. A lot of people already see it as very dystopian, and in many ways it is, but they have at least been able to keep their poor fed and sheltered impressively well for a country with a low per-capita GDP.
Russia has already sold out to gangsterism, and that may well outlast the fall of Putin; it remains to be seen.
With modern weaponry being very strong and able to keep revolution in check easily, I think every country will eventually either sell out to capital or revert to a totalitarian dictator state.
-------------------- Little left in the way of energy; or the way of love, yet happy to entertain myself playing mental games with the rest of you freaks until the rivers run backwards. "Chat your fraff Chat your fraff Just chat your fraff Chat your fraff"
laughingdog
Stranger

Re: A dystopian future? [Re: Grapefruit]
#26578608 - 04/05/20 02:27 AM (3 years, 10 months ago)
Well, DQ, it assumes surviving climate change and surviving more viruses (which are predicted), assumes no covid-19 accidents in nuclear submarines, and ignores what CRISPR may do. So I don't think any predictions are possible given the level of instability at present, but it may be as good a guess as any other scenario. Anything dystopian seems more likely than anything utopian. And should we survive all these wild cards, any remaining privacy with regard to the state seems sure to vanish.
As regards "free to do as it wishes, within reason": the devil is in the details of how that is defined. Seems you are imagining a rather gray world. What will keep the masses content? What sort of education will be available? You may have the start of a good sci-fi novel....
redgreenvines
irregular verb


in Star Trek they still have cities and vineyards; it can't be all bad
--------------------
_ 🧠 _
thealienthatategod
retrovertigo


what's worrying is that the dystopia may be worshiped. how many will willfully trap themselves inside a collective invisible prison? as the old physicality is ushered out, what ideologies will become dominant, given that a majority of humans don't even seem to know what they want?
industrial society has been manipulating the majority since it first established its social order. technological society easily manipulates the majority because the progress and consequences of technology are seen as necessities. the new equilibrium is adjusting humans themselves to fit the new order, so that those who choose to serve in the invisible prison do not lose faith in its progress. whatever last bit of local autonomy humans have will be lost for good.
laughingdog
Stranger

Quote:
thealienthatategod said: what's worrying is that the dystopia may be worshiped. how many will willfully trap themselves inside a collective invisible prison? as the old physicality is ushered out, what ideologies will become dominant, given that a majority of humans don't even seem to know what they want?
industrial society has been manipulating the majority since it first established its social order. technological society easily manipulates the majority because the progress and consequences of technology are seen as necessities. the new equilibrium is adjusting humans themselves to fit the new order, so that those who choose to serve in the invisible prison do not lose faith in its progress. whatever last bit of local autonomy humans have will be lost for good.
Seems like generally good insights.
As regards "so that those who choose to serve in the " - some might argue that, already, most don't consciously choose much of importance anyway. For example, most have the same religion as their parents, while others might join the army because it seems like "a good career move" (after the recruiter persuaded them, and because their uncle served), and end up becoming killers and returning wounded for life, and so on. Their "real" choices are about which rug matches the sofa. And some would go so far as to say the two-party system is only another way of providing illusory choice.
Edited by laughingdog (04/05/20 11:05 AM)
DividedQuantum
Outer Head


Quote:
thealienthatategod said: what's worrying is that the dystopia may be worshiped. how many will willfully trap themselves inside a collective invisible prison? as the old physicality is ushered out, what ideologies will become dominant, given that a majority of humans don't even seem to know what they want?
industrial society has been manipulating the majority since it first established its social order. technological society easily manipulates the majority because the progress and consequences of technology are seen as necessities. the new equilibrium is adjusting humans themselves to fit the new order, so that those who choose to serve in the invisible prison do not lose faith in its progress. whatever last bit of local autonomy humans have will be lost for good.
Yes, that's just how I see it. In accordance with your points, my suppositions above are just a logical continuation of processes that are already in place, and have been for a long time. Certainly, if things do evolve this way, one can only hope people accept it, because there won't be anything they can do about it at all. As I pointed out, as much as the elite have a stranglehold on power now, in a hyper-technological society in which nobody works and everyone gets a very small UBI check every month, the people in charge -- those who own the machines, run the government, and own the whole economy -- will be totally untouchable.
Indeed, it was the whole thrust of my original post that in such a world, whatever relative autonomy people have had to influence social affairs, in whatever small way, will be gone. If the powers that be decide to incentivize people's lives with some sort of capacity for hope and a belief in progress, that might be desirable for the masses, but of course it would be an impotent assent. After all, commerce has been about fabricating wants for a long time, so one would expect that to continue. But if there is no general assent among the populace, that is when we slip into dystopia.
-------------------- Vi Veri Universum Vivus Vici
Rahz
Alive Again



Quote:
DividedQuantum said: But then, that raises the question of how long A.I. will need humans around…
I was thinking: how long will the rich want the poor around? Limiting the population to less than a billion would solve so many environmental problems and reserve natural resources for many generations.
-------------------- rahz comfort pleasure power love truth awareness peace "You’re not looking close enough if you can only see yourself in people who look like you." —Ayishat Akanbi
laughingdog
Stranger

Quote:
DividedQuantum said: ...If the powers that be decide to incentivize people's lives with some sort of capacity for hope and a belief in progress, that might be desirable for the masses, but of course it would be an impotent assent. After all, commerce has been about fabricating wants for a long time, so one would expect that to continue. But if there is no general assent among the populace, that is when we slip into dystopia.
"To incentivize people's lives with some sort of capacity for hope" - that used to be a gold pocket watch and a pension. Now, instead of bread and circuses, we have TV with Jerry Springer, professional wrestling, and soap operas to distract the masses day to day. In Huxley's novel it was Prozac... whoops, I mean Soma. Neither Huxley nor Orwell dreamed up TV, and no one imagined what the iPhone would do. But what is now obvious is that humans are pretty easy to condition.
Seems you equate dystopia with rioting. I imagine others might equate dystopia with current conditions in Detroit, while expatriates might equate dystopia with the superficiality of the entire culture of the USA.
thealienthatategod
retrovertigo


yes, current ideologies exert so much power over people that it is not evident that other ideologies even exist, unless perhaps they are specifically presented. if people don't see a conflict, they also can't see a real choice.
the system believes that human will is imposing, arrogant, and unreasonable. the system regards a human being who does not act in conformity with its rules as irresponsible.
how do you present an ideology whose values can only be worked out in the destruction of the existing structure? until humans know what human needs are, their needs will continue to be manufactured.
it is easy to measure the progress of technology, but do we measure the progress of freedom the same way?
thealienthatategod
retrovertigo


Re: A dystopian future? [Re: Rahz]
#26579426 - 04/05/20 12:38 PM (3 years, 10 months ago)
maybe the human race will be better off if left to the mercy of A.I.
again, humans will probably accept the decisions of A.I., because the social order will show that A.I. decisions bring better results than human-made decisions.
laughingdog
Stranger

In a sense, freedom is a myth. Raising a family, the life purpose of the majority of humans, is biological programming, and all cultures and languages are likewise forms of conditioning. Of course, places like North Korea and prison are less free. But to be inflamed with a desire for total freedom is, again in a sense, just another attachment or limitation.
DividedQuantum
Outer Head


Re: A dystopian future? [Re: Rahz]
#26579489 - 04/05/20 01:09 PM (3 years, 10 months ago)
Quote:
Rahz said:
Quote:
DividedQuantum said: But then, that raises the question of how long A.I. will need humans around…
I was thinking: how long will the rich want the poor around? Limiting the population to less than a billion would solve so many environmental problems and reserve natural resources for many generations.
Yes, that is definitely a realistic consideration.
-------------------- Vi Veri Universum Vivus Vici
DividedQuantum
Outer Head


Quote:
laughingdog said:
Seems you equate dystopia with rioting. I imagine others might equate dystopia with current conditions in Detroit, while expatriates might equate dystopia with the superficiality of the entire culture of the USA.
Oh, I don't necessarily equate it with rioting, but more with helplessness. But indeed, dystopia is a relative term, and one could very reasonably equate it with the things you have mentioned. It is, of course, a rule that when things have gotten pretty bad, they usually get worse before they get better.
-------------------- Vi Veri Universum Vivus Vici
DividedQuantum
Outer Head


Quote:
thealienthatategod said: maybe the human race will be better off if left to the mercy of A.I.
again, humans will probably accept the decisions of A.I., because the social order will show that A.I. decisions bring better results than human-made decisions.
Well, as far as I can tell, as long as strategic nuclear weapons stockpiles exist to the degree that they do in several countries, the biggest long- or medium-term threat to our species is our own self-destruction. If the A.I. revolution happens to a transformative degree, the possibility that interests me most is the machine intelligence's ability to take human politicians' hands off the button permanently and totally, thus finally ending the possibility of nuclear holocaust. I think that would be a real salvation for the biosphere of the planet, and what happens after that is gravy as far as history is concerned.
-------------------- Vi Veri Universum Vivus Vici
Rahz
Alive Again



People have been saying since the '70s that the "beast" would be a computer. Sounds like hoping for an artificial God for salvation. A curious thought from the devil's advocate.
-------------------- rahz comfort pleasure power love truth awareness peace "You’re not looking close enough if you can only see yourself in people who look like you." —Ayishat Akanbi
DividedQuantum
Outer Head


Re: A dystopian future? [Re: Rahz]
#26579887 - 04/05/20 04:15 PM (3 years, 10 months ago)
Well, I don't think it has anything to do with concepts of God. I think it's a possibility that A.I. could be perfectly hostile, and it wouldn't be superpowerful at first anyway. It would just be a relief to me to know that the human capacity for total self-annihilation might be defused by a powerful computer, which would share our interest in not being destroyed. And the animals could go on living, too.
-------------------- Vi Veri Universum Vivus Vici
thealienthatategod
retrovertigo


maybe humanity will unite under the First Church of Artificial Intelligence:
Quote:
Levandowski expects that a super-intelligence would do a better job of looking after the planet than humans are doing, and that it would favor individuals who had facilitated its path to power. Although he cautions against taking the analogy too far, Levandowski sees a hint of how a superhuman intelligence might treat humanity in our current relationships with animals. “Do you want to be a pet or livestock?” he asks. “We give pets medical attention, food, grooming, and entertainment. But an animal that’s biting you, attacking you, barking and being annoying? I don’t want to go there.”
Enter Way of the Future. The church’s role is to smooth the inevitable ascension of our machine deity, both technologically and culturally. In its bylaws, WOTF states that it will undertake programs of research, including the study of how machines perceive their environment and exhibit cognitive functions such as learning and problem solving.
Levandowski does not expect the church itself to solve all the problems of machine intelligence—often called “strong AI”—so much as facilitate funding of the right research. “If you had a child you knew was going to be gifted, how would you want to raise it?” he asks. “We’re in the process of raising a god. So let’s make sure we think through the right way to do that. It’s a tremendous opportunity.”
laughingdog
Stranger

Seems to me the idea of AI is confused. A disembodied intelligence would have no emotions, and without emotions, where would motivation come from?
There is research showing that without emotion, decision making is next to impossible. https://www.psychologytoday.com/intl/blog/inside-the-consumer-mind/201302/how-emotions-influence-what-we-buy
Any pre-programmed, wired-in basis for decision making would, by definition, reduce the intelligence of AI.
And emotions, being partly products of the body and hormones, the nervous system and automatic reflexes, are always the response of an organism with needs attempting to survive, which implies some sort of what might be called 'a self'.
Again, a 'self' is 'something', which seems rather more limited than pure intelligence.
A computer can beat a chess master, but it can't choose to walk away from the game to meet a pretty broad in the audience who seems far more interesting than a silly game of moving wooden pieces on a board, or just stick its tongue out at the opponent purely to see what will happen. The conception of AI is always in very grown-up and serious terms, but those who do the most effective learning may be children, who are anything but serious. This is a matter for very serious consideration.
Edited by laughingdog (04/05/20 07:15 PM)
DividedQuantum
Outer Head


Well, I feel there are a lot of possibilities, most of which, as a practical matter, are probably unimaginable.
-------------------- Vi Veri Universum Vivus Vici