• Show: [[Made You Think]]

  • Speaker(s): [[Nat Eliason]] [[Neil Soni]]

  • Topic: [[Homo Deus - Yuval Harari]]

  • “In the early twenty-first century the train of progress is again pulling out of the station – and this will probably be the last train ever to leave the station called Homo sapiens. Those who miss this train will never get a second chance. The main products of the twenty-first century will be bodies, brains and minds, and the gap between those who know how to engineer bodies and brains and those who do not will be far bigger than the gap between Sapiens and Neanderthals. In the twenty-first century, those who ride the train of progress will acquire divine abilities of creation and destruction, while those left behind will face extinction.”

  • The whole book is just about different religions.

  • There are some ways you can predict the future - human behavior and psychology - but you can't predict technology and information.

    • People in the ‘50s predicted flying cars and moon bases, but nobody predicted the Internet.
    • Narratives and stories affect how we view technology.
  • A prediction can be a warning - once something is being talked about, it shapes the future of that thing, whether good or bad.

    • Communism example: it didn't effectively take place even though Marx predicted it. Governments that heard of his work took precautionary measures, and therefore it never happened as predicted.
  • It is much more acceptable to be critical of [[social networks]] today than it was two years ago. In the beginning, everyone considered only the convenience.

    • Privacy wasn't much of an issue, and data collection wasn't viewed as bad.
    • It used to be about convenience.
      • Targeted ads were originally framed as being for our own good - to help us connect with better products and add convenience. But now privacy is held sacred and data collection is looked down upon.
  • Up until now the human agenda was: don't die, procreate, protect your tribe. The new agenda asks: how do we become gods? We want more power, more money, and to live forever.

    • Distribution is the problem
      • We've beaten biology for the most part, so it's no longer our main concern - we ourselves are.
  • Humans are concerned with three things particularly: famine, plague and war.

    • "For the first time in history, more people die today from eating too much than from eating too little; more people die from old age than from infectious diseases; and more people commit suicide than are killed by soldiers, terrorists and criminals combined."
    • "There are no more natural plagues": this was pre-COVID. #covid-19
      • Everything that he says in the intro about what we've done is true… until it isn't.
      • Reference to [[Black Swan - [[Nassim Taleb]]]], based on perspective.
    • War is the only thing still around - political agendas, and more.
  • We're just fighting death - even though we may call it famine, plague or war. The real enemy is death and the way we'll fight against it is by extending life span.

    • "In truth, so far modern medicine hasn’t extended our natural life span by a single year. Its great achievement has been to save us from premature death, and allow us to enjoy the full measure of our years. Even if we now overcome cancer, diabetes and the other major killers, it would mean only that almost everyone will get to live to ninety – but it will not be enough to reach 150, let alone 500. For that, medicine will need to re-engineer the most fundamental structures and processes of the human body, and discover how to regenerate organs and tissues. It is by no means clear that we can do that by 2100."
    • We would be able to lengthen our lifespans if we could re-engineer the limits on cell division. [[CRISPR]], [[genetic engineering]], etc.
      • For people over 40, there's a high probability they'll die from just a few causes - heart disease, cancer, Alzheimer's - or suffer cognitive decline.
        • The challenge with cancer, a disease of aging, is that its origin is not singular - there are many ways one can get cancer. You can't just "solve" cancer.
      • Incentives to improve, competition against technology, and appreciation of life under eternal conditions.
  • What happens when we become immortal or live much longer than we do today?

    • A large part of our artistic creativity, our political commitment and our religious piety is fuelled by the fear of death.
    • The evolution of ideas will slow down if you live longer.
    • The biggest competition to us is technology, and it only accelerates. But it seems like we're moving in opposite directions, because the acceleration of technology is what makes us live longer. But if you live longer, you're somehow less fit to compete with technology since your ideas evolve more slowly and are based in older times.
  • [[The Denial of Death]]

    • Death is a big motivator.
    • Again, "A large part of our artistic creativity, our political commitment and our religious piety is fuelled by the fear of death."
    • We're not going to become immortal, we're going to become a-mortal.
    • You don't consciously want death but at the same time, you do.
      • [[The Good Place]] - things I've thought of before, but it seems like not everyone understands the consequences of immortality
      • The increasing gap between the rich and the poor, and how only the elite get access to the better technology, which enables them to do worse things.
  • Historically, we manipulated the environment to fit us. In the future, it seems we will try to manipulate ourselves to essentially transcend the environment.

    • "For example, everybody still agreed on one thing: in order to improve education, we need to change the schools. Today, for the first time in history, at least some people think it would be more efficient to change the pupils’ biochemistry"
  • The state hopes to regulate the biochemical pursuit of happiness, separating ‘bad’ manipulations from ‘good’ ones.

    • Biochemical manipulations that strengthen political stability, social order and economic growth are allowed and even encouraged (e.g., those that calm hyperactive kids in school, or drive anxious soldiers forward into battle), while manipulations that threaten stability and growth are banned.
    • This leads to manifestations of superhuman power through medicine.
      • Medicine always begins with helping people below the norm, but the same tools can be used to surpass the norm. (Example: people who don't need Adderall take Adderall.)
      • And therefore, if you are not cheating, you are at a disadvantage.
  • "The upgrading of humans into gods may follow any of three paths: biological engineering, cyborg engineering and the engineering of non-organic beings."

  • Nat and the rest of the group talk about their opinions on legalizing steroids in sports.

  • Post-Introduction

  • How Homo Sapiens conquered the world, most of this is a repetition or summary of [[Sapiens - Yuval Harari]].

    • Shared mythologies and a culture of [[religion]]s allowed us to conquer the world, and live in a world dominated by Sapiens.
    • Agricultural revolution brought about the rise of theist religions, such as polytheistic and monotheistic religions.
  • [[Modern Manifestation of Religion]] - Humanism

    • Scientific revolution: humanist religions, humans replace Gods.
      • The founding idea of humanist religions such as liberalism, communism and Nazism is that Homo sapiens has some unique and sacred essence that is the source of all meaning and authority in the universe. Everything that happens in the cosmos is judged to be good or bad according to its impact on Homo sapiens.
  • Critiques of Humanism.

    • Reliance on the idea that there's something special about humans.
    • Mind or [[consciousness]] for modern [[religion]] is the equivalent of the soul for ancient ones.
      • What we call consciousness is just observing what the body is doing. (Who is observing? Strange loops)
      • Brain vs Mind - we don't have a good grasp on the mind or what it is, or whether it is even there at all.
      • The concept of mind doesn't square with anything scientific.
    • "the better we map these [neural, cognitive] processes, the harder it becomes to explain conscious feelings. The better we understand the brain, the more redundant the mind seems"
  • People seem to have [[free will]], whether it exists or not.

    • Do we perform conscious choices under free will or are we subject to environment and past experience?
      • How do we structure our own environment or experiences to structure our own choices?
      • One mental model is to realize that people are not rational, free-willed, calculating beings, but are instead input-output combinations.
      • By that model, nobody does anything right or wrong, either.
    • "But the million-dollar question is not whether parrots and humans can act upon their inner desires – the question is whether they can choose their desires in the first place."
    • Free will as an evolutionary result to improve survivability.
    • People erroneously jump to the conclusion that if I want to press something, I choose to want to. This is of course false. I don’t choose my desires. I only feel them, and act accordingly
    • First-mover concept - even when asking why of a certain situation, you can never reach a first mover. You can explain it by past experiences or biology, etc., but you can never reach a first mover (i.e., "I chose to do this"), and by this argument we can't have free will.
    • Regardless of whether or not we have free will, we are still responsible for our lives.
      • Dichotomy - it's easy to fall into a trap of pure nihilism, everything is "environment" and therefore nothing matters. Who is responsible?
    • Punishment should still be used to protect society from harmful non-free-will behavior, whether it stems from environment or not.
      • Even though you may not be responsible, you are still dangerous - containment still serves a societal benefit.
  • [[The Self]]

    • Are you an individual self?
    • Concept of [[Intersubjective Entities]].
      • We tend to think of something as subjective or objective: interpretation vs. hard reality. But there is a third layer: intersubjective entities.
      • "Intersubjective entities depend on communication among many humans rather than on the beliefs and feelings of individual humans. Many of the most important agents in history are intersubjective. Money, for example, has no objective value. You cannot eat, drink or wear a dollar bill. Yet as long as billions of people believe in its value, you can use it to buy food, beverages and clothing."
      • "Yet in truth the lives of most people have meaning only within the network of stories they tell one another."
        • Life meaning exists only within the network of stories we tell one another.
      • Why does a particular action – such as getting married in church, fasting on Ramadan or voting on election day – seem meaningful to me? Because my parents also think it is meaningful, as do my brothers, my neighbours, people in nearby cities and even the residents of far-off countries. And why do all these people think it is meaningful? Because their friends and neighbours also share the same view. People constantly reinforce each other’s beliefs in a [[self-perpetuating loop]].
      • As we develop more as a species, and move away from the natural world into the internet, for example, reality becomes much more focused on the inter-subjective.
      • Design for a website, for example, is not an objective reality. The quality of "design" or "writing" is intersubjective; it has no inherent meaning.
    • When measuring the width of my desk, the yardstick I am using matters little. The width of my desk remains the same whether I say it is 200 centimetres or 78.74 inches. However, when bureaucracies measure people, the yardsticks they choose make all the difference. When schools began assessing people according to precise numerical marks, the lives of millions of students and teachers changed dramatically.
      • Human cooperative networks usually judge themselves by yardsticks of their own invention and, not surprisingly, they often give themselves high marks.
  • [[Religion and science]] are not in as much conflict as we often think.

    • There are certain things that science cannot answer, such as ethical or moral gaps.
    • [[religion]] has nothing to say about scientific facts, and science should keep its mouth shut concerning religious convictions. If the Pope believes that human life is sacred, and abortion is therefore a sin, biologists can neither prove nor refute this claim.
  • #religion is interested above all in order. It aims to create and maintain the social structure. Science is interested above all in power. Through research, it aims to acquire the power to cure diseases, fight wars and produce food. As individuals, scientists and priests may give immense importance to the truth; but as collective institutions, science and religion prefer order and power over truth.

    • It would accordingly be far more accurate to view modern history as the process of formulating a deal between science and one particular religion – namely, [[humanism]]
    • "The entire contract can be summarised in a single phrase: humans agree to give up meaning in exchange for power."
      • Inter-subjective focus on power is so strong now that people react negatively to a preference to meaning.
      • For example, women criticizing women for wanting to stay at home and take care of the family, though it might be an infinitely more meaningful experience. But you are looked down upon because you're not giving preference to "objective" money, power and corporate satisfaction.
      • Even not wanting to go to college, for example, is looked down upon.
      • The more you've bought into it and the more you've committed to that reality, the more you are committed to making sure other people abide by it. This is the dangerous territory.
  • Issues that come with the top-down approach - we create objective markers, and whatever metric you use is what people will optimize for; that will change their behavior and affect their activities. People will game the metrics at the expense of other things.

    • The yard-stick matters a lot for subjective things, but not in the scientific world (the measure of a desk, for example).
    • Being able to "measure [[happiness]]"
  • Three main off-shoots of humanism.

    • Liberalism, orthodox humanism. Focus on liberty.
    • Socialist humanism (Communism, socialism, etc)
    • Evolutionary humanism, best for the advancement of the species. (Nazism)
      • Is Nazism objectively wrong?
      • We have the morals that we grew up with. We want to say those morals are objective; however, if we had grown up 60 years after Germany won the world war, would we accept things the way they are?
      • Again, there's nothing objectively right - it depends on the yard-sticks we use to measure them.
  • The trend towards Liberalism is a natural consequence of technological evolution.

    • What happens when we begin this phase of transcendence? Many social developments seem to be products of current economic forces.
    • Slavery ended and women gained the right to vote because it was better for the overall economy, not because of a genuine interest in their lives.
  • Problems with UBI ([[Universal Basic Income]])

    • UBI has to be funded by taxes, yet taxes are becoming easier to evade. Who will fund UBI?
    • If people don't have anything to do, won't they cause mayhem? People won't become artists or entrepreneurs just because they get a check.
    • The illusion of freedom - the government has no incentive to give people rights once everyone is simply given money.
      • Governments only give people rights because they need them to pay taxes.
  • Wars don't happen if there is no economic benefit.

  • City States

    • Belonging and loyalty attach to cities rather than the country. Cities are gaining popularity relative to states. People feel like they're being heard locally rather than in the aggregate.
    • Countries don't exist; they're just intersubjective realities. When a government loses power, it creates a conflict.
    • City-states would not exist as they did in the past, nor as equivalents of today's countries. Nation-states will possibly divide even more.
    • The rise of Digital Governments to organize people?
  • A slow government is good because you don't want incompetence spreading fast. The government feels inadequate compared to technology - which is a good thing.

    • Big companies are gaining nation-like power. Google is much faster at predicting an epidemic than the UK's health system.
    • The threat is corporations moving faster than government regulation. Governments will soon be reactive, more than anything, trying to keep pace with new technology and powerful corporations.
  • "Democracy is the worst form of government, except for all the others."

  • Individual vs Dividual: the Experiencing Self and the Narrating Self. ([[The Self]])

    • Plato’s chariot analogy - rationality, spirit, and desire. There's no real you. [[Philosophy and the Science of Human Nature]]
    • We're starting to put more faith in technology and human data that will do a better job of knowing us and the self.
  • Tech intelligence is/will be better than Human intelligence, but consciousness will be optional. Why would we need technology or AI that is conscious?

    • Consciousness seems to be an emergent property of processing speed. Consciousness seems like a side-effect of the complicated nature of the brain. Why would it have to be an emergent property? Do intelligence and consciousness go hand-in-hand?
    • Current improvements in AI are geared to improve sales, rather than improving the solutions to our needs.
  • More and more our decisions are made by machines and data.

    • AI is figuring out things that humans haven't figured out in years.
  • Apps that decide for us like Uber and Diet apps. Letting machines monitor health parameters and suggesting habits. If we rely too much on others to make decisions for us, we lose that "muscle".

    • In the short run, it's very valuable. But the dark side is the data side. If the data is being used to manipulate people, or do anything malicious, then it's a problem. The darker side is that the "attention helmet" makes people less patient to confusion, doubts or contradictions.
    • "The system may push us in that direction, because it usually rewards us for the decisions we make rather than for our doubts. Yet a life of resolute decisions and quick fixes may be poorer and shallower than one of doubts and contradictions."
  • [[The difference between an Oracle and an Agent]]

    • One which we consult, and one that makes the decisions for us and does the tasks for us, we entrust.
    • We may move towards situations in which our trusted app would interact on our behalf: scheduling a meeting, job application, dating, etc. "Autopilot"
    • "Once we can design and redesign our will, we could no longer see it as the ultimate source of all meaning and authority. For no matter what our will says, we can always make it say something else."
  • [[Dataism]], the new religion.

    • Data and algorithms will be the supreme force and we will trust them as the new Bible. "Dataism declares that the universe consists of data flows, and the value of any phenomenon or entity is determined by its contribution to data processing."
    • Is this a change we want in the world?
    • Liberalism is completely challenged by the life sciences. Three liberal assumptions: I'm an individual, I have a single essence, my self is completely free.
    • The life sciences say we are just sets of bits dominated by algorithms. These algorithms are not free but shaped by genes and environment, and therefore more advanced algorithms can know you better than you know yourself.
    • The tech sector seems to be unattached to the emotional consequences of the things they are arguing for.
    • What do we care about more, the objective truth data reality or the subjective experiences of individuals? How much of the objective truth can we even allow ourselves to see, if everything is [[Maya and the concept of perception]]?
    • How Dataism is really a religion, and the way people truly believe in it as if it were God.
      • So much of Sapiens is how we are capable of so many contradictions - one person could believe in God and Liberalism at the same time.
    • Are emotions the result of data processing in the background? Your brain is probably compiling data and processing it subconsciously and outputting as a feeling or emotion. Dataism is this process outwards and conscious, and therefore, might seem more trustworthy.
    • Memory vs. data points - sometimes memory can be superior to just valuing the data.
    • As we move into [[Dataism]], one by one we see our subjective decisions turn into objective, data-driven decisions - you'd probably start going with the data more and more.
      • It's more of a mental-framework swap: you're dropping one framework for a better one.
  • Dataism taken to its extreme form.

    • Algorithms would own everything like corporations do today.
    • Software eating the world. Everything is moving more and more towards algorithms. From the data viewpoint, we can see our whole species as a single processing system. The end goal of Dataism is to create the Internet-of-All-Things, a completely interconnected (purely intersubjective) system of consciousness. This may render Sapiens obsolete once we've created it.
    • Long term consequences of the types of optimizations we've done historically.
  • "These three processes raise three key questions, which I hope will stick in your mind long after you have finished this book:

    • Are organisms really just algorithms, and is life really just data processing?
    • What’s more valuable – intelligence or consciousness?
    • What will happen to society, politics and daily life when non-conscious but highly intelligent algorithms know us better than we know ourselves?”
