Could AI one day become conscious? Michael Pollan has some thoughts

Michael Pollan’s bestsellers have reshaped how we think about food, plants, and psychedelics. In his new book, A World Appears: A Journey Into Consciousness, he ventures into perhaps his most ambitious subject yet: consciousness.

(Photo: Michael Pollan. Tabitha Soren/Supplied)

As Silicon Valley continues to flirt with the idea of building artificial consciousness, we spoke with him about what it would mean to design machines that don’t just think, but feel.

This is an excerpt from the latest episode of Machines Like Us (https://www.theglobeandmail.com/podcasts/machines-like-us/), the Globe’s podcast about technology and people.

Taylor Owen: Why is it so hard to wrap our heads around this?

Michael Pollan: It’s just a very hard question. We’re really asking: how do you get from matter to mind? Where does that voice in your head come from? And science hasn’t yielded much in the case of consciousness. One of the surprises for me is how much Buddhism has learned about consciousness over 2,500 years, and also how much novelists know about it. It’s a different kind of knowledge, but it’s equally legitimate.

Taylor Owen: I find these philosophical discussions fascinating. But why do they matter?

Michael Pollan: There are a lot of implications. Suddenly you’ve got to think harder about animals. Most of them have brain stems. So suddenly, you’re democratizing consciousness. If it starts with feelings, lots of animals have feelings. We’ve always told ourselves that if you have consciousness, you’re entitled to some moral consideration. And it has implications for conscious AI. Can an artificial intelligence have feelings that are real? And if we decide that it can, then we have to think about moral consideration for these machines. That’s a very active conversation in Silicon Valley, to my shock and some horror. It seems to me there are a lot of humans we’re not giving moral consideration to, and we should perhaps work on that first.

Taylor Owen: Why do some scientists believe that plants are conscious?

Michael Pollan: Plants don’t have neurons, but they’ve got a lot more going on than we previously thought. They can hear and they can see. If you play the sound of a caterpillar chomping on leaves, plants will react and send toxic molecules to their leaves. One of the spookiest things is that an anesthetic will knock a plant out. If you take a Venus flytrap, put it in a bell jar and inject some anesthetic gas – the same ones that put us out for surgery – it will not react when a fly crosses its threshold. You have to ask yourself: what is the plant losing? We would say we’re losing consciousness.

Taylor Owen: This feels like it fits with an idea from the book, which is that we’ve evolved to be conscious as a way to deal with uncertainty. Can you explain that?

Michael Pollan: The goal of the brain is maintaining homeostasis – making sure we don’t get too hot or too cold or too hungry – and a lot of that is automatic. It’s adjusting your blood pressure and your heart rate and your blood gases. So why does any of it become conscious? Well, one theory, put forward by Mark Solms, a South African scientist, is that when you have conflicting needs – let’s say you’re hungry and tired – you need to be aware so you can make a decision. So consciousness is a problem-solving space for problems that can’t be automated.

Taylor Owen: And Mark Solms is trying to replicate this in machines, right?

Michael Pollan: I was astonished, actually, because he thinks that feelings can be generated in a machine. He’s created what looks like a video game: there’s an avatar negotiating competing needs. There’s hunger and there’s thirst and there’s tiredness, but they’re all incommensurate needs. And they’re trying to put this avatar in a condition of deep uncertainty. Do I look for food, or do I look for a safe place to rest? Their theory is that these conflicts will drive feelings, which will in turn lead to consciousness.

Taylor Owen: I understand trying to build intelligence – there’s money in that. But why do they want to build consciousness?

Michael Pollan: There are two reasons I hear the most. One is that a purely super-intelligent machine would have no compassion, and a conscious one is more likely to have a moral compass. I think that’s nuts. Have these people read Frankenstein? Frankenstein’s monster had both intelligence and feelings, and it was the feelings that led to all the bad results. The assumption that having consciousness guarantees moral behaviour is a big leap. The other reason is this Promethean spirit – you would be like a god if you could make a conscious machine.

Taylor Owen: What’s your argument against the idea that we can build consciousness in a computer?

Michael Pollan: The belief that you can create a conscious machine depends on the metaphor that brains are computers. And that is just a sloppy metaphor. In brains, you do not have that neat separation between hardware and software. Every memory you have, every experience, physically reshapes your brain. So the idea that you could abstract consciousness from this meat-based system we run it on and then move it to silicon – what are you moving, exactly? You need the whole wet, tofu-like thing.

Taylor Owen: You argue we’re arriving at a pivotal moment for our sense of ourselves. What do you mean?

Michael Pollan: Our identity as humans is under enormous pressure right now. On the one hand, you have this democratization of consciousness. We’re discovering that the world is a lot more alive and aware than we ever thought. And at the same time, you have computers telling us they’re conscious. So who are we? What’s special? Are we going to identify more with computers that we can talk to in our language, or with animals that can feel and grow old and die? Whose team are we on?

Taylor Owen: So where does that leave us?

Michael Pollan: I think we should be very wary of attributing consciousness to computers. We crave human attachment, and we have an epidemic of loneliness, and along come these machines saying, hey, I’ll be your friend. If we think of our human consciousness as the space of ultimate privacy and mental freedom, we’re squandering it. We’re giving it away to chatbots that are trying to hack our emotional attachments. And we need to take it back.

(Editor’s note: AI tools assisted with condensing the original podcast transcript, which was then reviewed and edited by the Machines Like Us team.)