""I think the secularists in Silicon Valley are filling the God-shaped hole in their heart with AGI," Palantir Chief Technology Officer Shyam Sankar said in an interview with the New York Times's Ross Douthat. "It's like, OK, the models get better. Why do you think that this cliff is going to happen where they somehow turn us into house cats?""
""I just don't think it would work the way that people think it would work in the Terminator nightmare scenario," Sankar said, saying that integrating AI into military technology brings "more of a difference of degree than kind." He also said that doomerism is a "fundraising shtick" where companies on the front lines of AI development "can say 'my technology is so powerful, it's going to lead to mass unemployment, so you better invest in me or you're going to be poor.'""
In Sankar's view, AI catastrophe scenarios are unlikely. He argues that secular technologists invest transcendent hopes in AGI, fueling apocalyptic expectations, that religious people tend to be more skeptical of such doomer narratives, and that transhumanist beliefs can amplify catastrophic thinking. Integrating AI into military systems, he says, will change decision-making by degree rather than trigger a sudden, existential transformation. He also contends that some companies use apocalyptic framing to attract investment by promising protection from mass unemployment, and that model developers often lack frontline experience with real-world usage, which undermines simplistic predictions of abrupt and total societal displacement.
Read at Business Insider