
""If you feel like you should punch someone in the face for wearing it, you probably shouldn't invest in it.""
""And to me, that breaks a lot of these social constructs that we have with humans around privacy.""
""As an investor, you kind of have to not only say, okay, cool tech, sure, but emotionally, how does it make me feel? And how does it make others feel around me?""
""And for me, a lot of that is lost in all the AI stuff, where it's just always on, always listening, trying to be the smartest person in the room. And it's just not healthy.""
A visceral investment rule: if an AI wearable provokes a strong negative social reaction, don't fund it. Successful wearables pair technical capability with emotional resonance and social acceptability, whereas always-on, always-listening devices risk breaking social constructs around privacy and creating unhealthy interpersonal dynamics. Personal testing bore this out: one pendant the investor tried ended up being used to replay recordings to 'win' arguments, and was ultimately abandoned. Many venture bets chase form-factor novelty, but social acceptability and privacy implications make many AI hardware plays poor candidates for funding.
Read at TechCrunch