Friend delays shipments of its companion AI pendant


Friend, a startup creating a $99, AI-powered necklace designed to be treated like a digital companion, has delayed its first batch of shipments until Q3.

Friend had planned to ship the devices to pre-order customers in Q1. But according to co-founder and CEO Avi Schiffmann, that’s no longer feasible.

“As much as I would like it to be shipped in Q1 of this year, I still have refinements to make, and unfortunately you can only start manufacturing electronics when you are 95% done with your design,” Schiffmann said in an email to customers. “We estimate that at the end of February, when our prototype is complete, we will start our final sprint.”

Friend, which has an eight-person engineering staff and $8.5 million in capital from investors including Perplexity CEO Aravind Srinivas, raised eyebrows when it spent $1.8 million on the domain name Friend.com. This fall, as part of what Schiffmann called an “experiment,” Friend debuted a web platform on Friend.com that lets people talk to random AI characters.

The reception was mixed. TechRadar’s Eric Schwartz noted that Friend’s chatbots often inexplicably fill conversations with anecdotes of trauma, including being robbed and shot. Indeed, when this reporter visited Friend.com on Monday afternoon, a chatbot named Donald shared that the “ghosts of (his) past” were “wrong.”

My experience with Friend’s chatbots. Image credits: Friend

In the same email, Schiffmann also said that Friend will be winding down its chatbot experience.

“We’re thrilled that millions get to play with what we believe is the most realistic chatbot out there,” Schiffmann wrote. “This really showed our internal ability to manage traffic, and it really taught us a lot about digital companionship … (But) I wish we were focused only on the hardware, and I realized that digital chatbots and embodied companions don’t mix well.”

AI companions have become a hot topic. Character.AI, a chatbot platform backed by Google, has been accused in two separate lawsuits of inflicting psychological harm on children. Some experts have expressed concerns that AI companions could worsen isolation by replacing human relationships with artificial ones, and generate harmful content that can aggravate mental health conditions.
