AI might be taking over the world, but it seems some systems can’t even tell the time. A new study found AI systems got clock-hand positions right less than 25% of the time.


Some of the world’s most advanced AI systems struggle to tell the time, a study shows.

It comes after a separate study suggested that artificial intelligence chatbots could feel anxious, and even benefit from therapy like humans.

But slightly more worrying, boffins have said AI will be smarter than humans by next year – and they’ll be coming after your jobs.

Now, boffins have found that some AI models are unable to reliably interpret clock-hand positions or answer questions about calendar dates.

They tested various clock designs, including some with Roman numerals, with and without second hands, and different coloured dials.


Their findings show that AI systems, at best, got clock-hand positions right less than a quarter of the time.

The team also asked AI models to answer calendar-based questions, such as working out past and future dates, and found that even the best-performing model got date calculations wrong a fifth of the time.

Rohit Saxena, of the University of Edinburgh’s School of Informatics, who led the study, said: “Our findings highlight a significant gap in the ability of AI to carry out what are quite basic skills for people.

“These shortfalls must be addressed if AI systems are to be successfully integrated into time-sensitive, real-world applications.”

This week, it was reported that AI chatbots could feel anxious, just like humans.

A Swiss study looking into whether AI could replace human therapists found that OpenAI’s ChatGPT showed signs of stress when confronted with violent or traumatic scenarios.

But its anxiety dropped when it was given prompts linked to mindfulness relaxation exercises.

As a result, researchers at Zurich University fear AI chatbots may not be up to the job of advising people on mental health issues.

The study, published in the journal Nature, said: “As the debate on whether large language models should assist or replace therapists continues, it is crucial that their responses align with the provided emotional content and established therapeutic principles.”




By staronline@reachplc.com (Steve Hughes)

Source link