
LLMs and Language

David Cowles

Jan 9, 2024

“The written word itself is a form of AI…it’s been around for about 10,000 years.”

We devoted the Fall 2023 Issue of Aletheia Today Magazine to the implications of Artificial Intelligence for philosophy, theology, culture, and spirituality. We regard the rise of AI as one of the defining events of our era, but we need to view it in context, lest we create another Golden Calf. 


Recent work by Yiu, Kosoy, & Gopnik reported in Perspectives on Psychological Science does a good job of putting things in perspective: 


“In its current state of development, AI relies on so-called Large Language Models (LLMs) to decode reality.” So, how are LLMs like (or unlike) everyday Language? How does Artificial Intelligence compare with Native Intelligence?


“Artificial intelligence (AI) systems are often depicted as sentient agents poised to overshadow the human mind. But AI lacks the crucial human ability of innovation… While children and adults alike can solve problems by finding novel uses for everyday objects, AI systems often lack the ability to view tools in a new way…”


“AI language models like ChatGPT are passively trained on data sets containing billions of words (Large Language) and images produced by humans. This allows AI systems to function as a ‘cultural technology’ similar to writing that can summarize existing knowledge…but unlike humans, they struggle when it comes to innovating…” (italics mine)



“Instead of viewing these AI systems as intelligent agents like ourselves, we can think of them as a new form of library or search engine. They effectively summarize and communicate the existing culture and knowledge base (back) to us.” 


Hmm, ‘they effectively summarize and communicate the existing culture and knowledge base to us’…like all technology! What is that ‘vast wasteland’ (Newton Minow) known as television other than an idealized reproduction of everyday life? And with rare exceptions, the same may be said of contemporary theater, film, and popular music. Roland Barthes referred to media as “mythologies,” while Jacques Ellul went further and labeled media’s products “propaganda.”


But according to 20th-century philosopher Jacques Derrida, these concerns reflect the nature of all language per se, especially written language. Homer’s epics organized and reported back to the Achaeans the collective knowledge of his time, but that knowledge was plastic. Because the epics were oral literature, they were subject to modification over time and place and from bard to bard. Homer’s epics were cultural records, but they were living cultural records.


As with a $500 bottle of vintage wine, breaking the seal freezes the product in the moment and kick-starts its gradual but inexorable deterioration. We drink wine ‘at the hour of its death’. A written manuscript is like a corkscrew; we can drink no wine before or after its time. Then Gutenberg’s printing press came along and ushered in intellectual history’s version of the ‘screw-top era’. The written word itself is a form of AI…and it’s been around for about 10,000 years. 


Yiu, Kosoy, & Gopnik compared the performance of various AI systems against that of children (ages 3 to 7) and adults. 88% of children and 84% of adults were able to pick, out of a group of objects, the two that would “go best” together. 85% of children and 95% of adults were also able to repurpose everyday objects (innovate) to solve novel problems. 


Wait! So children (3 to 7) outperformed adults on the mix-and-match test? Yup! It’s like we’ve always said here at AT, “Children are smarter than you; deal with it.” 


Then, Yiu & Co. tested five popular AI systems on the same tasks. No model outperformed either the children or the adults in either test! On object pairing, the models ranged from a low of 59% to a high of 83%. But on the innovation test, AI scores ranged from a low of 8% to a high of 75%. So, it’s children: 1, adults: 1, AI: 0.


“Children can imagine completely novel uses for objects that they have not witnessed or heard of before…,” Yiu said. “Large models have a much harder time generating such responses.” 

In a related experiment, most children were able to figure out how a new machine worked just by experimenting and exploring. But when the researchers gave the same problem set to selected LLMs, AI failed miserably.  


AI relies on statistically predicting linguistic patterns; so do you! That’s why it’s so difficult to converse in a foreign language, even one that’s somewhat familiar: you don’t have enough experience with the language to predict its phonetic sequences. That’s also why people speaking foreign languages seem to be talking much faster than they really are. But sorry, AI: statistically predicting linguistic patterns is not enough to generate new discoveries about the world. 
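To make “statistically predicting linguistic patterns” concrete, here is a minimal, purely illustrative sketch: a toy bigram model that guesses the next word from counts of word pairs it has already seen. Real LLMs like ChatGPT use neural networks trained on billions of words, not a lookup table like this; the training text and function names below are invented for the example.

```python
# Toy illustration of "statistically predicting linguistic patterns":
# a bigram model that predicts the next word from word-pair counts.
# Real LLMs use neural networks over billions of words; this is only a sketch.
from collections import Counter, defaultdict

training_text = (
    "the written word is a form of technology "
    "the written word records culture "
    "the spoken word is living"
)

# Count how often each word follows each other word.
pair_counts = defaultdict(Counter)
words = training_text.split()
for current_word, next_word in zip(words, words[1:]):
    pair_counts[current_word][next_word] += 1

def predict_next(word):
    """Return the most frequently observed follower of `word`, if any."""
    followers = pair_counts.get(word)
    if not followers:
        return None
    return followers.most_common(1)[0][0]

print(predict_next("written"))  # -> "word"
print(predict_next("word"))     # -> "is" (seen twice, vs. "records" once)
```

A model like this can only ever echo the word pairings already present in its training text, which is precisely the limitation Yiu, Kosoy, & Gopnik identify: summarizing what is already there rather than innovating beyond it.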

“AI can help transmit information that is already known, but it is not an innovator,” Yiu said. “These models can summarize conventional wisdom, but they cannot expand, create, change, abandon, evaluate, and improve on conventional wisdom in the way a young human can.” 


“The development of AI is still in its early days, though, and much remains to be learned about how to expand the learning capacity of AI,” Yiu said, adding: “Taking inspiration from children’s curious, active, and intrinsically motivated approach to learning could help researchers design new AI systems that are better prepared to explore the real world.”


 

 

Keep the conversation going!

1. Click here to comment on this TWS.
2. To subscribe (at no cost) to TWS and ATM, follow this link.
3. We encourage new articles and reprints from freelance writers; click here to view our Writers’ Specs.

Do you like what you just read and want to read more Thoughts? Subscribe today for free!

- the official blog of Aletheia Today Magazine. 

Have a thought to share about today's 'Thought'?