14| ChatGPT as a Glider — James Intriligator
Description
Large language models such as ChatGPT are poised to change the way we develop, research, and perhaps even think. But how do we best understand LLMs to get the most from our prompting? Thinking of LLMs as deep neural networks, while correct, is not very useful in practical terms. It doesn't help us interact with them, much as thinking of human behavior as nothing more than neurons firing won't make you many friends. However, thinking of LLMs as search engines is also faulty: they are notoriously unreliable for facts. Our guest this week is James Intriligator. James trained as a cognitive neuroscientist at Harvard but then gravitated towards design, and he is currently Professor of the Practice in Human Factors Engineering and Director of Strategic Innovation at Tufts University. James proposes viewing ChatGPT not as a search engine but as a "glider" that journeys through knowledge. By guiding it through diverse domains, you let it learn your interests and tailor better answers, while dimensional prompts activate specific areas of knowledge, such as medicine or economics. I like this playful way of thinking of LLMs. Maybe gliding (LLMs) is the new surfing (of the web).

Links:
* James' home page [https://engineering.tufts.edu/me/people/faculty/james-intriligator]
* Multiverses website [https://multiverses.xyz/]
More Episodes
Physics helps get stuff done. Its application has put rockets in space, semiconductors in phones, and eclipses on calendars.  For some philosophers, this is all physics offers. It is a mere instrument, albeit of great power, giving us control over tangible things. It is a set of gears and...
Published 06/04/24
It can be tempting to consider language and thought as inextricably linked. As such, we might conclude that LLMs' human-like capabilities for manipulating language indicate a corresponding level of thinking. However, neuroscience research suggests that thought and language can be teased apart,...
Published 05/15/24