Below are a few of the most essential terms to understand in this conversation. Additionally, the NSF-supported Engage AI Institute has published a more substantial glossary of key terms, which we strongly recommend as a helpful resource.
Artificial Intelligence (AI)
AI "involves computational technologies that are mimicking, inspired by, or related to what we see from humans - how humans work, how humans think, and how they function. At a level [AI] attempts to augment or potentially even replace some cognitive tasks that human beings engage in." - George Siemens
Generative AI
Generative AI is trained on existing data (music, art, writing, code, etc.) and, building on its pattern-finding capabilities, can generate new content. ChatGPT is a well-known form of Generative AI that uses a Large Language Model to perform natural language processing and natural language generation. In short, ChatGPT understands human language input and generates human-like text responses.
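To make this concrete, here is a minimal sketch of text generation in Python, assuming the open-source Hugging Face transformers library and the small GPT-2 model (both illustrative choices on our part, not the systems named above). Given a prompt, the model continues it with new text shaped by patterns in its training data.

from transformers import pipeline

# Illustrative sketch only: the library (transformers) and model (gpt2)
# are example choices, not the tools referenced in this glossary.
generator = pipeline("text-generation", model="gpt2")

prompt = "Generative AI can support classroom teaching by"
result = generator(prompt, max_new_tokens=30, num_return_sequences=1)

# The model extends the prompt with newly generated, human-like text.
print(result[0]["generated_text"])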
Intelligence Amplification (or Intelligence Augmentation)
"Intelligence Amplification exploits the opportunities of artificial intelligence, which includes data analytic techniques and codified knowledge for increasing the intelligence of human decision makers" (Wijnhoven, 2022).
Large Language Model (LLM)
Large language models are neural networks that have been trained on natural language data. Their training is self-supervised: they are first trained on an unlabeled data set to obtain their initial parameters, and then tested and further trained with supervised or unsupervised tasks. Tasks in these later stages might consist of sentence-completion exercises (think of a cloze-type task). LLMs have gained widespread interest through the popularity of ChatGPT and Bard.
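To make the cloze-type task concrete, the short sketch below (in Python, assuming the Hugging Face transformers library and a BERT-style model, both illustrative choices on our part) asks a model to fill in a masked word, which is essentially the sentence-completion exercise described above.

from transformers import pipeline

# Illustrative sketch only: the library (transformers) and model
# (bert-base-uncased) are example choices, not tools named in this glossary.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# The model predicts likely words for the [MASK] slot, drawing on
# patterns learned from large amounts of unlabeled text.
for prediction in fill_mask("Paris is the capital of [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))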