ABCs of AI

A brief glossary of artificial intelligence terminology

Artificial intelligence, or AI: A term coined in 1955 by computer scientist John McCarthy, PhD, who later became a professor at Stanford University and defined it as “the science and engineering of making intelligent machines.” Today it is understood as a branch of computer science that focuses on creating machines, computer programs or software systems that can perform tasks typically requiring human intelligence.

Algorithm: A step-by-step set of instructions that a computer or a person follows to solve a specific problem or perform a task, such as recognizing patterns.
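
As a concrete illustration (not part of the original glossary), here is one simple algorithm written as a short Python sketch; the task, finding the largest number in a list, and the function name are invented for the example.

```python
def find_largest(numbers):
    """Illustrative algorithm: scan a list step by step and keep the largest value seen."""
    largest = numbers[0]          # Step 1: start with the first number
    for n in numbers[1:]:         # Step 2: look at each remaining number in turn
        if n > largest:           # Step 3: if it beats the current best, remember it
            largest = n
    return largest                # Step 4: report the result

print(find_largest([3, 41, 7, 12]))  # prints 41
```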

Chatbot: A software application or web interface that mimics human conversation through text or voice interactions.

ChatGPT: A chatbot developed by OpenAI, capable of generating humanlike text based on context and past conversations. It is powered by a large language model and is an example of generative AI.

Deep learning: A type of machine learning that uses artificial neural networks, which are inspired by the structure and function of the human brain.

Generative AI: AI models that learn the patterns and structure of their input training data (text, images or other media) and then generate new data with similar characteristics, sometimes performing tasks they were never explicitly trained to do.

Large language model: A type of AI model that’s trained on massive amounts of text data and can be easily adapted to perform a wide range of tasks. Some examples are the models that power chatbots like OpenAI’s ChatGPT and Google’s Bard.
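
Large language models are commonly trained to predict the next word (token) given the words before it. The toy Python sketch below imitates that idea using nothing but word-pair counts from a single made-up sentence; it is an assumption-heavy miniature, not how production models like those behind ChatGPT or Bard actually work.

```python
from collections import Counter, defaultdict

# Toy "language model": count which word follows which in a tiny corpus,
# then predict the most common continuation. Real large language models use
# neural networks trained on vastly more text, but the prediction task is similar.
corpus = "the model predicts the next word and the next word follows the pattern".split()

following = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    following[current_word][next_word] += 1

def predict_next(word):
    # Return the word most often seen after `word` in the training text.
    return following[word].most_common(1)[0][0]

print(predict_next("the"))   # most frequent continuation of "the" here is "next"
```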

Machine learning: A method that helps machines learn from data and get better at doing tasks without being explicitly programmed. It’s like teaching them to make decisions and predictions by themselves based on patterns they discover in information.
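
A minimal sketch of this idea, with made-up numbers and labels: instead of hand-writing rules, the code below labels a new point by copying the label of its closest training example (a simple nearest-neighbor approach).

```python
# Minimal machine-learning sketch: classify a new point by copying the label of
# its nearest training example (1-nearest-neighbor). No rules are hand-coded;
# the "knowledge" lives entirely in the example data.
training_examples = [
    ((1.0, 1.2), "cat"),
    ((0.9, 1.0), "cat"),
    ((5.0, 4.8), "dog"),
    ((5.2, 5.1), "dog"),
]

def classify(point):
    def distance(example):
        (x, y), _ = example
        return (x - point[0]) ** 2 + (y - point[1]) ** 2
    _, label = min(training_examples, key=distance)
    return label

print(classify((4.9, 5.0)))  # -> "dog", learned from the examples above
```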

Natural language processing: A branch of AI that uses machine learning to process and interpret text and data. It represents the ability of a program to understand human language as it is spoken and written.
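
As a small illustration of text processing (far short of real language understanding), the snippet below breaks a sentence into word tokens and counts them, the kind of basic step natural language processing systems build on; the sentence itself is made up.

```python
import re
from collections import Counter

# A very simple text-processing step of the kind NLP systems build on:
# split a sentence into lowercase word tokens and count how often each appears.
sentence = "AI systems read text, and text is what NLP systems process."
tokens = re.findall(r"[a-z']+", sentence.lower())
print(tokens[:5])          # ['ai', 'systems', 'read', 'text', 'and']
print(Counter(tokens))     # word frequencies, e.g. 'text' appears twice
```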

Neural network: A computer system inspired by the way our brains work. It’s made up of interconnected “artificial neurons” that help computers learn from data and recognize patterns.
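
The sketch below shows a single artificial neuron in Python: inputs are multiplied by weights, summed, and passed through an activation function. The specific weights and inputs are invented for illustration; in a real network they are learned from data, and many such neurons are connected in layers.

```python
import math

# One "artificial neuron": multiply each input by a weight, add them up,
# and squash the sum through an activation function. Networks stack many of these.
def neuron(inputs, weights, bias):
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-total))   # sigmoid activation, output between 0 and 1

# Example values are made up; in a trained network the weights are learned from data.
print(neuron(inputs=[0.5, 0.8], weights=[1.2, -0.7], bias=0.1))
```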

Training data: The initial dataset, containing the examples used to teach a machine learning application to recognize patterns or perform some function.
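
As a made-up example, training data for a simple spam filter could be a list of messages paired with the labels the model should learn to predict:

```python
# A made-up training dataset for a spam filter: each example pairs an input
# (the message text) with the answer the model should learn to reproduce (its label).
training_data = [
    ("Win a free prize now!!!", "spam"),
    ("Meeting moved to 3 pm",   "not spam"),
    ("Claim your reward today", "spam"),
    ("Lunch tomorrow?",         "not spam"),
]
# A machine learning application is shown these examples during training
# and then judged on messages it has never seen.
```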

Sources: Christopher Manning, Ahmad Rushdi and Nigam Shah