Programming Languages for Artificial Intelligence

Artificial intelligence (AI) is a branch of computer science that seeks to create computers and software capable of humanlike thinking. The key to the capabilities of these systems is the programming language used to build them. Two major early programming languages for AI were LISP and PROLOG. LISP was more widely used than PROLOG, but both became respected and recognized as invaluable parts of the discipline. Later AI languages include Python, R, and Java, among others. In 2022 the AI research group OpenAI released ChatGPT, a chatbot built on the group's proprietary Generative Pre-trained Transformer (GPT) family of language models.

History and Functionality of LISP

Computers are programmed by humans to "think." The ability of a computer to perform its function is only as strong as the language used to instruct it. Two different programming languages emerged for use in artificial intelligence, and computer scientists noticed early promise in both.

A language called LISP is the older of the two forerunners. LISP—which stands for List Processor—was developed in 1958 by the American computer scientist John McCarthy, who also coined the term artificial intelligence. The foundation for LISP was an older programming language known as Information Processing Language (IPL), developed by Allen Newell, Herbert Simon, and Cliff Shaw in 1956. IPL was useful in the early days of computing, but it was highly complicated, and a simpler language was needed that programmers could understand more readily. McCarthy built on IPL by adding basic principles from lambda calculus, a branch of mathematical logic that focuses on variables and substitution. One advantage of LISP is that it supports a memory-management mechanism known as garbage collection. In AI programs, "garbage" refers to units of memory created while a program runs to hold information temporarily; they are useful for a time but eventually outlive that usefulness. LISP can identify this spent memory and discard it. Unlike IPL, LISP was able to assign storage space for information effectively and conservatively.
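The idea behind garbage collection can be illustrated in any garbage-collected language; the sketch below uses Python, whose runtime reclaims memory in much the same spirit as LISP's collector. The names `Scratch` and `compute` are hypothetical, invented for this illustration only.

```python
import weakref

class Scratch:
    """Stands in for a temporary data structure built mid-computation."""
    pass

def compute():
    temp = Scratch()           # temporary working memory ("garbage"-to-be)
    probe = weakref.ref(temp)  # lets us observe whether temp still exists
    result = 42                # the value the program actually keeps
    return result, probe

result, probe = compute()
# After compute() returns, nothing references the Scratch object, so the
# runtime may reclaim it; the weak reference then reads None.
print(probe() is None)
```

In CPython the temporary object is reclaimed as soon as the last reference to it disappears, which is exactly the behavior the article attributes to LISP: memory that has outlived its usefulness is identified and discarded automatically.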

LISP became the most commonly used programming language for AI in the United States. The most crucial part of LISP's functionality is an element called the interpreter. The interpreter takes statements presented to it by users and determines the value of each expression through a process known as evaluation. Evaluation depends on three factors: the identity of the symbols used in a statement, or what the words or figures presented mean at their most basic level; the associations (e.g., other terms or lists of terms) attached to each symbol; and the possible functions that might apply to the statement. Lists are central to LISP, yet they are not necessarily the same items we would ordinarily call lists; they are closer to collections of possible functions or equations centering on a term or symbol. Significantly, the LISP programmer gives the computer commands by continually defining new functions and new equations. This process could potentially result in a growth of ability in the machine, much as it might in a human engaged in intellectual exploration.
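The evaluation process described above can be sketched with a toy interpreter. This is a minimal illustration, not LISP itself: expressions are written as nested Python lists, symbols as strings, and the `env` dictionary plays the role of the symbol associations the article describes.

```python
def evaluate(expr, env):
    """Evaluate a LISP-style expression: a symbol, a number, or a list
    whose first element names a function applied to the rest."""
    if isinstance(expr, str):       # a symbol: look up its association
        return env[expr]
    if not isinstance(expr, list):  # a number evaluates to itself
        return expr
    fn = env[expr[0]]               # the head of the list names a function
    args = [evaluate(a, env) for a in expr[1:]]
    return fn(*args)

# The environment associates symbols with values and functions.
env = {
    "+": lambda *xs: sum(xs),
    "*": lambda x, y: x * y,
    "x": 3,
}

# The LISP expression (+ x (* 2 5)) written as nested lists:
print(evaluate(["+", "x", ["*", 2, 5]], env))  # 13
```

Note how the three factors the article lists appear directly in the code: the identity of each symbol, the association stored for it in the environment, and the function applied when a symbol heads a list.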

History and Functionality of PROLOG

The second major early programming language for artificial intelligence was called PROLOG. The name comes from the phrase "programming in logic." PROLOG was first developed in 1972 by the French computer scientist Alain Colmerauer at the University of Aix-Marseille. Robert Kowalski, an American computer scientist at the University of Edinburgh in Scotland, developed it further. The chief strength of PROLOG is its ability to implement logical processes: if two statements are entered, the program can determine whether there is a logical connection between them. One reason PROLOG can do this is that programmers give the computer a set of facts to work with before they begin entering statements. These facts might be descriptive statements about people or objects, such as "the car is green," or about relationships between people or objects, such as "Sam is the father of Carla." The facts can also be rules concerning the people or objects, such as "Sam is the grandfather of Emma is true if Sam is the father of Carla is true and Carla is the mother of Emma is true." This gives the program a basic framework for answering questions. People who use PROLOG do not necessarily have to know its code. One program built on the language, ProtoThinker (PT-Thinker), translates ordinary statements into language a computer driven by PROLOG understands.
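PROLOG expresses such facts and rules directly in its own syntax, but the inference pattern can be sketched in Python. The names below (sam, carla, emma) come from the article's examples; the `grandfather` function is a hypothetical stand-in for the PROLOG rule it mimics in its comment.

```python
# Facts, as a PROLOG programmer might assert them:
#   father(sam, carla).
#   mother(carla, emma).
father = {("sam", "carla")}
mother = {("carla", "emma")}

def grandfather(g, c):
    # Rule: grandfather(G, C) :- father(G, P), (mother(P, C) ; father(P, C)).
    # The program searches its stored facts for a person P linking G to C.
    return any(
        (g, p) in father and ((p, c) in mother or (p, c) in father)
        for (_, p) in father
    )

print(grandfather("sam", "emma"))    # the query succeeds: True
print(grandfather("carla", "emma"))  # no matching chain of facts: False
```

As in PROLOG, the programmer states only facts and a rule; answering a query is a matter of searching the facts for a combination that satisfies the rule.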

Another characteristic that distinguishes PROLOG from other programming languages is that it is declarative. To use the language, the programmer must first decide what goal they want to achieve. The goal is then expressed in PROLOG, and the software determines how best to meet it. This removes a considerable burden from the programmer. Rather than telling the computer what to do, the PROLOG user lets the computer determine what to do: the computer sorts through the information it has previously been given to find a match for the user's request.

The structures native to PROLOG have aspects in common with many searchable databases. PROLOG recognizes lowercase words and basic numbers. Among other applications, PROLOG is used in programming more intelligent machines and certain trainable robots.

Subsequent Languages

Additional programming languages emerged as artificial intelligence technology continued to develop through the late twentieth and early twenty-first centuries. Popular programming languages in the field included Python and R, among many others. These languages offered improvements over their precursors and thus facilitated new developments in AI research. Then, in 2022, OpenAI released ChatGPT, an AI chatbot built on OpenAI's proprietary Generative Pre-trained Transformer (GPT) family of language models, which draw on large volumes of textual data to inform the chatbot's responses. ChatGPT proved wildly popular following its release, reaching more than 100 million users in just two months.

In the months after ChatGPT's release, it remained one of the most popular chatbots worldwide but also faced competition from other chatbots such as Anthropic's Claude and Bard, which was developed by tech giant Google. Since OpenAI's GPT language model family was proprietary, these other chatbots drew on different language models; Bard, for example, used the LaMDA large language model (LLM), which was developed by Google.

Bibliography

Collins, Eli. “LaMDA: Our Breakthrough Conversation Technology.” Google Blog, 18 May 2021, blog.google/technology/ai/lamda/. Accessed 25 Aug. 2023.

Copeland, Jack. "What Is Artificial Intelligence?" AlanTuring.net, 2000, www.alanturing.net/turing_archive/pages/reference%20articles/what_is_AI/What%20is%20AI05.html. Accessed 5 Oct. 2015.

Milmo, Dan. “ChatGPT Reaches 100 Million Users Two Months after Launch.” The Guardian, 2 Feb. 2023, www.theguardian.com/technology/2023/feb/02/chatgpt-100-million-users-open-ai-fastest-growing-app. Accessed 23 Mar. 2023.

Philips, Winfred. "Introduction to Logic for ProtoThinker." The Mind Project, Consortium on Cognitive Science Instruction, 2006, www.mind.ilstu.edu/curriculum/protothinker/logic_intro.php. Accessed 6 Oct. 2015.

"Introduction to PROLOG." The Mind Project, Consortium on Cognitive Science Instruction, 2006, www.mind.ilstu.edu/curriculum/protothinker/prolog_intro.php. Accessed 5 Oct. 2015.

Wilson, Bill. "Introduction to PROLOG Programming." U of New South Wales, 26 Feb. 2012, www.cse.unsw.edu.au/~billw/cs9414/notes/prolog/intro.html. Accessed 5 Oct. 2015.