What is Artificial Intelligence?
Artificial Intelligence is a branch of science, mostly but not exclusively computer science, concerned with making computers "think". As a very broad topic, AI also relates to physiology, philosophy, physics, mathematics and other scientific areas.
The term "Artificial Intelligence" itself was coined by John McCarthy in 1956, in the proposal for the "Dartmouth Summer Research Project on Artificial Intelligence."
It is difficult to define intelligence. The British scientist Alan Turing stated that a computer can be called intelligent if it could deceive a human into believing that it was human. His test consists of a person asking questions via keyboard to both a person and an intelligent machine. A scaled-down version of the Turing test, known as the Loebner Prize, requires machines to "converse" with testers only on a limited topic.
The field of Artificial Intelligence ("AI") can therefore mean many things to many people. The problem is that the word 'intelligence' is ill-defined. The phrase is so broad that people have found it useful to divide AI into two classes: strong AI and weak AI.
Strong AI makes the bold claim that computers can be made to think on a level (at least) equal to humans. Weak AI simply states that some "thinking-like" features can be added to computers to make them more useful tools... and this has already started to happen.
One definition says that Artificial Intelligence is the simulation of human intelligence processes by machines. The relatively new field of Artificial Life takes a different approach, attempting to study and understand biological life by synthesizing artificial life forms.
Another distinction within AI is between “Statistical” and “Classical AI”.
Statistical AI, arising from machine learning, tends to be more concerned with "inductive" thought: given a set of patterns, induce the trend. Classical AI, on the other hand, is more concerned with "deductive" thought: given a set of constraints, deduce a conclusion. Another difference is that C++ tends to be a favourite language for statistical AI, while Lisp dominates in classical AI.
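The two styles of thought can be sketched side by side. Below is a minimal illustration (in Python rather than C++ or Lisp, purely for brevity): the inductive half fits a trend to observed points with least squares, while the deductive half forward-chains over hand-written rules. The function names, the sample points and the bird/flies rule are made-up illustration values.

```python
# Inductive: given a set of observed points, induce the trend
# (here, the least-squares slope of a line through them).
def induce_slope(points):
    n = len(points)
    mean_x = sum(x for x, _ in points) / n
    mean_y = sum(y for _, y in points) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in points)
    den = sum((x - mean_x) ** 2 for x, _ in points)
    return num / den

# Deductive: given a set of rules and facts, deduce every conclusion
# that follows (naive forward chaining until nothing new is added).
def deduce(rules, facts):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and all(p in facts for p in premises):
                facts.add(conclusion)
                changed = True
    return facts

print(induce_slope([(1, 2), (2, 4), (3, 6)]))   # points on y = 2x, so 2.0
print(deduce([({"bird"}, "flies")], {"bird"}))  # {'bird', 'flies'}
```

The contrast is the point: the first function never sees a rule, and the second never sees a data point.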
A system can't be truly intelligent without displaying properties of both inductive and deductive thought.
expert systems: computer applications that make decisions in real-life situations that would otherwise be made by a human expert.
neural networks: systems that simulate intelligence by reproducing the types of physical connections found in animal or even human brains. Because of current technology limitations, the number of these connections is tiny compared with the billions found in a human brain, yet such networks are still capable of reproducing some very interesting behaviour in disciplines such as voice recognition, optical-character recognition and natural-language processing.
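The smallest useful unit of such a network is a single artificial neuron. As a hedged sketch (a perceptron with two inputs, nowhere near the layered networks used for the applications above), the following trains one neuron to reproduce logical OR; the learning rate and epoch count are arbitrary illustration values.

```python
# A single artificial neuron (perceptron) learning logical OR.
# The perceptron rule nudges each weight toward reducing the error.
def train_perceptron(samples, epochs=20, lr=0.1):
    w = [0.0, 0.0]  # one weight per "connection"
    b = 0.0         # bias
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

samples = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_perceptron(samples)

def predict(x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

print([predict(x1, x2) for (x1, x2), _ in samples])  # [0, 1, 1, 1]
```

A real network wires thousands or millions of these units together and trains them jointly; the principle per connection is the same.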
fuzzy logic: a type of logic that recognizes more than simple true and false values. It represents a departure from classical two-valued sets and logic: it uses "soft" linguistic system variables (e.g. large, small, hot, cold, warm) and a continuous range of truth values in the interval [0, 1], rather than strict binary (true or false) decisions and assignments.
natural language understanding: programming computers to understand and interact with users in natural languages such as English. Related to voice (speech) recognition, which converts spoken dialogue into computer-readable text but without understanding the real meaning of that text.
agents: computational entities that act on behalf of other (most often human) entities in an autonomous fashion, perform their actions with some level of proactivity and/or reactiveness, and exhibit some of the key attributes of learning, co-operation and mobility. Imagine having your own "smart" agent that could watch new articles on the Usenet and deliver only the most interesting ones (according to your preferences), instead of you having to browse through thousands of new messages each day.
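The Usenet scenario can be made concrete with a toy filtering agent: score each article against the user's interest keywords and deliver only those above a threshold. The keyword weights, threshold and article titles below are made-up illustration values, and a real agent would of course learn the weights rather than have them hard-coded.

```python
# A toy filtering agent: score articles by the user's interest
# keywords and keep only those scoring above a threshold.
def score(article, interests):
    words = article.lower().split()
    return sum(weight for term, weight in interests.items() if term in words)

def filter_articles(articles, interests, threshold=1.0):
    return [a for a in articles if score(a, interests) >= threshold]

interests = {"lisp": 1.0, "prolog": 0.8, "chess": 0.5}
articles = [
    "New Lisp compiler released",
    "Recipe of the week",
    "Prolog and chess endgames",
]
print(filter_articles(articles, interests))
# ['New Lisp compiler released', 'Prolog and chess endgames']
```

The "learning" attribute mentioned above would amount to adjusting the weights from the user's accept/reject feedback over time.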
robotics: programming computers to see, hear and react to sensory stimuli. Probably the most attractive field of AI for newcomers. It includes several very different approaches: see the BEAM robotics Web sites and MIT's Cog project for more information.
AI problems (speech recognition, NLP, vision, automatic programming, knowledge representation, etc.) can be paired with techniques (NN, search, Bayesian nets, production systems, etc.) to make distinctions such as search-based NLP vs. NN NLP vs. Statistical/Probabilistic NLP. Then you can combine techniques, such as using neural networks to guide search. And you can combine problems, such as posing that knowledge representation and language are equivalent. (Or you can combine AI with problems from other domains.)
AI Technology can be applied to provide solutions for a wide range of commercial and scientific needs. These needs or applications can be categorized as follows:
A "Simple" application is an integrated standard application. AI technology can also be used to provide solutions, applications and platforms for commercial needs such as e- and m-commerce, network integration and resource management.
"Complex" applications involve taking over management and analysis functions based on current technology and the client's usage of existing systems. On top of optimising the current workflow, such AI applications seek to analyse the entire domain environment, make projections of future events based on historical data, and include the future development of the system in their recommendations. Network management, including vital functions like storage management, security and clustering, will change from 'trying until it works' to implementing what is needed based on comprehensive computer analysis.
"Very Complex" applications require the collection of enormous amounts of complex data, which then need to be filed, analysed and managed to ensure maximum usability in the shortest possible time. This can only be accomplished when the system reaches the point of analysing the data itself. The system needs to learn how to handle data and where to collect it, so that it can propose decisions within very complex structures. It therefore needs to relate data and manage interdisciplinary content as well as cross-border processes. These applications enable organizations like universities, governments and industrial companies to truly take the world into consideration when making computer-assisted plans for the near and mid-term future.
A short overview of specific potential applications includes:
Customer Relationship Management (CRM);
Content Management Systems (CMS);
Network and software security systems;
Oil Field search and appraisal;
Intelligent Traffic Management in Telecommunication and Energy Networks;
Industrial automation of complex production processes (such as Steel, Chemicals, Refining);
Global climate analysis;
Distributed Knowledge Networks;
Bioinformatics and Computational Biology;
Coordination and Control of Multi-Agent Systems;
Automata Induction, Grammar Inference, and Language Acquisition;
Computational modelling of Proteins;
Environment for Integrating and Analyzing Plant Genomic Databases;
Military simulations (e.g. "Star Wars");
Human population analysis, including resource distribution and consumption.
AI Programming Languages
AI programs have been written in just about every language ever created. The most common seem to be Lisp, Prolog, C/C++, and recently Java.
LISP- For many years, AI was done as research in universities and laboratories, so fast prototyping was favored over fast execution. This is one reason why AI has favored high-level languages such as Lisp. This tradition means that current AI Lisp programmers can draw on many resources from the community. Features of the language that are good for AI programming include: garbage collection, dynamic typing, functions as data, uniform syntax, an interactive environment, and extensibility.
PROLOG- It wasn't until the 1970s that people began to realize that a set of logical statements plus a general theorem prover could make up a program. Prolog combines the high-level and traditional advantages of Lisp with a built-in unifier, which is particularly useful in AI. Prolog seems to be good for problems in which logic is intimately involved, or whose solutions have a succinct logical characterization. Its major drawback (IMHO) is that it's hard to learn.
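To show what that built-in unifier actually does, here is a hedged sketch in Python (Prolog itself needs no such code, this is only an illustration): find a variable binding that makes two terms identical. By convention here, variables are uppercase strings and compound terms are tuples, so ("f", "X", "b") stands for f(X, b); the simplification of omitting Prolog's occurs check is deliberate.

```python
# A miniature unifier, illustrating what Prolog builds in.
# Variables are uppercase strings; ("f", "X", "b") stands for f(X, b).
def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def walk(t, subst):
    # Follow existing bindings until we hit a non-variable or unbound variable.
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def unify(a, b, subst=None):
    subst = dict(subst or {})
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if is_var(a):
        subst[a] = b
        return subst
    if is_var(b):
        subst[b] = a
        return subst
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None  # clash: the terms cannot be made equal

print(unify(("f", "X", "b"), ("f", "a", "Y")))  # {'X': 'a', 'Y': 'b'}
print(unify(("f", "a"), ("g", "a")))            # None
```

In Prolog the equivalent is simply `f(X, b) = f(a, Y).`, which succeeds with X = a, Y = b; the point of the sketch is how much machinery that one `=` hides.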
C/C++- C/C++ is mostly used when the program is simple and execution speed is the most important factor. Statistical AI techniques such as neural networks are common examples of this. Backpropagation is only a couple of pages of C/C++ code, and needs every ounce of speed that the programmer can muster.
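To show how small backpropagation really is, here is a sketch of it for a tiny 2-2-1 network on XOR. It is written in Python rather than C/C++ purely for brevity, so it trades away exactly the speed the paragraph above is about; the network shape, learning rate and epoch count are arbitrary illustration choices.

```python
import math
import random

# Backpropagation for a 2-2-1 sigmoid network learning XOR.
random.seed(0)

def sig(x):
    return 1.0 / (1.0 + math.exp(-x))

# weights: input->hidden (2x2 plus 2 biases), hidden->output (2 plus 1 bias)
w_h = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
b_h = [0.0, 0.0]
w_o = [random.uniform(-1, 1) for _ in range(2)]
b_o = 0.0

data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def forward(x):
    h = [sig(w_h[j][0] * x[0] + w_h[j][1] * x[1] + b_h[j]) for j in range(2)]
    o = sig(w_o[0] * h[0] + w_o[1] * h[1] + b_o)
    return h, o

def epoch(lr=0.5):
    global b_o
    total = 0.0
    for x, t in data:
        h, o = forward(x)
        total += (t - o) ** 2
        d_o = (o - t) * o * (1 - o)                  # output-layer delta
        for j in range(2):
            d_h = d_o * w_o[j] * h[j] * (1 - h[j])   # hidden-layer delta
            w_o[j] -= lr * d_o * h[j]
            w_h[j][0] -= lr * d_h * x[0]
            w_h[j][1] -= lr * d_h * x[1]
            b_h[j] -= lr * d_h
        b_o -= lr * d_o
    return total

first = epoch()
for _ in range(2000):
    last = epoch()
print(first, "->", last)  # squared error drops as training proceeds
```

The C/C++ version is the same couple of dozen lines of arithmetic in a tight loop, which is why it can be made so fast.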
Java- The newcomer, Java uses several ideas from Lisp, most notably garbage collection. Its portability makes it desirable for just about any application, and it has a decent set of built-in types. Java is still not as high-level as Lisp or Prolog, and not as fast as C, making it best when portability is paramount.