“Artificial intelligence” as we know it today is a misnomer. AI is artificial, but it is not intelligent in any meaningful sense. It is one of the hottest topics in industry and is enjoying renewed interest in academia. This is nothing new: over the last 50 years the world has gone through a series of AI peaks and valleys. What makes the current flurry of AI success different is that modern computing hardware is finally powerful enough to implement some of the wild ideas that have been lingering for decades.
In the 1950s, in the early days of what we now call artificial intelligence, there was debate over what to name the field. Herbert Simon, co-developer of both the Logic Theorist and the General Problem Solver, argued that the field should have the anodyne name “complex information processing.” It certainly does not inspire the awe of “artificial intelligence” or convey the idea that machines can think like humans.
However, “complex information processing” is a good description of what artificial intelligence really is: parsing complex data sets and drawing conclusions from them. Modern examples of AI include speech recognition (in the form of virtual assistants such as Siri or Alexa) and systems that determine what is in a photograph or what you might want to buy or watch next. None of these examples are comparable to human intelligence, but they show that we can do remarkable things with enough information processing.
Whether we refer to this field as “complex information processing” or “artificial intelligence” (or, more ominously, the Skynet-sounding “machine learning”) does not matter. A tremendous amount of work and human ingenuity has gone into building some absolutely incredible applications. See, for example, GPT-3, a deep learning model for natural language that can generate text that is often indistinguishable from human writing (albeit sometimes hilariously incorrect). It is powered by a neural network that uses roughly 175 billion parameters to model human language.
Built on top of GPT-3 is a tool called Dall-E, which generates an image of whatever the user requests, however fantastical. The updated 2022 version, Dall-E 2, goes even further in that it can “understand” abstract styles and concepts. For example, asking Dall-E 2 to visualize “an astronaut riding a horse in the style of Andy Warhol” produces several images such as:
Dall-E 2 does not do a Google search to find a matching image; it creates the image from its internal model. This is a brand-new picture made from nothing but math.
Not all applications of AI are as groundbreaking as this. AI and machine learning are finding uses in every industry, from recommendation engines in retail, to pipeline safety in oil and gas, to diagnostics and patient privacy in healthcare. Not every company has the resources to create tools like Dall-E from scratch, so there is strong demand for affordable, attainable tooling. The challenge of filling that demand parallels the early days of business computing, when computers and computer programs were rapidly becoming the technology businesses needed. While not everyone needed to develop the next programming language or operating system, many companies wanted to harness the power of these new fields of study and needed similar tools to help them do it.