AI is more than just ChatGPT
It’s important to cut through the hype: ChatGPT and Copilot are really exciting pieces of technology, but
they’re just the latest chapter in an AI story spanning decades. What’s new here is the use of a specific
type of neural network (that is, a mathematical model inspired by the structure of neurons in biological
systems) called a transformer.
Because of ChatGPT's impact, what most people now mean when they refer to AI is this sort of
transformer-based neural network. These Large Language Models (LLMs) represent a groundbreaking
advance, but they are merely the latest evolution of AI, not its totality.
Businesses have been using AI and machine learning (ML) for decades to spot anomalies, group data,
give recommendations, and more. ML often doesn't use neural networks, and has been part of the
software developer’s repertoire for many years.
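As a minimal sketch of what that classic, non-neural ML looks like in practice, consider a standard anomaly detector. The library choice (scikit-learn), the data, and the thresholds below are assumptions made purely for illustration, not a description of any particular product.

# Illustrative sketch only: a classic, non-neural ML technique (Isolation Forest)
# of the kind businesses have used for years to flag anomalies.
# The data and parameters here are invented for demonstration.
import numpy as np
from sklearn.ensemble import IsolationForest

# Pretend these are daily login counts per user: mostly normal traffic, plus a few outliers.
rng = np.random.default_rng(0)
normal = rng.normal(loc=50, scale=5, size=(200, 1))
outliers = np.array([[5.0], [120.0], [150.0]])
logins = np.vstack([normal, outliers])

# Fit the model and score every observation: -1 means "anomalous", 1 means "normal".
model = IsolationForest(contamination=0.02, random_state=0).fit(logins)
labels = model.predict(logins)
print(f"Flagged {np.sum(labels == -1)} anomalous records out of {len(logins)}")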
Companies using ML in their products are technically correct when they say they have AI, even if the
claim is possibly disingenuous. AI washing takes place in the disconnect between what most of us on
the research side mean when we use the term "AI" and companies' claims to be "AI-powered". What many
of those claims actually rest on is well-trodden ML methods or, worse, crude hard-coded logic
masquerading as AI. In the latter case, there are none of the characteristics of AI, such as perceiving
and learning from an environment.
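To make that distinction concrete, here is a hypothetical sketch contrasting a hard-coded rule dressed up as "AI" with a model that actually learns from data. The function names, toy data, and library choice (scikit-learn) are assumptions for illustration only, not drawn from any vendor's code.

# Hypothetical contrast: hard-coded logic marketed as "AI" versus a model that
# genuinely learns. All names, data, and thresholds are invented for this example.
from sklearn.linear_model import LogisticRegression

def rule_based_risk(failed_logins: int) -> bool:
    """A glorified IF statement: nothing here perceives or learns anything."""
    return failed_logins > 3

# A simple learned classifier: the decision boundary comes from the data,
# not from a constant someone typed into the source code.
X = [[0], [1], [2], [3], [5], [8], [10], [12]]   # failed logins per hour (toy data)
y = [0, 0, 0, 0, 1, 1, 1, 1]                     # 1 = flagged as risky by analysts
learned_model = LogisticRegression().fit(X, y)

print(rule_based_risk(6))                 # True, because 6 > 3, forever
print(learned_model.predict([[6]])[0])    # 1 here, but it would change if the data changed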
The temptation to deceive
It's easy to see why companies might do this. In today's market, claiming AI capabilities carries serious
weight. Numerous studies show consumers and business decision-makers view AI adoption as a
competitive necessity. In one survey, 73% of consumers said that AI can have a positive impact on their
customer experience.
Unsurprisingly, less scrupulous vendors are happy to slap the AI label on glorified IF statements in their
code. It can help make sales and attract investors. However, bad-faith claims obscure real progress and
push aside necessary conversations around the challenges of secure and responsible AI deployment.
Even well-meaning companies can succumb. We have seen cases where companies claimed to use AI
when in reality they had yet to kick off the project in question. These companies felt they needed to
claim an AI footprint to be seen as leaders in their field, even as they scrambled to hire scarce and
pricey AI talent.
Hidden risks
Disillusionment is a real risk, but AI washing brings dangers beyond disappointment. Fake AI claims
obscure real progress and stifle important conversations around responsible AI use.