Page 45 - Cyber Defense eMagazine September 2023
Nevertheless, filtering will likely be necessary to discern any kind of signal in an ocean of noise. What tools might we use to try to compete with the various AI techniques that junk producers might employ?
This part of my prediction is less certain, but I still feel confident enough to make it publicly: the only filtering technology that can adapt to AI-based content generation must itself be AI-based. Rule-based systems require humans to articulate a solution after the problem has been articulated; but with generative AI, it is often impossible to articulate why things were done a certain way. The model learns, and what it has learned cannot be exported in legible form.
And thus, we have a problem that keeps morphing and cannot be legibly defined. The only toolkit with
any hope of solving it is machine learning.
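To make the contrast concrete, here is a minimal sketch of a learned filter, assuming scikit-learn is available. The training snippets, labels, and pipeline choices are all illustrative assumptions, not a production design; the point is that the model infers which patterns correlate with junk without anyone writing an explicit rule.

```python
# Toy learned text filter: the model extracts its own patterns from labeled
# examples, in contrast to a hand-written rule set.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented examples for illustration: 1 = junk, 0 = genuine.
texts = [
    "BUY NOW!!! Unbelievable secret trick doctors hate",
    "Click here for one weird tip to get rich fast",
    "Our quarterly report shows a 4% rise in cloud spending",
    "The patch fixes a buffer overflow in the TLS handshake",
]
labels = [1, 1, 0, 0]

# TF-IDF features plus a linear classifier; no human articulates the rules,
# so the learned weights are not legible in the way a rule list would be.
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(texts, labels)

print(clf.predict(["Limited time offer, act now to claim your prize"])[0])
```

Retraining on fresh examples is how such a filter tracks a morphing problem: the rules are never written down, only relearned.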
Low-hanging fruit
As with other kinds of filters, we are likely to encounter a Pareto distribution, whereby a small number of filters accounts for a large fraction of the filtering. The vast majority of bad content, whether created by humans or by large language models, could likely be filtered by relatively simple systems that focus on form, style, and other surface patterns. Each additional improvement in the filters will yield diminishing returns: more and more effort will be required to improve filtering capacity by a few percentage points.
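A "low-hanging fruit" filter of this kind can be very cheap. The sketch below checks only form and style; the thresholds and phrase list are invented for illustration and would need tuning against real data.

```python
# Cheap surface-level junk checks: shouting, spam punctuation, stock
# clickbait phrases. Thresholds and phrases are illustrative guesses.
CLICKBAIT = ("you won't believe", "one weird trick", "act now", "click here")

def looks_like_junk(text: str) -> bool:
    letters = [c for c in text if c.isalpha()]
    caps_ratio = sum(c.isupper() for c in letters) / max(len(letters), 1)
    lowered = text.lower()
    return (
        caps_ratio > 0.5                        # mostly SHOUTING
        or text.count("!") >= 3                 # spam punctuation
        or any(p in lowered for p in CLICKBAIT)  # stock clickbait phrasing
    )

print(looks_like_junk("CLICK HERE NOW!!! One weird trick"))    # True
print(looks_like_junk("The patch fixes a TLS handshake bug"))  # False
```

A handful of checks like these plausibly handles the bulk of obvious junk; everything subtler is where the diminishing returns begin.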
It might make sense, then, to focus our initial attention on the low-hanging fruit. Things that are obviously bad (like the examples I provided above) should obviously be filtered out. Complex disinformation campaigns, orchestrated by content creators with a large expenditure of resources, might need to be handled later, once the bottom 90% is already taken care of.
Generalizing to other spheres
In all likelihood, writers are not unique in their capacity for derivative and superficial work. As generative AI models improve, their use will likely generalize to audio, images, videos, and other forms of content that are created exclusively by humans today.
It's neither the end of the world nor utopia. Rather, we are entering an age of broken mirrors in which most content will be fake, junk, or fake junk, and we must develop new tools to find needles in these haystacks of bad information.
There’s an obvious necessity, which typically means a market.
And the market for content filters will likely be proportional in size to the market for generated content.
Copyright © 2023, Cyber Defense Magazine. All rights reserved worldwide.