The Office of the Director of
National Intelligence in Washington defines Open Source Intelligence as being “publicly available information that
appears in print or electronic format, including radio, television, newspapers,
magazines, Internet, commercial databases, and video, graphics and drawings.”
The important term here is “publicly available”, given that banks operate in
a highly regulated environment.
The amount of digitised
content that can be captured on the surface web and the deep web is growing
exponentially. Just as importantly, the
amount of information on past events is also exploding. Imagine an
investigation into a client carried out in 2011 and the amount of relevant
information that was “openly” available for analysts to capture at that time;
today a multiple of that amount exists for that same year. Not only is current
information being digitised but, importantly for the investigator, past
information is being digitised too, making discovery all the more interesting.
Yet most commentators cite the data deluge and information overload as a
massive problem: not if you know how to build your filters!
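The idea of “building your filters” can be made concrete with a minimal sketch. Everything below is hypothetical and invented for illustration (the keyword lists, the sample documents, the `keep` helper); real monitoring systems use far richer scoring, but the include/exclude principle is the same:

```python
import re

# Hypothetical include/exclude keyword rules -- invented for this sketch,
# not taken from any real monitoring product.
INCLUDE = re.compile(r"\b(discovery|hydrocarbon|drilling|flare)\b", re.I)
EXCLUDE = re.compile(r"\b(advert|sponsored|horoscope)\b", re.I)

def keep(document: str) -> bool:
    """Keep a document only if it matches an include rule and no exclude rule."""
    return bool(INCLUDE.search(document)) and not EXCLUDE.search(document)

# Invented sample documents standing in for a captured web feed.
docs = [
    "Operator reports hydrocarbon discovery at offshore well",
    "Sponsored content: ten holiday deals",
    "Quarterly report on grain futures",
]
relevant = [d for d in docs if keep(d)]
```

The point of the sketch is that the “deluge” is tamed before analysis ever starts: only documents passing the filter reach the analyst.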
In the example of Total’s
near-disaster on the Elgin Platform, the content was delivered as text and
hence was easily processed and searched.
By definition “open source” means any format of information that can be
imagined. This causes problems for investigators who are more used to thinking
(and investigating) within a predefined concept: they think within the box and
need to be shown how to “think outside of the box” and sometimes how to think
without any box being present.
Today the question is not whether a certain item of information exists, but rather in what format it exists, or where one can find collateral information that points towards it.
Apologies for using another
example from the energy industry, but it fits perfectly. A few years ago a
hedge-fund client asked if it was possible to build an alert system that would
track new oil or gas discoveries being made by companies that had shares quoted
on the stock market, getting that information before any formal (regulated)
announcement by the company. After a couple of days of thinking and
investigating, we informed the client that a monitoring system could be put
together that would alert him within 24 hours of any discovery anywhere in the
world – possibly before the company’s own head office might receive the news.
The client’s immediate assumption was that we planned to hack the internal
communication systems of the relevant companies: living and working in a highly
regulated industry, that obviously was not the case. The next assumption was
that we knew how to put a large number of people on rubber dinghies or camels,
close by the drilling rigs.
The reality was far simpler
and infinitely more elegant. Whenever a drilling rig hits hydrocarbons, i.e.
makes a discovery, there is what is called “associated gas”. It doesn’t matter
whether it is an oil discovery or a gas discovery; there will always be gas
present, and this gas needs to be burnt off in a controlled manner. That action
of burning off the associated gas would generate considerable heat, and that
heat could be picked up by any commercial satellite that was trained on a
specific position looking for a heat signature with its infra-red cameras. Thus
in answer to the client’s query: with the geographic coordinates of every
drilling rig or drill-ship (from the Lloyd’s shipping register) and by hiring
capacity on a number of commercial satellites, one could set up a very simple
tracking system that would deliver the required intelligence within the
suggested 24 hours. Notionally the initial request
would have been impossible to answer using legal methods; but by “thinking
outside of the box” and thinking through how such information might be
generated, it was possible to deliver a solution – albeit overly costly for
that client.
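The matching step at the heart of such a system – does an infra-red hotspot sit close to a known rig position? – is simple enough to sketch. The rig names, coordinates and detections below are invented for illustration; a real system would ingest register data and satellite feeds, but the geometry is just a great-circle distance check:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * asin(sqrt(a))  # mean Earth radius ~6371 km

# Hypothetical rig positions, as might be taken from a shipping register.
rigs = {"RIG-A": (57.0, 1.5), "RIG-B": (28.0, -89.0)}

# Hypothetical infra-red hotspot detections from satellite imagery.
detections = [(57.02, 1.48), (10.0, 10.0)]

# Alert on any rig with a hotspot within 5 km of its recorded position.
alerts = [
    name
    for name, (rlat, rlon) in rigs.items()
    for (dlat, dlon) in detections
    if haversine_km(rlat, rlon, dlat, dlon) < 5.0
]
```

Everything else – tasking the satellites, refreshing rig positions, pushing the alert to the client within 24 hours – is plumbing around this one comparison.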