It is also difficult to interpret analytically. As with the pixelation problem in remote-sensing technology, the finer the detail, the harder it can sometimes be to interpret. When the model is a large number of interacting multi-dimensional decision agents, sensors or social-media data points, what-if simulations are governed by many parameters, which makes their results unstable: highly dependent on starting conditions, on interactions between parameters, and so on. The attractiveness of the 1:1 model’s complexity becomes a liability when it comes to modelling cause and effect and interpreting results. Historically, scientists and philosophers have abstracted for good reason. It is difficult to understand the way the human mind thinks, for example, without a robust abstraction of the idea of morality, or of learning, or of human judgement. The multiplicity of ‘raw’ information in a digital twin model of a city leads us, ironically, backwards (or is it forwards?) towards black-box thinking, where we observe systemic behaviour without being sure of the contributions of individual variables and parameters.
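
To see why what-if results can hinge on starting conditions, a minimal sketch may help. The following Python toy (entirely hypothetical, not taken from this book) runs the same agent simulation twice with identical parameters and identical random draws; the only difference between the runs is a one-in-a-billion perturbation to a single agent’s starting value, yet the system-level outcome the two runs report can differ markedly.

import random

def run(seed_opinion: float, n_agents: int = 50, steps: int = 100) -> float:
    """Toy agent-based what-if simulation (hypothetical illustration).

    Each agent holds a value in [0, 1]; at every step it mixes its own
    value with the average of two randomly chosen neighbours, then a
    nonlinear (logistic-style) update amplifies small differences.
    """
    random.seed(42)  # same parameters and random draws for both runs
    agents = [0.5] * n_agents
    agents[0] = seed_opinion  # the only thing we vary between runs
    for _ in range(steps):
        nxt = []
        for a in agents:
            j = random.randrange(n_agents)
            k = random.randrange(n_agents)
            neighbour_avg = (agents[j] + agents[k]) / 2
            mixed = 0.5 * a + 0.5 * neighbour_avg
            # logistic map with r = 3.9: chaotic, so tiny deviations compound
            nxt.append(3.9 * mixed * (1 - mixed))
        agents = nxt
    # the 'systemic behaviour' an observer would record
    return sum(agents) / n_agents

# Two starting conditions differing by one part in a billion;
# the two printed system-level averages typically diverge substantially.
print(run(0.500000000))
print(run(0.500000001))

The nonlinearity is doing the work here: replace the logistic update with plain averaging and the two runs stay indistinguishable. That is exactly why highly detailed, interacting models resist the tidy one-variable-at-a-time analysis that simpler abstractions allow.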


Another sense in which more complexity leads to more black-box thinking is in the application of AI and advanced