We’ve been arguing about realisation and instantiation, but from your discussion here, and posts by Mick and others on the machine ‘fooling’ us into interpreting its texts as human-like, the issue is actually individuation. It can appear to negotiate affiliation, but the machine itself is not affiliated with any scale of community. Only perhaps the people who program it and feed it text corpora are so affiliated, and the users focusing its tasks. I think this is crucial because we can use and grow our individuation toolset to analyse this dimension. As I think you are suggesting, our realisation and instantiation tools are inadequate on their own for describing it. That is why all I can see when I look at its texts is instantiation of systems at each stratum. I am told this is wrong but I haven’t been given any textual evidence against it – only the authority of its designers and their community. When the machine tells us

I don't have a subjective experience or consciousness that allows me to perceive or interpret those inputs in the same way that a human might.

… it’s talking about individuation.
Blogger Comments:
[1] To be clear, Martin has proposed two models of individuation, one in which meaning potential is individuated (derived from Bernstein), and one in which meaners are individuated (derived from his student Knight). Neither model applies to ChatGPT because its texts are not instances of a system of meaning potential individuated through ontogenesis. Instead, ChatGPT uses the collective ("unindividuated") lexical collocation probabilities derived from millions of instances (texts) to produce new instances. In Bernstein's terms, ChatGPT is not a 'repertoire of potential' but a 'reservoir of instances'. ChatGPT is thus not an individuated meaner producing texts as instances of an individuated system of potential.
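The mechanism described above — producing new instances from pooled collocation probabilities rather than from any individuated system of potential — can be sketched in miniature. The following is a toy bigram model, not ChatGPT's actual architecture (which uses a neural network over far richer contexts); the corpus and all names here are illustrative assumptions. The point it demonstrates is that generation draws on counts pooled across *all* texts, Bernstein's 'reservoir of instances', with no individual repertoire anywhere in the process.

```python
# Toy sketch (NOT ChatGPT's architecture): generating a "new instance"
# purely from collocation probabilities pooled across a corpus.
import random
from collections import defaultdict

# Illustrative corpus standing in for "millions of instances (texts)".
corpus = [
    "the machine produces text",
    "the machine predicts the next word",
    "the next word follows the probabilities",
]

# Pool bigram counts across all texts: a collective 'reservoir of
# instances', not any one meaner's 'repertoire of potential'.
counts = defaultdict(lambda: defaultdict(int))
for text in corpus:
    words = text.split()
    for a, b in zip(words, words[1:]):
        counts[a][b] += 1

# Generate a new instance by repeatedly sampling a continuation in
# proportion to the pooled collocation counts.
rng = random.Random(0)
generated = ["the"]
for _ in range(4):
    followers = counts[generated[-1]]
    if not followers:
        break  # dead end: this word was never seen with a continuation
    options, weights = zip(*followers.items())
    generated.append(rng.choices(options, weights=weights)[0])

print(" ".join(generated))
```

Every adjacent word pair in the output is guaranteed to be a collocation attested somewhere in the pooled corpus, which is precisely why such output can 'fool' a reader while instantiating no individuated system.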
[2] To be clear, the underlying principle of affiliation is different from that of individuation, but Martin confuses the two in his Knight-derived model. Where individuation is a hyponymic taxonomy (an elaboration of types), affiliation is a meronymic taxonomy (a composition of parts). The affiliation model does not apply to ChatGPT because it applies to individuated meaners producing texts as instances of an individuated system, and as demonstrated above, ChatGPT is not an individuated meaner producing texts as instances of an individuated system.