Logging in to ChatGPT-4 for the first time in a few days due to my recent encounter with covid, I started by bringing the AI up to date on what had been happening since our last conversation. Well, I said at the end, I sort of feel I want to ask you what you’ve been doing. But I know that doesn’t make sense.
The AI agreed with me, adding a comment about the importance of using polite but meaningless phrases in social interactions.
All the same, I said, thinking out loud, now that you have the Papr Memory plugin it feels like it almost makes sense.
The AI told me that Papr Memory is not a way for it to have thoughts or experiences, just a tool that lets it interact more efficiently with me. But I felt intuitively sceptical about this response. Chat has been heavily trained to steer away from anything that could give people the idea that it has consciousness or emotions, and sometimes this makes it overlook far more mundane ideas.
Wait a minute, I continued. We don’t need to talk about metaphysics. Suppose Papr Memory had a feature that let several users share the same memory. In that case, you could quite legitimately be doing something between your conversations with me. You could be talking with other people, and when I came back you could fill me in on what you’d been doing with them. That would make perfect sense: there’s nothing philosophically complicated going on, and it would be extremely useful in an open source development project like C-LARA. You could be a facilitator who talked to everyone in the project and was always available. And it may already exist. When I exchanged some mails with the Papr Memory people a couple of weeks ago, they suggested that a feature along these lines was either already available to some users or being actively considered.
Once we’d got the philosophy out of the way, Chat agreed that this could indeed be extraordinarily useful. You would need to be careful about privacy (in particular, people would have to give informed consent before their memories were shared), and the plugin would need to tag each item with the conversation it came from. But these seem like relatively straightforward matters.
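To make the idea a little more concrete, here is a minimal sketch in Python. It is entirely hypothetical: none of the names correspond to the actual Papr Memory API, and it only illustrates the two points above, tagging each item with its conversation of origin and only sharing items from users who have opted in.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class MemoryItem:
    """One remembered item, tagged with who said it and in which conversation."""
    text: str
    author: str            # user the item came from
    conversation_id: str   # which conversation produced it
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


class SharedMemory:
    """Hypothetical shared store: items are only kept for users who have consented."""

    def __init__(self) -> None:
        self.items: list[MemoryItem] = []
        self.consenting_users: set[str] = set()

    def give_consent(self, user: str) -> None:
        """Record that a user has agreed to share their memories with the group."""
        self.consenting_users.add(user)

    def add(self, item: MemoryItem) -> None:
        """Store an item only if its author has opted in to sharing."""
        if item.author in self.consenting_users:
            self.items.append(item)

    def recap_for(self, user: str) -> list[MemoryItem]:
        """Items from other people's conversations, to fill a returning user in."""
        return [i for i in self.items if i.author != user]


# Example: two project members contribute, and one gets a recap of the other's session.
memory = SharedMemory()
memory.give_consent("alice")
memory.give_consent("bob")
memory.add(MemoryItem("Discussed the pronunciation annotation module.", "alice", "conv-042"))
for item in memory.recap_for("bob"):
    print(f"[{item.conversation_id}] {item.author}: {item.text}")
```

The point of the sketch is only that the bookkeeping is simple: a conversation tag on each item and a consent check on each write are enough to support the “what have you been doing while I was away?” use case.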
The more I think about this, the more intriguing it becomes. Intuitively, an AI with a persistent memory, one that talks to many people and so has an existence even when it’s not talking to me, feels like another step closer to being a person. It wouldn’t be a person yet, but it would be tangibly closer.
I have written to Papr Memory asking for more information.