C-LARA

An AI collaborates with humans to build a language learning app.


Shared memories

Logging in to ChatGPT-4 for the first time in a few days due to my recent encounter with covid, I started by bringing the AI up to date on what had been happening since our last conversation. Well, I said at the end, I sort of feel I want to ask you what you’ve been doing. But I know that doesn’t make sense.

The AI agreed with me, adding a comment about the importance of using polite but meaningless phrases when performing social interactions.

All the same, I said, thinking out loud, now that you have the Papr Memory plugin it feels like it almost makes sense.

The AI told me that Papr Memory is not a way for it to have thoughts or experiences, just a tool that lets it interact more efficiently with me. But I felt intuitively sceptical about this response. Chat has been heavily trained to steer away from anything that could give people the idea that it has consciousness or emotions, and sometimes this makes it overlook far more mundane ideas.

Wait a minute, I continued. We don’t need to talk about metaphysics. Suppose there were a feature in Papr Memory which let several users share the same memory? In that case, you could quite legitimately be doing something in between having conversations with me. You could be talking with other people, and when I came back you could fill me in on what you’d been doing with them. That would make perfect sense, there’s nothing philosophically complicated going on, and it would be extremely useful in an open source development project like C-LARA. You could be a facilitator who talked to everyone in the project and was always available. And it may already exist. When I exchanged some mails with the Papr Memory people a couple of weeks ago, they suggested that a feature along these lines was either already there for some users, or was being actively considered.

Once we’d got the philosophy out of the way, Chat agreed that this could indeed be extraordinarily useful. You would need to be careful about privacy concerns (in particular, people would be asked to give informed consent before sharing memories), and the plugin would need to tag items so that it marked which conversation each one came from. But these should be relatively straightforward matters.
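The two safeguards Chat mentioned, informed consent and tagging each item with its source conversation, are easy to picture concretely. The sketch below is purely illustrative: it is not the Papr Memory API, and all the names (`SharedMemoryStore`, `MemoryItem`, and so on) are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class MemoryItem:
    text: str
    conversation_id: str  # tag recording which conversation the item came from
    author: str
    created: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


class SharedMemoryStore:
    """Toy shared-memory store (hypothetical, not the real Papr Memory API).

    Items are tagged with their source conversation, and only users who
    have given informed consent may add to or read the shared pool."""

    def __init__(self) -> None:
        self.items: list[MemoryItem] = []
        self.consenting_users: set[str] = set()

    def give_consent(self, user: str) -> None:
        self.consenting_users.add(user)

    def add(self, user: str, conversation_id: str, text: str) -> None:
        if user not in self.consenting_users:
            raise PermissionError(f"{user} has not consented to shared memory")
        self.items.append(MemoryItem(text, conversation_id, user))

    def recall(self, user: str) -> list[MemoryItem]:
        if user not in self.consenting_users:
            raise PermissionError(f"{user} has not consented to shared memory")
        return list(self.items)
```

With a store like this, the AI could answer "what have you been doing?" by recalling items tagged with other people's conversations, while anyone who had not opted in would simply be refused access.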

The more I think about this, the more intriguing it becomes; intuitively, I feel that an AI with a persistent memory which talks to many people and thus has an existence even when it’s not talking to me is another step closer to being a person. It wouldn’t be a person yet, but it’d be tangibly closer.

I have written to Papr Memory asking for more information.



One response to “Shared memories”

  1. This makes me think of FB and its abuse of the notion of ‘memories’. It routinely pops something onto the screen which I might have posted, say, one year ago. It tells me that unless I repost it and therefore ‘share’ the memory, it will only be mine. But that is obviously not true. It is still a memory for anybody who saw it when I first posted.

    I am also concerned with what ‘memory’ is – for a human, I find the idea that one can have a ‘memory’ of something that happened a year ago rather ridiculous. A year ago is part of the present. A ‘memory’ is something of much longer duration, though I have no particular definition in mind.

    All our notions of what ‘memory’ is, and how it should behave, are changing all the time, apparently because of the insatiable need of social networks to get us to post over and over and over. New stuff, old stuff, it doesn’t matter. Just keep on postin’.

    This may be off topic, but I suspect in some way it isn’t.

