Al-Khwarizmi 10 hours ago

They make LLMs play a very abstract game that rewards them with points for answering the same as the other agent, and penalizes them for answering differently, and the LLMs tend to converge on an answer. From that to "social conventions" there is a long, long stretch. The paper lacks a baseline - wouldn't much simpler (non-LLM) systems also exhibit the same property? And is it that surprising that systems that are clones of each other (they didn't even try "mixed societies" of different LLMs) agree when you give them points for agreeing?

Maybe I'm missing something but in my view this is pure hype and no substance. And note that I'm far from an LLM skeptic and I wouldn't rule out at all that current LLMs could develop social conventions, but this simulation doesn't really show that convincingly.
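The baseline the comment asks for is easy to sketch. Here is a hypothetical minimal version (the names, update rule, and parameters are all invented for illustration, not taken from the paper): memoryless non-LLM agents that are penalized for a mismatch and simply copy the other agent's answer, which is a classic voter-model dynamic, also converge to a single "convention":

```python
import random

def naming_game(n_agents=20, names=("A", "B", "C", "D"),
                rounds=20_000, seed=0):
    """Minimal non-LLM baseline: pairs of agents are rewarded for
    saying the same name and penalized for differing. The 'learning
    rule' is as dumb as possible: on a mismatch, the hearer simply
    adopts the speaker's name (a voter model on a complete graph)."""
    rng = random.Random(seed)
    prefs = [rng.choice(names) for _ in range(n_agents)]  # random start
    for _ in range(rounds):
        speaker, hearer = rng.sample(range(n_agents), 2)
        if prefs[speaker] != prefs[hearer]:
            prefs[hearer] = prefs[speaker]  # penalized, so switch
    return prefs

final = naming_game()
print(len(set(final)))  # the population typically collapses to one name
```

With enough rounds the number of distinct names in circulation drops to one, with no language model anywhere in the loop - which is the point of the objection.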

tbrownaw 9 hours ago

As obviously silly as this is, could it actually be useful for getting the observed phenomena acknowledged by people who might just tune out if presented with the underlying math that produces them?

How much overlap is there between people who think LLMs are magic and people who don't approve of applying math to groups of people? And how many in the overlap have positions where their opinions have outsized effects?

lostpilot 12 hours ago

Crazy - this is saying agents develop their own biases and culture through their engagement with one another even if the individual agents are unbiased.

  • Animats 5 hours ago

    That may be reading too much into this behavior. Watch this video of metronomes self-synchronizing.[1] That's a pervasive phenomenon. Anything with similar oscillation frequency and coupling will do this. (Including polling systems with fixed retry intervals.)

    Are you sure that's not just this effect?

    [1] https://www.youtube.com/watch?v=Aaxw4zbULMs

    • drdaeman 4 hours ago

      Yes, but don't cultures work essentially the same way - people grouped together get influenced by each other's actions and end up learning from each other (introducing bias into the individual agents), doing things together, appreciating similar stuff, talking in particular ways, and so on? AFAIK, "culture" essentially means "stuff that goes on in this particular space and time".

      • Animats 4 hours ago

        The article hypothesized leaders and followers. That's not necessary. Drift towards the mean, plus some noise, is sufficient.
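        A minimal sketch of that "drift towards the mean, plus some noise" mechanism (all names and parameters invented here for illustration):

```python
import random
import statistics

def drift_to_mean(n=50, steps=200, pull=0.1, noise=0.01, seed=1):
    """Each agent holds a scalar 'opinion'; every step it moves a
    fraction `pull` toward the population mean, plus Gaussian noise.
    No leaders or followers, yet the spread collapses."""
    rng = random.Random(seed)
    xs = [rng.uniform(-1.0, 1.0) for _ in range(n)]
    for _ in range(steps):
        m = statistics.fmean(xs)
        xs = [x + pull * (m - x) + rng.gauss(0.0, noise) for x in xs]
    return xs

xs = drift_to_mean()
print(round(statistics.pstdev(xs), 3))  # small residual spread, set by the noise
```

The standard deviation shrinks by a factor of (1 - pull) per step until the injected noise balances it, so the population clusters tightly around a shared value without anyone leading.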

amelius 10 hours ago

I wonder when we will see LLMs being used to test economic theories.

  • laughingcurve 9 hours ago

    Already exists in the current literature

    • tmpz22 7 hours ago

      And much of the current writing

dgfitz 14 hours ago

It’s funny how we seem to be on this treadmill of “tech that uses GPUs to crunch data” starting with the Bitcoin thing, moving to NFTs, now LLMs.

Wonder what’s next.

ggm 6 hours ago

So how many systems aside from Grok started to espouse Afrikaner propaganda, and how many systems aside from Meta's started to be Holocaust deniers?

Or are the walled gardens working to our advantage here?

th0ma5 15 hours ago

Oh, I thought this was going to be about the cult-like in-group signaling of LLM advocates, but this is a thing imagining language patterns as a society, instead of the language patterns of a society being a bias that the models have.

  • A4ET8a8uTh0_v2 11 hours ago

    This sounds dismissive, but it is interesting in two more obvious ways:

    1. The interaction appears to mimic human interaction online (trendsetter vs. follower).

    2. It shows something some of us have been suggesting for a while: pathways for massive online manipulation campaigns.