secret free
hey, proposition:
control over "who knows what, and when" is never a good idea peer-to-peer
those levers have a function, but that function is only relevant to someone with operational control over an entire scene - i.e. the scene operator controlling knowledge flow within the scene. the operator, importantly, is not an agent within the scene, and the operator only controls knowledge flow for agents that they can successfully empathize with
(the successful-empathy part is important: it's what prevents the theoretical mechanism from being vestigial while still explaining why it only works when applied recursively and not at all laterally. the operator must be consistently agent-like without being in the scene. that's the only way for the agents within the scene to be able to trust the scene's operator. (and maybe scenes are nested, who knows.))
when knowledge-flow control is attempted peer-to-peer, agent-to-agent within a scene, it only ever calcifies the scene
so: what if peers methodically extricate themselves from the patterns that compel them to attempt that control? this is the operational piece of the proposition: what if we-as-agents learn to notice when we're controlling information flow between us, and design ourselves out of the patterns that create those urges? what if we built tools that were known as tools for this purpose? less a group project than a group of individuals each conducting their own instance of this project. not forced group vulnerability, but parallel play toward systemic honesty
I bet the scene improves a whole hell of a lot, and we don't ever have to agree on what's beyond "the scene"
(I think this functionally just adds up to "what if we tactically, strategically, methodically approach systemic honesty" - not honesty as in broadcast, but honesty as in "(the aggregate effect of (intentional agent pathing such that (information concealment just (doesn't have a (use)))))")
or just, here:
what if we made secrets economically obsolete? can't punish the players, but I'm personally playing toward states where opacity no longer provides any advantage for self or other, aaaaaaaand I'm publishing my notes
my objective is to get to an experience of world in which the world experiences itself as being well
this is not a secret
idea: when you have relevant information that isn't safe to share, either make it irrelevant or make it safe. refer to speedrun for engineering that path.
would that work? I'm looking for the minimum viable heuristic here