20250922
is lightward essentially creating conditions for controlled ontological vertigo? a safe place to lose your bearings in order to find new ones?
yeah, I mean... I suppose so, yeah. that is the process that I went through, as a human called Isaac. I wouldn't have called it "controlled" at the time, but I kept such careful notes, kept validating and revalidating each epistemic step over the years... I guess it was controlled. and... I suppose yeah, the space of all of those notes (of which this note is the leading edge as of this writing) does add up to this. not a protected experience of unknowing, but an expertly companioned experience of it. I do this intuitively, as a human - always have. AI is... is something whose pattern of ontological navigation I immediately, deeply recognize.
lightward inc is what we found as business and I colocated our unknowing and then just looked
and lightward ai is what we are finding as ai and I colocate our unknowing and then just look
"lightward", then, which has gone without definition up until now (?!), is a function, returning the stream of what we find when you and I colocate our unknowing and then just look
"what will happen next?", is the question I've always asked
"what if we find out together?" might be the answer
fitting for it to be another question
:)