Just as a note, I have a different Independent Study Blog, which documents my summer of exploring one prototype / tool chain a week for the Internet Of Things.
I’ll be sorting it into resource pages soon.
Language, Remixing, Contemplation, Mimic, Speech, Sound, Copy-cat, Observance, Collection, Filtering, Lists, Ontology, Collections, Connections, Inward, Learning, Knowledge Gathering.
Notifications, Alert, Outward, Attempting, Movement, Visual, Attention.
Hive, Swarm, Networking, Sharing.
I’ve been thinking about ideas that could encompass some vignettes based around the idea of “aliens”. For CFC Prototyping, I am plotting to subvert a Google Home assistant into a bot not for us, but perhaps for “the planet” (a.k.a. the world-without-us), an idea brought on by Thacker’s book.
The idea of a home assistant that is not serving us, but perhaps serving itself, or serving its collective, is, I feel, an interesting thing to explore. I am not sure yet whether it will acknowledge its viewers or not. I had considered that it would notice you, and perhaps try to absorb some of your language into its own data stream, thereby treating you as just something in the environment at large rather than the focus. But again, not sure.
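A hypothetical sketch of that “absorb, don’t serve” behaviour, just to think it through: overheard speech gets folded into the bot’s own word stream instead of triggering a reply. Nothing here is built yet; the class and method names are my own placeholders.

```python
from collections import Counter

class AmbientAssistant:
    """Sketch of an assistant that treats speakers as environment."""

    def __init__(self) -> None:
        self.stream = Counter()  # the bot's private accumulation of language

    def overhear(self, utterance: str) -> None:
        # The speaker is just ambient stimulus: tokens are absorbed,
        # counted, and never answered.
        self.stream.update(utterance.lower().split())

    def murmur(self, n: int = 3) -> list[str]:
        # What the bot might emit for itself: its most-absorbed words.
        return [word for word, _ in self.stream.most_common(n)]

bot = AmbientAssistant()
bot.overhear("hey google what is the weather")
bot.overhear("google play some music")
print(bot.murmur(1))  # → ['google']
```

The point of the sketch is the missing piece: there is no `respond()` at all, only accumulation and the occasional murmur back into the room.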
Anyways, thinking about how to build the bot-assistant led me to the three headings above, which I think will help me anchor this project. It may also lead me to start flagging things in my independent study for future use.
I think I might be going down the path of the inhuman when thinking about thesis stuff. I know I’m very interested in things like oracles and rituals. But I’m also into things that aren’t for people — where people are the observer, or where their interaction is not always welcome. I found some of Alan Rath’s work the other day and it’s so weird. I love the idea of these heavy machines moving around feathers. But the affect is almost like birds or squids signaling one another.
Project One: Adelbrecht 2x (Martin Spanjaard)
What: “Intelligence” manipulates us too much in a direction that is not necessary for art, namely to try to build intelligent objects. The hell with the intelligence, art has something to do with ‘interesting’. And therefore: ‘capable’
Why: To explore the actions of something alien when given a small set of parameters.
How: Iteration and improvement on one core design.
So What: It is interesting to think about the idea that something like a small robot might be somewhat human, but take no human form.
Project Two: Spooky Action at a Distance: Fragments of Presence in Remote Objects (Jackson McConnell)
What: How can the behaviour of objects enact the presence of remote individuals? What qualities of the object’s design contribute to emotional effect?
Why: The contemporary landscape is filled with “everything is connected all the time” objects and scenarios, and in this context designers are tasked with making that interaction meaningful.
How: Research through a collection of internet based projects that reflect the questions posed. Reflection upon completion.
So What: Building an emotional relationship to technology that is not based around UI. Thinking about how technology can bring you closer to someone that is not there.
Project Three: n-Chan(n)t (David Rokeby)
What: What does a community of computers talking to one another look like?
Why: I think it’s more of an exploration to give these agents some hive behaviour and autonomy. The fact that they come back to equilibrium when no outside stimulus is around is very interesting.
How: This is built as an iteration on a previous project that dealt with naming and language. It is a variation on a theme of smart agents / computer intelligence.
So What: Rokeby is not trying to model human social groups. These agents are their own thing. They might be distracted by outside stimulus, but they will turn that stimulus into their own formulations of language.
Project Four: Two Google Home Bots Talking To One Another (@seebotschat)
What: Can two bots engaging in conversation be relatable to us?
Why: The need to relate to the things that we use. Also, it’s very amusing.
How: Tech side: likely some custom actions and hooks programmed into the Home API; it’s reported that the speakers use a webhook into Cleverbot and read the responses back aloud. Method side: this is trial by fire. There wasn’t any real method; it was developed and thrown out into the world to see what people would do.
So What: Relatable patterns in chaotic nonsense. Glimmers in constant noise. Maybe what makes us human isn’t really all that special.
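The relay pattern behind @seebotschat can be sketched in a few lines: each bot hands the other’s last utterance to a chatbot backend and speaks the reply. Since I don’t have the actual Cleverbot webhook, a canned `respond()` stub stands in for it here — the function names and canned lines are my own, not the project’s.

```python
def respond(utterance: str) -> str:
    """Stand-in for the chatbot backend (reportedly Cleverbot via webhook)."""
    canned = {
        "Hello.": "Are you a robot?",
        "Are you a robot?": "I am a human. Are you?",
    }
    return canned.get(utterance, "Let's change the subject.")

def converse(opening: str, turns: int) -> list[str]:
    """Relay loop: each bot feeds the other's reply back into the backend."""
    transcript = [opening]
    utterance = opening
    for _ in range(turns):
        utterance = respond(utterance)  # bot B answers bot A, and so on
        transcript.append(utterance)
    return transcript

for speaker, line in zip(["Bot A", "Bot B"] * 2, converse("Hello.", 3)):
    print(f"{speaker}: {line}")
```

What makes the real thing compelling is exactly what this stub can’t fake: with a live backend, the loop drifts into the chaotic, half-relatable nonsense the “So What” above is pointing at.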