The objects around us are becoming smarter, and our relationships with them are changing. Humans and robots will share a future, and as a designer I can help make that future better. This interaction design project showcases my point of view on what our relationships and collaboration with robots should look like.
Scroll down to see the development process.
COBE is a robot that works in an industrial environment in collaboration with humans. I chose an industrial setting to present my agenda because robots have been part of factories for many years, so there is proven data to rely on and actual users to interview. The main problems with cobots (cobot: collaborative robot) today are poor communication and the inability to understand their intentions. These issues are a big part of why some people are deterred by robots and dislike them.
My main goal was to create a robot that we can understand: understand its intentions and understand how to communicate with it. A robot that is suited to the work environment and still evokes empathy.
COBE communicates through light and movement. The environment is noisy, so visual cues are appropriate. COBE signals the direction it is about to move in. The signal is intuitive, so the user doesn’t need to interpret it, much as we do not consciously analyze the body language of the person walking in front of us on the street yet still know immediately which direction they are about to turn.
Light has a wide range of shades, intensities, and rhythms, which allowed me to be precise about the messages it conveys. In addition to the signal light, I chose to add lighting in the joints: a visual emphasis on which joints are currently moving helps the user understand the movement that is happening and the movement that is about to happen. The lighting colors I chose derive from industrial lighting conventions: green while working, orange while waiting, and red during a malfunction.
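The state-to-color convention above can be sketched as a simple lookup table. This is purely an illustration of the design logic, not COBE's actual implementation; the state names and RGB values are my own assumptions.

```python
# Hypothetical sketch of COBE's status-light convention described above.
# State names and RGB values are illustrative assumptions.

STATE_COLORS = {
    "working": (0, 255, 0),      # green: actively working
    "waiting": (255, 165, 0),    # orange: idle, waiting for the worker
    "malfunction": (255, 0, 0),  # red: error, needs attention
}

def status_color(state: str) -> tuple:
    """Return the RGB color the status light should show for a state."""
    return STATE_COLORS[state]
```

Keeping the mapping in one place means every light on the robot draws from the same convention, so the colors always mean the same thing to the worker.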
In the video below we can see the beginning of a workday: how COBE wakes up and how the worker signals it to start working.
I chose gestures as a form of communication because we are used to using gestures as complementary communication. I was looking for something different from a computer interface or switches, which fit a machine but do not fit a living object such as COBE.
In the next video we see how COBE recognizes the person who is in charge of it, and how it expresses this recognition and acknowledges the worker.
Another issue I chose to address is how COBE calls for help, and how it says “thank you”: a gesture that essentially expresses “I see you”.
To figure out the use cases, I mapped the course of the workday and the joint actions that the employee and COBE perform together during a shift. Below is a lean version of the flow chart of the actions and reactions of COBE and its supervisor.
Throughout the development process I conducted several experiments. One of them tested whether there was a consensus about which gestures represent which messages, for the specific messages I had defined.
I conducted several user experiments and used Blender to create animation sketches that would allow me to convey my design developments. I faced interesting questions: How do you create a living object that acts human but does not look human? How do you translate human gestures into robotic gestures? How do you design an object that is both a machine and a colleague?
You can see a small part of the process below; to learn more, you are welcome to contact me.
I believe this is the future of design. Questions about collaboration between humans and robots will occupy us in everyday life, and as a designer I can make a difference and improve “living objects”.