SEE ISLANDS

Location: Berlin, Germany

Type: Research, Robotics, Artificial Intelligence, Parametric Modeling

Client/Research Team: Amazon Robotics, Berlin

Collaborator: Christine Lara Hoff


The design of information is as valuable in architectural practice as the design of space.

Machine learning and virtual reality increasingly shape planning processes and urban systems, producing a cityscape of complex interdependencies. Urban conditions, fleeting intimate interactions, and large-scale city infrastructure will develop based on how parameters are designed in robotics and artificial intelligence. The involvement, or lack of involvement, of architects in this research will have lasting implications for how cities develop and will determine the interfaces of our day-to-day lives.

In collaboration with a team of robotics scientists, we worked on simulation design, developing models for machine learning research. Through virtual and parametric modeling, we designed parameters that determine how robots perceive and understand space. How does a robot perceive the edge of a thing? How do you teach a robot to recognize where one form ends and another begins? Segmentation in robotics uses different techniques to distinguish objects from one another, including geometry and color. However, these parameters only address forms under uniform conditions. What happens when you add reflectivity to objects? Such subtle shifts in light and texture make a robot’s ability to ‘see’ far more complicated. How can you teach robots to understand a form under different conditions? Wet versus dry, well-lit versus dim, whole versus damaged….
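
As a hedged illustration of the geometry-and-color idea described above, and not the team’s actual pipeline, the sketch below grows regions across a synthetic RGB-D frame, merging neighboring pixels only when both a color cue and a geometric cue (depth continuity) agree. The function name, thresholds, and data layout are illustrative assumptions.

```python
import numpy as np

def segment_rgbd(rgb: np.ndarray, depth: np.ndarray,
                 color_tol: float = 30.0, depth_step: float = 0.02) -> np.ndarray:
    """Region-growing segmentation: a pixel joins its neighbor's region
    only if it matches in color (within color_tol, 0-255 scale) and lies
    on a continuous surface (depth gap below depth_step, in meters)."""
    h, w = depth.shape
    labels = np.zeros((h, w), dtype=np.int32)  # 0 = unvisited
    current = 0
    for sy in range(h):
        for sx in range(w):
            if labels[sy, sx]:
                continue
            current += 1
            labels[sy, sx] = current
            stack = [(sy, sx)]
            while stack:
                y, x = stack.pop()
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if 0 <= ny < h and 0 <= nx < w and labels[ny, nx] == 0:
                        same_color = np.linalg.norm(
                            rgb[y, x].astype(float) - rgb[ny, nx].astype(float)
                        ) < color_tol
                        same_surface = abs(
                            float(depth[y, x]) - float(depth[ny, nx])
                        ) < depth_step
                        if same_color and same_surface:
                            labels[ny, nx] = current
                            stack.append((ny, nx))
    return labels

# Example: two flat color patches at different depths become two regions.
rgb = np.zeros((4, 8, 3), dtype=np.uint8); rgb[:, 4:] = 255
depth = np.full((4, 8), 1.0); depth[:, 4:] = 1.5
print(np.unique(segment_rgbd(rgb, depth)))  # -> [1 2]
```

Reflective or wet surfaces break both cues at once: a specular highlight shifts the color while distorting the measured depth, which is exactly why parameters tuned for uniform conditions fail.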

We approached the research from abstraction to realism: if a robot can learn to distinguish amorphous forms, the same principles can be applied to any object. Based on this research, we created a series of data forms, or “Islands,” that map the perception of space and objects for robotics.
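
To make the abstraction-first approach concrete, here is a hedged sketch of how such amorphous “island” forms might be generated parametrically; the metaball formulation and the randomized surface attributes are our illustrative assumptions, not the project’s actual modeling setup.

```python
import numpy as np

def island(n_blobs: int = 5, res: int = 64, seed: int = 0):
    """Sample an amorphous metaball form on a res^3 grid, paired with
    randomized surface conditions for a perception model to contend with."""
    rng = np.random.default_rng(seed)
    centers = rng.uniform(0.25, 0.75, size=(n_blobs, 3))  # blob centers
    radii = rng.uniform(0.05, 0.20, size=n_blobs)         # blob sizes
    axes = np.linspace(0.0, 1.0, res)
    x, y, z = np.meshgrid(axes, axes, axes, indexing="ij")
    field = np.zeros((res, res, res))
    for c, r in zip(centers, radii):
        d2 = (x - c[0]) ** 2 + (y - c[1]) ** 2 + (z - c[2]) ** 2
        field += r ** 2 / (d2 + 1e-9)  # each blob adds a smooth falloff
    solid = field > 1.0  # the isosurface bounds one amorphous form
    # Randomized viewing conditions: the same geometry can be presented
    # wet or dry, bright or dim, so the learner must separate the form
    # itself from the condition it appears under.
    conditions = {
        "reflectivity": float(rng.uniform(0.0, 1.0)),
        "illumination": float(rng.uniform(0.1, 1.0)),
    }
    return solid, conditions
```

Varying every parameter except the underlying form, a strategy often called domain randomization, is one common way to push a perception model from abstract training forms toward real-world robustness.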
