IndaPlant Community Goes Live Each Day From 10:00am to Noon Via Webcam

I am thrilled to report that our current community of three IndaPlants (IPs) from the IndaPlant Project: An Act of Trans-Species Giving went live via webcam during an exhibition at the Association of Environmental Science Studies in New York this June. Visitors to the gallery at AESS were able to watch the floraborgs (part-plant/part-robot entities that use machine learning to locate sunlight and water) navigate the hallways of the School of Engineering at Rutgers University from 10:00am to noon each day.

One of the interesting aspects of this project is that the IPs have become part of the daily routine at Rutgers University. When my collaborator on the project, Dr. Qingze Zou, comes to work in the morning, he is greeted by the IndaPlants, which jostle with one another to exit his office in search of sun in the adjacent hallway. When an IndaPlant is thirsty, a moisture sensor sends a signal to the unit's central processor, which decides whether its plant species needs water. If so, the unit locates a water dispenser in the hallway via an infrared sensor. If a floraborg is in the immediate vicinity of the watering station, passers-by are invited to give the plant a drink.
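The thirst-and-seek behavior described above can be sketched as a simple decision loop. This is a minimal illustration, not the project's actual firmware: the sensor functions, moisture threshold, and beacon-finding call are all hypothetical stand-ins for hardware the floraborgs would provide.

```python
# Hypothetical sketch of an IndaPlant watering cycle. Sensor readings,
# the threshold value, and the infrared locator are illustrative
# assumptions, not the project's real control code.

MOISTURE_THRESHOLD = 0.30  # assumed per-species minimum soil moisture (0.0-1.0)

def read_moisture_sensor():
    """Stand-in for the unit's soil moisture sensor; returns 0.0-1.0."""
    return 0.22  # fixed demo value: a thirsty plant

def infrared_locate(target):
    """Stand-in for infrared beacon detection; returns a heading in
    degrees if the target is visible, or None if it must keep searching."""
    return 45.0 if target == "water_dispenser" else None

def watering_cycle():
    """One pass of the decision loop: check thirst, then seek water."""
    moisture = read_moisture_sensor()
    if moisture >= MOISTURE_THRESHOLD:
        return "hydrated"            # no action needed this cycle
    heading = infrared_locate("water_dispenser")
    if heading is None:
        return "searching"           # wander until the beacon is found
    # Drive toward the dispenser and wait for a passer-by to pour water.
    return f"heading {heading:.0f} degrees to dispenser"

print(watering_cycle())
```

In a real unit this loop would run continuously on the central processor, with the return values driving the motors rather than being printed.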

My primary interest in creating this art piece lies in the poetic implications of turning an immobile houseplant—which is completely dependent upon human largesse and care—into a free agent. The project has, however, grown to address the relationship of the built to the natural world. The work has led to the synergistic creation of a larger team and what may be a truly significant scientific breakthrough in communicating with plants about the nature of our shared environment. In addition to myself and Dr. Zou, the IndaPlant team now includes the biologist Dr. Simeon Kotchoni and the computer scientist Dr. Ahmed Elgammal. With these joint capabilities, our group is now working on the creation of a floraborg biocyber interface. By tapping the super-sensory capacities of plants, this interface would allow humans to decipher plant-based information on ecosystem health and the effects of climate change and air pollution. In this capacity, a super-sensory IndaPlantV2 (IPV2) may allow us to model and support environments that are able to sustain humans and plants alike.

The project is currently up for multiple grants that will allow us to close a positive feedback loop between the plant and its robotic carriage, and we have high expectations for what the future will bring for our floraborgs.
