Saturday, December 8, 2007
Feedback on the collective moisture of the community plants would also be good.
I might also suggest negative feedback if any plant is extremely dry for a whole week, such as a mean carnivorous digimal that damages and/or eats other digimals.
Our hopes for a carbon dioxide sensor have been revived by Six, who believes that one of his former classmates did a project with $200 CO2 sensors and that sealing the room wasn't necessary. He's trying to track her down now for more info. I would love to have one more biochemical sensor input to reinforce the natural ecology aspects of the environment.
Also, some folks have brought up concerns that the touchscreen will dominate the room. But Silvia and Bonnie bought some hardware so that we can hang plants on the walls on either side of the touchscreen to help balance things out.
Wednesday, December 5, 2007
Tuesday, November 27, 2007
Charlotte and I had discussed my creating an interactive visualization (and sonification) of the sensor data being collected from the garden. The idea is for there to be a wall-mounted touchscreen display that hangs over the plants and complements the current informational display with something more representational and ambient. This display will allow for different ways to visualize the current and past state of the garden; encourage interaction with the digital portion of the garden's ecology; become another facet of the aesthetic experience of the garden; and can act to communicate garden state externally - within the building and via the internet.
I got some sample data from Bryant that they collected during the pilot study. Here is a simple bar chart visualization of the readings over time. I'm just parsing the data file and doing a little bit of normalization for the display.
There are 812 samples in the dataset for (5) soil moisture sensors plus temperature and humidity sensors for the room. Bryant said it was roughly 1.5 days of data with 20 seconds between each sample. However, with 812 samples it seems more like a sample every couple of minutes (1.5 days / 812 samples is about 160 seconds), or else the data covers a shorter period of time. On this first reading of the data, it appears that the room temperature and humidity have an inverse relationship to one another... which makes sense, I think. Yes? The climate fluctuations are likely between daytime and nighttime settings of the climate control. With the soil moisture, a higher reading means drier soil. I believe this dataset starts with the sensors not in the soil (which explains the really high readings), and then we see a watering and a slow taper after the watering.
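For the record, the parsing and normalization are nothing fancy. Here is a minimal sketch of the kind of code involved; the comma-separated, seven-column file format (five moisture readings, then temperature, then humidity) is my assumption about the pilot-study dump, not its actual layout.

```python
def parse_samples(lines):
    """Parse raw sensor lines into 7-value float tuples:
    5 soil moisture readings + room temperature + room humidity.
    Lines that don't have exactly 7 fields are skipped."""
    samples = []
    for line in lines:
        fields = line.strip().split(",")
        if len(fields) == 7:
            samples.append(tuple(float(f) for f in fields))
    return samples

def normalize(samples):
    """Scale each sensor channel to the 0..1 range over the whole
    dataset, so the bar chart can show all channels on one axis."""
    columns = list(zip(*samples))
    ranges = [(min(c), max(c)) for c in columns]
    return [
        tuple((v - lo) / (hi - lo) if hi > lo else 0.0
              for v, (lo, hi) in zip(sample, ranges))
        for sample in samples
    ]
```

Per-dataset min/max scaling like this is fine for a quick chart, but it won't compare across sensors until we do real calibration (see the to-do list below).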
As a next step, I think we need to capture some more data and do these things:
- Timestamp the samples so we can get a more accurate sense of time
- Figure out how to normalize and calibrate the readings between the different moisture sensors as best as possible
- Begin archiving sensor data in a database. Will be a tradeoff between sample frequency and storage space.
- Begin simple visualization of realtime data so we can get a better sense of how the data changes in real time.
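The timestamping and archiving steps above could start as small as this sketch, assuming SQLite as the database (the table and column names are placeholders):

```python
import sqlite3
import time

def open_archive(path="garden.db"):
    """Open (or create) the sensor archive database."""
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS readings (
                      ts REAL,       -- Unix timestamp of the batch
                      sensor TEXT,   -- e.g. 'moisture1', 'temp'
                      value REAL)""")
    return db

def archive(db, readings):
    """Store one batch of (sensor, value) pairs with a shared timestamp."""
    ts = time.time()
    db.executemany("INSERT INTO readings VALUES (?, ?, ?)",
                   [(ts, name, value) for name, value in readings])
    db.commit()
```

On the frequency/storage tradeoff: archiving every sample forever will add up, so we might store full resolution for a rolling window and downsample (say, one averaged reading per few minutes) for the long-term record.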
Other sensor data that we would like to capture:
- Light readings, both inside and outside the building. Outside readings will reflect the sun patterns and help bring an outdoor context. Indoor light readings might help infer activity in the room. The interaction between these two light sources can add interest to the visualizations.
- Motion sensor or webcam to discern movement and activity in the room. Could be one sensor that just reflects presence in the room, or with more work could reflect movement in different zones, or reflect when people enter vs. leave the room.
- Microphone to gauge sound. Do not want to record the actual audio (no surveillance!) but just to record sound levels to also help infer when there is activity in the room. For instance, if there is motion in room but no sound then someone is present but not talking. Sustained sound level might reflect social activity of some sort.
- Silvia and I had talked about trying to measure the vibrations of the plants (she was interested in touch between the people and plants) and thought about using piezoelectric for this. Any kind of microscale vibrations of the plants would be cool. Also have read something about microvoltage fluctuations in plants but need to do more research.
- Oh a new one... how about seismic? I know Calit2 has seismic sensors we could probably tap into and perhaps so does Bren Hall? I think it is interesting to consider the way that earth and building vibrations might relate to the vibes of our little micro ecology :)
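The motion-plus-sound inference described in the list above could be sketched like this (the threshold and the labels are invented for illustration, not measured):

```python
def infer_activity(motion: bool, sound_level: float,
                   quiet_threshold: float = 0.1) -> str:
    """Combine a presence sensor and a sound-level reading into a
    rough activity label. No audio is recorded, only its level."""
    if not motion and sound_level < quiet_threshold:
        return "empty"
    if motion and sound_level < quiet_threshold:
        return "present, quiet"    # someone there but not talking
    if motion:
        return "active, talking"   # sustained sound + motion: social activity
    return "sound only"            # e.g. hallway noise bleeding in
```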
Phase One (real time display)
Display the sensor information in real time through visuals and sound. The sensor acquisition computer will relay the readings via a network socket to the client application running on the touchscreen display. The default screen will give an overview of the state of the garden (plant moisture, light reading, motion, sound). Interaction with the display allows for more detailed information on a plant-by-plant or sensor-by-sensor basis. "Events" can also be reflected through sounds, graphics or messaging on the display. For instance, we can discern a watering event by the sudden change in moisture level of a plant sensor... the plant representation can reflect this watering and perhaps a sound or a message can be delivered. (also see below for ideas on how to use this event to collect further data from the people in the garden) Other events could be someone entering the room or turning on/off the lights... perhaps a welcoming message or a change of state of the visualization.
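The watering-event detection mentioned above might look like this minimal sketch. Remember that higher readings mean drier soil, so a watering shows up as a sudden drop; the drop threshold here is a placeholder that would need calibrating against real data.

```python
def watering_events(readings, drop_threshold=100.0):
    """Yield the indices in a moisture-reading stream where the value
    drops sharply between consecutive samples, i.e. a likely watering.
    In the live system the readings would arrive over the network
    socket from the acquisition computer."""
    previous = None
    for i, current in enumerate(readings):
        if previous is not None and (previous - current) > drop_threshold:
            yield i
        previous = current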
Integration of historic sensor data. Allow for interaction with the display to provide the browsing of historic data through timeline-based visualizations. Also (as above) allow the visualization of historic data for a specific plant or sensor. View animations depicting the garden state over the course of a day, week, month, etc. When were plants added, "friends" made, and other events can be included.
The display could become an interesting part of the social component of the project. It can reflect current social relationships that exist in the system... such as plant sponsorship, friends, etc. As we build more social features into the system we could use the display to facilitate. For instance, we have talked about wanting to display garden visits and maybe tying events to people. While we may want to be careful about making the system too surveillant, we could collect voluntary information. For instance, a watering event could thank the user and ask them to identify themselves. We then have some more social information to display and to develop activity patterns and linkages between plants and people.
People. Plants. Technology. Some people are sure to ask "why?" Many of the benefits of the garden exist simply between the plants and the people... relaxation, becoming more familiar with natural or "sustainable" ideas and systems, a place to gather, etc. So what is technology's role in this? Does it make the system more sustainable? The CHI paper shows how technology can be used to help engage gardeners and get them involved, with the garden and with each other. Technology brings the garden to the gardeners when they can't be there in person, and helps to bring them back to the garden when they can come. In many ways the technology acts as a link between the people and the garden. In some ways this could be construed as making the system more sustainable, but in general I think we really need to unpack the implications of this term, especially if we are using it as the center point of the rhetoric around the project. If increasing awareness of sustainability is a goal, maybe we should figure out ways to emphasize this in the project in a more overt way.
A Hybrid Ecology
Another way to look at the relationship between plants/people/tech that I'm interested in pursuing with the visualizations is the notion of a digital ecology that exists in symbiosis with the other two components - the human ecology and the garden ecology. Inside the visualization I would like to enable an ecology of digital organisms that have behavior attributes which cause them to interact with the garden in various ways. Some organisms may like light, others vibrations; some are attracted to high moisture while others are repelled. Maybe even some parasitic organisms or some kind of food chain interactions. Gardeners program simple character attributes and release their creatures into the wild of the garden, where their fate depends on interaction with plants and environmental factors and on interaction with other digital organisms (think a simple version of Spore with "real world" influences from a biological system). I think this has potential to be a very cool thing... theoretically and as far as being fun. One possibility is that when someone waters a plant they are presented with the reward of being able to add a new digital creature to the garden. This will allow the digital ecology to grow over time with the garden, provide a personal connectedness to the digimals (digital animals... sorry, just made that up), and could increase awareness of the ideas behind enabling a sustainable ecosystem. I have many more ideas surrounding this, but this blog post is already way too long!
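Just for flavor, a first digimal might be sketched like this. The attribute names and the update rule are entirely invented; this is one toy way the "behavior attributes" idea could start.

```python
class Digimal:
    """A digital organism whose fate depends on garden conditions."""

    def __init__(self, name, likes_light=0.0, likes_moisture=0.0):
        self.name = name
        self.likes_light = likes_light        # -1 repelled .. +1 attracted
        self.likes_moisture = likes_moisture  # same scale
        self.energy = 1.0                     # dies at or below zero

    def step(self, light, moisture):
        """One simulation tick: conditions matching the digimal's
        tastes feed it; a baseline metabolic cost is always paid.
        Returns True if the digimal survives the tick."""
        self.energy += 0.1 * (self.likes_light * light
                              + self.likes_moisture * moisture)
        self.energy -= 0.05
        return self.energy > 0
```

Parasites, food chains, and digimal-vs-digimal interactions would layer on top of a core loop like this one.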
Tuesday, August 14, 2007
Tuesday, July 31, 2007
The first six soil moisture sensors for the Technology Garden are now operational. In order to get them operational we needed to create some electronics to convert the soil resistance between two metal probes to a signal that can be read by the data acquisition device (DAQ). Marcel developed the circuit and soldered it together. (Thanks to the ACE Lab for the use of their oscilloscope!) There are some hobby shops that sell kits to do similar things, but they all came with extraneous features and we saved a lot of money by building our own. Bryant developed all the software to read the output from the DAQ. Pictured are six circuits on one breadboard, sitting on top of the USB DAQ that we purchased from LabJack.
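On the software side, the conversion can be inverted to recover soil resistance from the voltage the DAQ reports. This sketch assumes a simple voltage divider with a known fixed reference resistor; Marcel's actual circuit may well differ, and the component values here are placeholders.

```python
V_SUPPLY = 5.0     # volts across the divider (assumed)
R_REF = 10_000.0   # ohms, fixed reference resistor (assumed)

def soil_resistance(v_measured):
    """Recover probe resistance from the DAQ voltage, assuming the
    divider relation V_measured = V_SUPPLY * R_soil / (R_REF + R_soil).
    Drier soil -> higher resistance -> higher voltage reading."""
    if v_measured >= V_SUPPLY:
        return float("inf")  # open circuit: probes not in soil?
    return R_REF * v_measured / (V_SUPPLY - v_measured)
```

A formula like this (or at least a calibrated lookup) is what we'd need to make readings comparable across the six sensors.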
After we receive our 2nd DAQ, we'll have the rest of our sensors working. Currently we're waiting on a bunch of materials in order to have everything ready for our pilot study which is tentatively scheduled to start on August 20.
Tuesday, July 17, 2007
The Garden office is well situated – at the end of a line of sight in a main corridor on the fifth floor of our beautiful new building, the Donald Bren Hall of Information and Computer Sciences. The office has a pretty window and good light. I pass by 5054 on the way to my own office probably a dozen times a day as I walk around the fifth floor. Thank you David Redmiles, our Department Chair, for seeing the potential to use this office as a community space.
Through one of those gifts from the gods, the talented and creative Charlotte Lee agreed to collaborate on the project. Charlotte has contributed more than I can possibly say, but her blog posts indicate the nature of her resourcefulness, thoughtfulness, and organizational prowess. She is working closely with one of our best undergraduate students, Bryant Hornick, on the software to support the project. I will always have a fond memory of going plant shopping with Charlotte ;)
Silvia Lindtner provided important inspiration as we were designing the project, as did Don Patterson, Bill Tomlinson, Shadi Shariat, Beverly Andrews, Miryha Gould, Eric Kabisch, Jim Doyle, and Marcel Blonk. We hope to fold in further contributions from all as the project proceeds. One of the things I like about the Garden is that its development has been effortlessly interdisciplinary – we have expertise in social science, computer science, the arts, media, facilities, and environmental psychology -- not to mention plants!
Our Dean, Debra Richardson, has provided crucial support and we appreciate the way she views the building as a living space to be worked with and shaped through human activity and creativity.
Lately, I have been thinking about how much the Technology Garden would benefit from an ambient display. I also thought of how cool it would be to use the ambient sound of water flowing to convey the sensor data that we are now working so hard to capture. Perhaps the sound and visual of greater or lesser amounts of water flowing could provide information about soil moisture. After all, if people are chilling out in the TG, which doubles as a lounge, they probably won't want to stay fixated on the display on our monitor. We've got our hands more than full at the moment just trying to get a bare-bones system working, but this is an idea that is worth exploring.
Of course, after doing a search, it turns out that someone has already done something similar. The Datafountain is connected to money currency rates on the internet. Refreshed every five seconds, the fountain displays the Yen, Euro and Dollar (¥€$). This mobile fountain measures 5x4x3 meters. The Datafountain is supposed to show the interconnectedness of currency rates, but I imagine that you'd have to watch the fountain a pretty long time in order to get the message. I'm not troubled that someone has already done something similar, as the goals of the Technology Garden are focused on how we can create awareness of plants and nature, and the technology is just one element of our research. We're interested in what the technology facilitates more so than the technology itself.
Thursday, July 12, 2007
One of the linchpins of our early prototype will be soil moisture sensing. We're still figuring out exactly how we're going to do this, although we are making progress. A big problem with almost all of the soil moisture sensors available is that they are industrial grade. They're big, waterproof, require all sorts of elaborate accessories, and are very expensive. I found a company that creates kits for educational settings, but it comes with one expensive soil moisture sensor and proprietary software that won't work for our application. A couple weeks ago we were just about to purchase some pricey sensors, only to discover at the last minute that they required an additional expensive piece of equipment that would not provide a real-time data feed without the addition of yet another piece of equipment. So we had to go back to the drawing board.
The only significant body of hobbyists dabbling with moisture sensors seems to be weather geeks who are specifically interested in "leaf wetness sensors" to measure dew. It does seem, however, that much of the hardware and software appropriate for that is also appropriate for soil moisture sensors.
Marcel has volunteered to make the sensors himself, based partially on a schematic that Bryant found. It turns out to be pretty easy to make a soil moisture sensor with a bit of electrical engineering know-how and a few dollars. We already have a data acquisition and control device from LabJack. Of course the hard part is getting everything to work together! Now we just have to decide whether we're going to go with the homemade setup or with the Watermark sensor. Probably the decision will be made tomorrow. We also have to decide whether to use the moisture meter from Hobby Boards. We weren't sure if the Hobby Boards moisture meter set would work for us, but I found this paper, Intelligent Plant Monitoring System, by Ya-Shian Li online today. Ya-Shian may have used the moisture meter in conjunction with a homemade sensor. There might be some useful clues here about how to set up our system, and how to write the API at least. As the soil moisture sensing is a means to an end for us, I'm hoping that we will get this part of the system up and running ASAP.
A while ago I ran into this reference to a plant sensor project at MIT's Tangible Bits group. I went back to get more information and I can't seem to turn up anything but this very limited project page (and neither could my friend Mr. Google). Anyone else find anything about this endeavor?
Monday, July 9, 2007
While watching an hour or two of Live Earth, I saw a presentation by TerraCycle on their organic plant food, which is made by feeding premium organic waste to millions of worms. The worm poo is liquefied into plant food and bottled directly in used soda bottles.
At first I was a little skeptical, I mean everyone knows about feeding your household's organic waste to worms to produce rich, nutritious, free compost, right? But then I realized that even though I have friends who do this and even though I know how easy it is, I don't do it myself. Nor am I really motivated to start. (If you are, here's a website that will help you learn more about composting with worms.)
More food for thought (if not for worms): I have Miracle-Gro in my cabinet, which is a synthetic fertilizer with problems of its own, sustainability-wise. Since we're gardening indoors in containers we don't necessarily need to worry about polluting groundwater, but urea is one of Miracle-Gro's main ingredients, and according to Wikipedia urea is produced commercially from synthetic ammonia and carbon dioxide, and the production process sometimes involves petroleum-based products. Maybe I'll get some of that worm poo plant food for the Technology Garden instead. So hey, maybe compost for lazy people isn't such a bad idea after all.
Wednesday, June 27, 2007
I saw an interesting sensor technology in an artwork that creates a connection between a bird and a plant--and also human onlookers. "Essay Concerning Human Understanding" was a live, bi-directional, interactive, telematic, interspecies sonic installation that Eduardo Kac created with Ikuo Nakamura. In this work, a canary dialogues over a regular phone line with a plant (a Philodendron) 600 miles away.
In New York, an electrode was placed on the plant's leaf to sense its microvoltage fluctuations in response to the singing of the bird. The microvoltage fluctuation of the plant was monitored through a Macintosh running software called the Interactive Brain-Wave Analyzer (IBVA). IBVA is a program designed to detect human mental activity, and here it was employed to inspect the vital activity of an organism generally understood as devoid of consciousness. The information coming from the plant was fed into another Macintosh running MAX, which controlled a MIDI sequencer.
The yellow canary was given a very large and comfortable cylindrical white cage, on top of which circuit-boards, a speaker, and a microphone were located. A clear Plexiglas disc separated the canary from this equipment, which was wired to the phone system. Not surprisingly, people altered their behavior when around the bird and around the plant.
Kac says about this work: "This interactive installation is as much about creating art for non-humans as it is about human isolation and loneliness, and about the very possibility of communication. As this piece projects the complexities of electronically mediated human communication over non-human organisms, it surprisingly reveals aspects of our own communicative experience. This interaction is as dynamic and unpredictable as a human dialogue." [Thanks to www.ekac.org]
Now I'm trying to figure out if IBVA is something that could be useful for the Technology Garden. I found IBVA's website and still have no clue as to what it is, what it does, or how it does it. Sadly, they have a terrible FAQ that is a long rant against the evils of greedy marketing and press people. While I sympathize, they really ought to help out their consumers a little here.
I've also found a patent for a device that does something similar. Alas, I have found no actual product.
Thursday, June 21, 2007
Lately I've been thinking and talking about what we're doing with the TechGarden in terms of Human-Plant-Human Interaction. Google yields plenty of results for people-plant and human-plant interactions but, as Paul Dourish pointed out, Google yields nothing for human plant human interaction. Google says: "Your search - "human plant human interaction" - did not match any documents." It would probably be more accurate to call what we're doing Human-Plant-Computer-Human Interaction. Well, that yields a terrifically unwieldy acronym. Maybe it would be good to stick with HPH. Or how about Computer-Human Interaction through Plants (CHIP)?
The Infotropism project, presented at DIS 2004, is an example of what I'm calling HPH/CHIP. The system uses the properties of phototropism to turn a plant into a living display. Phototropism is directional plant growth in which the direction of growth is determined by the direction of the light source. Lights are wired to a recycling bin and a trash bin on either side of the plant. Every time something is placed in a bin, the corresponding lamp emits a burst of light. Thus, the plant acts as a display of human consumption patterns. [Thanks to ::setbang::].
It's very interesting to think of the plant itself as a display. We weren't necessarily focusing on the notions of plants-as-display, although perhaps we were implicitly: over time, the plants show patterns of human care. The plants in our highly artificial environment will only survive with human care and they will only thrive with persistent human care.
Wednesday, June 20, 2007
The new Computer Science Building, Donald Bren Hall, was dedicated yesterday and as part of the dedication there was an open house for the public. Several demonstrations were set up around the building while some projects were represented by posters. Our project had two posters: one in the poster room in the Informatics Department and one in the Technology Garden room.
A couple folks came by to chat in the poster room, but the room, complete with a poster in the hall-facing window, proved to be the big draw. I'm convinced that if the room had been fully set up we would have gotten a big crowd. Almost everyone was very enthusiastic and thought it was a great idea, or was at least intrigued by it. There were a couple folks who just could not take the idea seriously, but I suppose that is inevitable.
I did get a chance to get the patter down for the project. Talking to folks did help me to focus on what was new and important about this project, and what could be misunderstood. A couple folks thought that we were designing sensors, and others thought we were trying to automate plant care. The latter point was especially interesting, because in some ways what we're doing is antithetical to automated plant care. Instead of trying to remove the human element and reduce the dependency on human intervention, we are trying to enhance, and build awareness of, the interactions between plants and people. Hmm. This could be good fodder for the CHI paper!
The Technology Garden has its first signs of life now. Bonnie and I made a trip to Roger's Gardens in Newport Beach to get some beautiful new plants for the Technology Garden. We got a variety of beautiful succulents as well as a bonsai palm and some tropical plants. We were really gravitating towards sculptural plants with beautiful colors and are pleased with what we found. We are off to a great start.
We will need more big leafy plants. I will be heading back to the nursery soon to get some ficus benjamina and other proven air-cleaning plants to bolster the effect on the CO2 and Oxygen levels in the room. I'm trying to get up to speed on my plant knowledge.
In other news, we've officially welcomed Bryant Hornick to the team. Bryant and I will be working closely this summer to design and develop some of the collaborative plant care applications while most of the other folks on the project are out of town. We will also be working to keep the plants happy until the user study gets into full swing--at which time the users will be largely responsible (or not) for keeping the plants happy.
Wednesday, June 6, 2007
The Technology Garden will support dialog and thinking about how humans and plants relate to each other. By involving our institution in the care and observation of a community garden located in an office, we will also explore what role institutions may play in supporting sustainable activities and thinking. We wish to facilitate new forms of awareness and interaction among humans and nature through and with technology. Our goal is not only to bring nature into a working space, but also to establish new forms of understanding of nature and organic planting, of what it means to take care of a plant, and of how that can be explored in a collaborative manner. We will explore how to transform a working environment into a hybrid living space that values not only group collaboration and efficient technology, but also provides an enjoyable place that invites relaxation and promotes health.
- Encourage interaction between humans and nature.
- Promote awareness of the interaction of natural and human processes.
- Explore how technology can encourage relationship building through common activities.
- Encourage dialog on sustainability and sustainable practices.
- Provide a “place with a purpose to meet and relax” for both visitors and residents.
- Create a form of sociality that extends beyond the immediate space of an office or a hallway through visualizations that support garden awareness.
Community gardens are often run by non-profits that lend small plots of land to individuals. The location and limited space of office 5054 would prohibit the creation and allotment of garden plots; however, the allotment of pots and potting soil is quite feasible.
The Technology Garden will serve as a communal lounge or break room, thus providing benefits and enjoyment to all Informatics members and visitors, not just those who are actually participating in plant growing. Faculty, students, and visitors can each plant their own individual plant, collaboratively taking care of and planting an organic garden.
The room will be equipped with seed packets and canisters containing soil. Each seed packet is equipped with an RFID tag. When a packet is brought in proximity to an RFID reader, information about how to plant the specific seed will be shown on a display.
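The RFID lookup behind that display could be as simple as a table keyed by tag ID. The tag IDs and planting texts here are placeholders; a real reader would deliver the ID over serial or USB.

```python
# Placeholder table mapping seed-packet tag IDs to planting instructions.
PLANTING_INFO = {
    "04:A3:1F": "Basil: sow 1/4 inch deep, keep soil moist, bright light.",
    "04:B7:22": "Jade: plant shallow, water sparingly, indirect sun.",
}

def on_tag_read(tag_id):
    """Return the text to show on the display for a scanned packet."""
    return PLANTING_INFO.get(tag_id, "Unknown seed packet - ask a gardener!")
```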
The Technology Garden will always be there during work hours, and will have the added benefit of allowing community members to check online on the status of not only their own plants, but also those of others.
We would also like to incorporate a fountain and Ecosphere art to continue the theme of the mediation of human experience through technology and nature.
Using CO2 sensors (and possibly also oxygen sensors), we can show how CO2 goes up from respiration when people enter and remain in the space, but goes down as plants take up CO2 from the air and fix it in sugars for their growth. Additionally, we can use oxygen (O2) sensors to show how oxygen levels change over time. Once we understand how to set up the demonstration cost-effectively, it could be a model for classrooms and museums. The goal of the demonstration is to reveal the close relationship between people and plants.
Soil moisture sensors will allow measurement of the level of dryness of each plant and of the garden as a whole, providing visual feedback to gardeners and the greater community about the status of the individual plants and the garden overall.
Supporting Awareness and Collaboration
While the garden remains in the room, the activity of growing and caring for plants expands into the whole building. A web cam captures people planting the seeds and taking care of their growing plants. Pictures randomly taken by the web cam will be sent to small PlayStation Portable (PSP) screens that are distributed throughout UC Irvine’s Donald Bren Hall.
Visual representations of individual plants and the garden as a whole will enable easy monitoring of plants and the whole garden. Visualizations will be accessible both in the Technology Garden and remotely.
A blog will help community members and other interested parties keep up-to-date with regard to relevant news (e.g., changes in the environment or regulations) or to share helpful tips.