Urban Heartbeat is a civic art project that explores the real-time temporal identities and pulses of places in cities around the world.
The project is part of the Data Canvas: Sense Your City competition, a DIY sensor network project collecting and visualizing open data. To state the obvious: cities are for people, not machines, not data scientists, not corporations. Places, particularly at the scale of a neighborhood, are hard to quantify. Places are temporal and dynamic, changing dramatically over the course of a day, week, month, and year.
The primary goal of Urban Heartbeat is to empower citizens to see and understand civic data in the form of information-rich, yet visually simplified mediums.
Check out the project here, which won a grand prize in the Sense Your City competition:
Sense Your City built 14 environmental sensor nodes in 7 cities. Each sensor captures data about air quality, dust, light, sound, temperature, and humidity. One goal of our project was to explore real-time data collection in a way that makes the process more transparent and intuitive to all citizens. To achieve this, we decided to express different types of environmental data in different mediums, including light, sound, and visual design.
We conducted several experiments based on different environmental factors, such as pollution, noise, dust, and light. Each experiment explores a different factor using a different medium or visualization technique. We then incorporated these visualizations into a single interface that gives a hyper-local, real-time snapshot of a place. The user can also view multiple places at once, to compare their current pulses.
Noise for the current place is expressed as an audible heartbeat and sound visualization. The frequency and loudness of the heartbeat represent the current sound level.
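As a rough sketch of this mapping (not our actual implementation; the 0–100 sensor range and 40–140 BPM band below are illustrative assumptions):

```python
def heartbeat_params(reading, min_bpm=40.0, max_bpm=140.0):
    """Map a raw sound reading (assumed 0-100) to a heartbeat rate and gain."""
    level = max(0.0, min(reading, 100.0)) / 100.0  # clamp and normalize to 0-1
    bpm = min_bpm + level * (max_bpm - min_bpm)    # louder place -> faster pulse
    gain = 0.2 + 0.8 * level                       # louder place -> louder beat
    return bpm, gain

# A quiet street idles near the resting rate; a noisy one races:
print(heartbeat_params(5))    # gentle, slow pulse
print(heartbeat_params(95))   # fast, loud pulse
```

Clamping keeps a sensor spike or dropout from producing an impossible heartbeat, and the nonzero gain floor keeps the pulse faintly audible even in very quiet places.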
Pollution data describes potentially harmful gases near the sensor, including smoke, carbon monoxide, and ethanol.
Pollution is visualized as an animated cloud of pollutants. When levels are high, the cloud becomes greener and more opaque.
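The color shift can be sketched as a simple interpolation (the value range and color endpoints here are assumptions, not our sensor's calibration):

```python
def cloud_rgba(pollution, max_level=300.0):
    """Map a pollution reading to an (r, g, b, alpha) color for the cloud.

    Higher readings push the green channel up and raise opacity, so a
    heavily polluted place reads as a dense green haze.
    """
    t = max(0.0, min(pollution / max_level, 1.0))  # normalized 0-1
    green = int(120 + t * 135)                     # 120 (faint) -> 255 (vivid)
    alpha = round(0.1 + 0.7 * t, 2)                # 0.1 (wispy) -> 0.8 (dense)
    return (90, green, 90, alpha)
```

Keeping the red and blue channels fixed means only the green intensity and opacity vary, so the cloud changes mood without changing hue.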
Light data reports the current illuminance (lux) captured by the sensor.
High and low light are visualized by modifying the exposure and brightness of the map for the selected place.
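Because illuminance spans several orders of magnitude (moonlight is around 1 lux, direct sun around 100,000), a logarithmic mapping is a natural choice. A sketch, with assumed bounds:

```python
import math

def map_brightness(lux, lux_min=1.0, lux_max=100000.0):
    """Map a lux reading to a 0-1 brightness factor for the map tiles.

    Log scale, since perceived brightness tracks orders of magnitude
    rather than raw lux. Bounds are illustrative assumptions.
    """
    lux = max(lux_min, min(lux, lux_max))  # clamp to the displayable range
    return math.log(lux / lux_min) / math.log(lux_max / lux_min)
```

A linear scale would leave the map nearly black for everything except full daylight; the log scale keeps dawn, dusk, and indoor light visually distinct.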
Dust data reports the concentration of particulate matter (PM) near the sensor.
Dust is visualized as a graph of particles that appear to emanate from the sensor.
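The particle graph boils down to choosing how many particles to animate for a given reading. A minimal sketch, where the scaling factor and cap are assumptions rather than our tuned values:

```python
def particle_count(pm, particles_per_unit=3, max_particles=400):
    """Map a PM concentration (e.g. in ug/m^3) to a number of animated
    particles emanating from the sensor marker.

    The cap keeps extreme readings from overwhelming the animation
    (and the browser) while still reading as "very dusty".
    """
    return min(int(pm * particles_per_unit), max_particles)
```

Capping the count is a practical design choice: beyond a certain density the visual difference is imperceptible, but the rendering cost keeps growing.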
The project is currently on exhibition in Geneva. The project has been covered by publications including Fast Company, Next City, and CityLab.
Project Keywords: Urban Informatics, Civic Hacking, Smart Cities, Internet of Things, Physical Computing, Data Visualization, User Interaction Design