{"id":279642,"date":"2016-08-18T17:32:38","date_gmt":"2016-08-19T00:32:38","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-project&p=279642"},"modified":"2020-11-12T13:44:34","modified_gmt":"2020-11-12T21:44:34","slug":"project-florence","status":"publish","type":"msr-project","link":"https:\/\/www.microsoft.com\/en-us\/research\/project\/project-florence\/","title":{"rendered":"Project Florence"},"content":{"rendered":"

A Plant to Human Experience

If plants could talk to us, what would they say? Equally important, how might they respond to us if we could converse with them? How might these conversations evolve and expand our ability to better relate to the Natural World?

Project Florence is an artistic representation of a Plant-Human Interface Experience built on top of a scientific analysis of the plant and its environment. Paired with the ability to receive human input, the plant can return a response, creating a two-way conversational experience. Combining biology, natural language research, design, and engineering, we have created an instantiation of a plant-to-human interface through the power of language. Project Florence enables people to converse with a plant by translating the sentiment of their text into a light frequency the plant can recognize and respond to.

[Image: Using light frequencies to drive action within the plant]

[Image: Silver/AgCl electrodes for measuring the cell membrane action potential]

When you visit Project Florence, you begin by typing a message to it on a keyboard connected to the computer that monitors the plant ecosystem. What you type is analyzed by our Natural Language Processing components to examine the sentiment and the sentence structure, and the result is translated into stimuli (in this case, different light frequencies and durations) that the plant can understand and react to. The resulting signals modulate a light source that projects either a red or blue light spectrum based on that sentiment. For positive sentiment, we flash a red-light spectrum that encourages plant cells to grow. For negative sentiment, we project a blue-light spectrum that in effect turns off the cells' ability to expand. We leverage the plant's ability to respond to different light properties to deliver different sentiments and trigger a plant response. Based on the "mood" of the plant (derived from the various sensors in the enclosure) and the content of the visitor's message, a response is generated (again using Natural Language Processing), and that message is printed and displayed.
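To make the pipeline concrete, the sketch below illustrates the sentiment-to-light mapping described above. The keyword-based scorer, the thresholds, and the duration scaling are stand-ins for illustration only; the one detail taken from the description itself is the red-for-positive, blue-for-negative rule.

```python
# Hypothetical sketch of the sentiment-to-light mapping described above.
# The real project uses Microsoft's NLP components; the tiny keyword-based
# scorer below is only a stand-in so the sketch runs end to end.

POSITIVE = {"love", "happy", "beautiful", "thank", "grow"}
NEGATIVE = {"hate", "sad", "ugly", "angry", "wilt"}

def analyze_sentiment(message: str) -> float:
    """Return a crude sentiment score in [-1.0, 1.0] (placeholder for the NLP pipeline)."""
    words = message.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return max(-1.0, min(1.0, score / max(len(words), 1)))

def sentiment_to_light(message: str) -> dict:
    """Map message sentiment onto the light stimulus described in the text."""
    score = analyze_sentiment(message)
    if score >= 0:
        spectrum = "red"   # positive sentiment: red light encourages cell growth
    else:
        spectrum = "blue"  # negative sentiment: blue light suppresses cell expansion
    duration_s = 5 + 10 * abs(score)  # duration scaling is an assumed detail
    return {"spectrum": spectrum, "duration_s": duration_s}

print(sentiment_to_light("I love how beautiful you look today"))
```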

[Image: Natural Language Algorithm used to determine sentiment and semantic content]

[Image: The electrical signal generated by cell depolarization through ion fluxes]

The opportunity to interact and communicate with our natural environment lets us look beyond the physical properties of plants and gives us access to their processes and functions. Within the Florence ecosystem, various sensors measure the environmental factors that define Florence's "mood": soil moisture, air humidity, carbon monoxide, and air temperature. These sensor values are collected and streamed to the Azure Cloud service. The Azure service combines this plant-mood-affecting information with the Natural Language Processing of the human conversation as inputs to the formulation of Florence's conversational response.
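The sketch below shows one way the enclosure's readings could be packaged and streamed to the cloud. The page says only that values go to the Azure Cloud service; the use of the Azure IoT Hub device SDK, the field names, and the sampling interval are assumptions made for this example.

```python
# Minimal sketch, assuming Azure IoT Hub as the ingestion point (the page
# only says "Azure Cloud service"). Field names and units are illustrative.
import json
import time

from azure.iot.device import IoTHubDeviceClient, Message  # assumed SDK choice

CONNECTION_STRING = "<device connection string>"  # placeholder

def read_sensors() -> dict:
    """Placeholder for the enclosure's sensor drivers."""
    return {
        "soil_moisture": 0.42,       # fraction of saturation (illustrative)
        "air_humidity": 55.0,        # percent relative humidity
        "carbon_monoxide_ppm": 0.3,
        "air_temperature_c": 22.5,
        "timestamp": time.time(),
    }

def stream_readings(interval_s: float = 60.0) -> None:
    """Send one telemetry message per interval until interrupted."""
    client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
    try:
        while True:
            client.send_message(Message(json.dumps(read_sensors())))
            time.sleep(interval_s)
    finally:
        client.shutdown()
```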

[Image: An 8-channel board for measuring bio-electrical signals]

[Image: Plant response generated by signal translation]

The 'moods' of Florence are defined by common knowledge of plant physiology and the plant's reaction to environmental conditions such as weather, drought, and its internal clock (circadian rhythm). The different conditions are then mapped onto common human reactions to thirst or tiredness and onto phrases and idioms used in farming and gardening (a simplified sketch of this mapping follows below). We also developed a set of tools to help us examine the reactions of plants to the different light frequencies and durations proposed in Project Florence. These tools enabled us to measure variation potential changes within the plant by inserting needle electrodes.
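A minimal sketch of such a mapping is shown here. The thresholds, mood labels, and phrases are invented for illustration; the page states only that environmental conditions are mapped onto human reactions such as thirst or tiredness and onto farming and gardening idioms.

```python
# Illustrative mood mapping; the thresholds, mood labels, and phrases below
# are invented for this sketch, not taken from the project.
from datetime import datetime

def derive_mood(soil_moisture, air_temperature_c, hour=None):
    """Map sensed conditions onto a human-readable 'mood' label."""
    hour = datetime.now().hour if hour is None else hour
    if soil_moisture < 0.2:
        return "thirsty"    # dry soil mapped onto human thirst
    if hour < 6 or hour >= 22:
        return "tired"      # circadian rhythm mapped onto tiredness
    if air_temperature_c > 30:
        return "wilting"    # heat stress
    return "content"

# Gardening-flavored phrase templates keyed by mood (illustrative only).
PHRASES = {
    "thirsty": "I'm parched; a little water would not go amiss.",
    "tired": "The sun has set on me; let my leaves rest.",
    "wilting": "It's sweltering in here; I could use some shade.",
    "content": "Everything is coming up roses today.",
}

print(PHRASES[derive_mood(soil_moisture=0.1, air_temperature_c=24)])
```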

Project Florence is an artistic interpretation of a future world where nature and humans can communicate and co-exist in greater harmony. Enabling people to "talk" to a plant and receive a response opens their minds to this concept. Sometimes Florence can be quite cranky and unpleasant if she is too dry or does not care for your input; this, too, lends the plant a sense of personality that people would not normally consider. The approach could lead to novel applications in agriculture, such as detecting plant infestation at an earlier stage and thereby requiring fewer pesticides. It can also expand our ability to monitor our natural environment and crop fields, to better understand the problems we face and to build more sustainable agricultural systems.

Future opportunities:

  • Education/Research
  • Environmental monitoring
  • Gene expression
  • Interaction with natural environment
  • Studying adaptation of plants at different locations

     <\/p>\n","protected":false},"excerpt":{"rendered":"

    Project Florence is an artistic representation of a Plant-Human Interface Experience that is built on top of a scientific analysis of the plant and its environment. Paired with the ability to receive human input, the plant can return a response, thus promoting a two-way conversational experience.<\/p>\n","protected":false},"featured_media":581275,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","footnotes":""},"research-area":[198583],"msr-locale":[268875],"msr-impact-theme":[],"msr-pillar":[],"class_list":["post-279642","msr-project","type-msr-project","status-publish","has-post-thumbnail","hentry","msr-research-area-ecology-environment","msr-locale-en_us","msr-archive-status-active"],"msr_project_start":"","related-publications":[448878],"related-downloads":[],"related-videos":[],"related-groups":[],"related-events":[],"related-opportunities":[],"related-posts":[],"related-articles":[],"tab-content":[],"slides":[],"related-researchers":[{"type":"user_nicename","display_name":"Jonathan Lester","user_id":32333,"people_section":"Group 1","alias":"jlester"},{"type":"user_nicename","display_name":"Chris Quirk","user_id":31430,"people_section":"Group 1","alias":"chrisq"},{"type":"user_nicename","display_name":"Asta Roseway","user_id":31130,"people_section":"Group 1","alias":"astar"},{"type":"guest","display_name":"Helene Steiner","user_id":582847,"people_section":"Group 1","alias":""}],"msr_research_lab":[],"msr_impact_theme":[],"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/279642"}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-project"}],"version-history":[{"count":68,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/279642\/revisions"}],"predecessor-version":[{"id":618447,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/279642\/revisions\/618447"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media\/581275"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=279642"}],"wp:term":[{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=279642"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=279642"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=279642"},{"taxonomy":"msr-pillar","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-pillar?post=279642"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}