{"id":306086,"date":"2010-07-19T09:00:54","date_gmt":"2010-07-19T16:00:54","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?p=306086"},"modified":"2016-10-15T18:41:14","modified_gmt":"2016-10-16T01:41:14","slug":"quest-quality-searches","status":"publish","type":"post","link":"https:\/\/www.microsoft.com\/en-us\/research\/blog\/quest-quality-searches\/","title":{"rendered":"The Quest for Quality Searches"},"content":{"rendered":"
By Janie Chang, Writer, Microsoft Research<\/em><\/p>\n When the Association for Computing Machinery\u2019s (ACM\u2019s) Special Interest Group on Information Retrieval (SIGIR<\/a>) holds a conference, participants can find it difficult to decide which sessions to attend, because creating easy, effective search experiences these days involves challenges as diverse as multimedia, social media, relevance judgments, unstructured searches, and massive scalability. The 33rd annual ACM SIGIR Conference<\/a>, being held at the University of Geneva from July 19-23, features a busy schedule of tutorials, workshops, and presentations of research papers that explore these topics.<\/p>\n The increasingly multidisciplinary nature of the field is reflected in the eighty-seven papers accepted for this year\u2019s conference. Fifteen submissions from Microsoft alone represent ten groups from four research facilities\u2014Microsoft Research Redmond<\/a>, Microsoft Research Cambridge<\/a>, Microsoft Research Asia<\/a>, and Microsoft Research India<\/a>\u2014as well as the Internet Services Research Center<\/a> and Bing<\/a>.<\/p>\n Image Search by Concept Map<\/em><\/a>\u2014by Hao Xu of the University of Science and Technology of China and Jingdong Wang<\/a>, Xian-Sheng Hua, and Shipeng Li of Microsoft Research Asia\u2014is an example of how the use of multimedia has increased the complexity of information-retrieval problems. Digital images are, after text, the second-most prevalent medium on the Web. The challenge for these researchers was to devise a more intuitive way for users to query for images.<\/p>\n Hua, lead researcher with the Media Computing Group<\/a>, wants to overcome the limitations of existing image-search engines, which depend on the metadata of web images\u2014metadata that rarely contains spatial information. 
Although the Image Search by Color Sketch<\/a> feature in Bing addresses spatial relationships between colors in an image, his team wanted users to be able to convey semantic intent as well.<\/p>\n \u201cThis is a totally new way of searching web images,\u201d Hua says, \u201cwhen compared to text-box-based image searches. In this model, we allow users to specify the spatial positions of the query terms. The typed keywords indicate the desired visual concepts, or objects, within the image. The spatial relation of the keywords indicates the desired layout of the visual contents. We translate from a concept map to a visual instance map.\u201d<\/p>\n
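Hua's description of a concept-map query (keywords placed at desired spatial positions, matched against the layout of objects in candidate images) can be illustrated with a toy scoring function. The sketch below is purely hypothetical and is not the algorithm from the paper: the query format, the `iou` helper, the `concept_map_score` function, and the assumption that candidate images come pre-tagged with object bounding boxes are all invented here for illustration.

```python
# Hypothetical illustration of concept-map image ranking (NOT the method from
# the SIGIR paper): a query is a list of (keyword, box) pairs placed on a
# normalized unit canvas, and each candidate image is assumed to carry
# pre-detected object regions. Images score higher when they contain the
# requested concepts in roughly the requested layout.

def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    if inter == 0.0:
        return 0.0
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def concept_map_score(query, image_objects):
    """Average, over query concepts, of the best spatial overlap with a
    matching object region in the image; 0 for concepts the image lacks."""
    if not query:
        return 0.0
    total = 0.0
    for keyword, qbox in query:
        candidates = image_objects.get(keyword, [])
        total += max((iou(qbox, box) for box in candidates), default=0.0)
    return total / len(query)

# A "sky above beach" concept map: sky in the top band, beach at the bottom.
query = [("sky", (0.0, 0.0, 1.0, 0.4)), ("beach", (0.0, 0.6, 1.0, 1.0))]

# Two candidate images with (assumed) pre-detected object regions.
beach_photo = {"sky": [(0.0, 0.0, 1.0, 0.4)], "beach": [(0.0, 0.6, 1.0, 1.0)]}
city_photo = {"sky": [(0.0, 0.0, 1.0, 0.2)], "building": [(0.2, 0.2, 0.8, 1.0)]}

# The beach photo matches both concepts and their layout, so it ranks higher.
assert concept_map_score(query, beach_photo) > concept_map_score(query, city_photo)
```

The spatial-overlap term is what distinguishes this from a plain keyword match: the city photo does contain "sky", but in the wrong band of the canvas, so it is penalized relative to an image whose layout matches the user's drawing.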