Cross-supervised synthesis of web-crawlers

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

A web-crawler is a program that automatically and systematically follows the links of a website and extracts information from its pages. Because websites use different formats, the crawling scheme can differ dramatically from site to site. Manually customizing a crawler for each specific site is time-consuming and error-prone. Furthermore, because sites periodically change their format and presentation, crawling schemes have to be manually updated and adjusted. In this paper, we present a technique for automatic synthesis of web-crawlers from examples. The main idea is to use hand-crafted (possibly partial) crawlers for some websites as the basis for crawling other sites that contain the same kind of information. Technically, we use the data on one site to identify data on another site. We then use the identified data to learn the website structure and synthesize an appropriate extraction scheme. We iterate this process, as synthesized extraction schemes yield additional data that is used to re-learn the website structure. We implemented our approach and automatically synthesized 30 crawlers for websites from nine different categories: books, TVs, conferences, universities, cameras, phones, movies, songs, and hotels.
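To make the cross-supervision loop in the abstract concrete, the sketch below shows one way such an iteration could look in Python. It is a simplified illustration, not the paper's actual algorithm: the helper names (css_path, locate_known_values, synthesize_extractor, cross_supervise), the use of BeautifulSoup, and the choice of "most frequent tag/class path" as the extraction scheme are all assumptions made for this example.

```python
# Minimal sketch of a cross-supervision loop, assuming a simplified
# "most frequent path" extraction scheme. Hypothetical helper names;
# requires beautifulsoup4 (pip install beautifulsoup4).
from collections import Counter
from bs4 import BeautifulSoup


def css_path(node):
    """Return a coarse tag/class path from the document root to `node`."""
    parts = []
    while node is not None and node.name not in (None, "[document]"):
        cls = ".".join(node.get("class", []))
        parts.append(node.name + ("." + cls if cls else ""))
        node = node.parent
    return " > ".join(reversed(parts))


def locate_known_values(pages_html, known_values):
    """Step 1: find where data already known from another site's crawler
    appears in the new site's pages, and count the paths it occurs at."""
    paths = Counter()
    for html in pages_html:
        soup = BeautifulSoup(html, "html.parser")
        for value in known_values:
            for text in soup.find_all(string=lambda s: s and s.strip() == value):
                paths[css_path(text.parent)] += 1
    return paths


def synthesize_extractor(paths):
    """Step 2: pick the most frequent path as the extraction scheme."""
    if not paths:
        return None
    best_path, _ = paths.most_common(1)[0]
    return best_path


def extract(pages_html, selector_path):
    """Step 3: apply the synthesized scheme to harvest new values."""
    found = set()
    for html in pages_html:
        soup = BeautifulSoup(html, "html.parser")
        for node in soup.find_all(True):
            if css_path(node) == selector_path:
                found.add(node.get_text(strip=True))
    return found


def cross_supervise(pages_html, seed_values, max_rounds=5):
    """Iterate: newly extracted values become supervision for the next
    round, until no new data is discovered (a fixpoint) or a bound is hit."""
    known = set(seed_values)
    for _ in range(max_rounds):
        selector = synthesize_extractor(locate_known_values(pages_html, known))
        if selector is None:
            break
        new_values = extract(pages_html, selector) - known
        if not new_values:
            break
        known |= new_values
    return known
```

For example, book titles extracted by a hand-crafted crawler for one bookstore would serve as `seed_values` when targeting a second bookstore; each round both enlarges the known data and refines the inferred extraction scheme.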

Original language: English
Title of host publication: Proceedings - 2016 IEEE/ACM 38th IEEE International Conference on Software Engineering Companion, ICSE 2016
Publisher: IEEE Computer Society
Pages: 368-379
Number of pages: 12
ISBN (Electronic): 9781450339001, 9781450342056
DOIs
State: Published - 14 May 2016
Event: 2016 IEEE/ACM 38th IEEE International Conference on Software Engineering, ICSE 2016 - Austin, United States
Duration: 14 May 2016 - 22 May 2016

Publication series

Name: Proceedings - International Conference on Software Engineering
Volume: 14-22-May-2016

Conference

Conference: 2016 IEEE/ACM 38th IEEE International Conference on Software Engineering, ICSE 2016
Country/Territory: United States
City: Austin
Period: 14/05/16 - 22/05/16

All Science Journal Classification (ASJC) codes

  • Software
