Article

An architecture for SCS: A specialized web crawler on the topic of security

Proceedings of the American Society for Information Science and Technology (2004)
DOI: 10.1002/meet.1450410138

Abstract

Mining for correct and relevant information on the World Wide Web is a difficult task handled by Web crawlers. This study outlines the components of a specialized crawler on the topic of security (SCS) that makes heavy use of artificial neural networks and rule-based expert systems to achieve successful focused crawling on the topic of security. SCS is designed to find, index, and follow updates to Web pages of interest, and it proposes new approaches for reaching relevant pages that might remain hidden from other crawling approaches. SCS consists of four new page explorers, a database of relevant pages, a relevance evaluator using artificial neural networks, and an updater using rule-based expert systems. SCS is a multi-threaded, multi-object Java Applet and Application combination with embedded SQL and PHP elements, and its expandable, modular structure allows it to run on a single machine or across multiple machines through parallel processing.
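
As a rough illustration of the component layout the abstract describes (page explorers feeding a frontier, a relevance evaluator scoring fetched pages, a store of relevant pages for an updater to revisit), a minimal Java sketch follows. The class and interface names, the keyword heuristic standing in for the trained neural network, the stubbed fetch, and the seed URL are all illustrative assumptions, not the authors' implementation.

import java.util.ArrayDeque;
import java.util.HashSet;
import java.util.Queue;
import java.util.Set;

// Hypothetical sketch of the SCS component layout: explorers feed candidate
// URLs into a frontier, a relevance evaluator (an ANN in SCS; a keyword
// heuristic here) scores each fetched page, and relevant pages are stored
// for a separate updater to re-visit later.
public class FocusedCrawlerSketch {

    /** Stand-in for the ANN-based relevance evaluator. */
    interface RelevanceEvaluator {
        double score(String pageText);
    }

    /** Stand-in for the database of relevant pages. */
    interface RelevantPageStore {
        void save(String url, double score);
    }

    public static void main(String[] args) {
        Queue<String> frontier = new ArrayDeque<>();
        Set<String> visited = new HashSet<>();
        frontier.add("https://example.org/security"); // placeholder seed URL

        // Trivial keyword heuristic in place of the trained neural network.
        RelevanceEvaluator evaluator = text ->
                text.toLowerCase().contains("security") ? 1.0 : 0.0;
        RelevantPageStore store = (url, score) ->
                System.out.printf("relevant: %s (score %.2f)%n", url, score);

        while (!frontier.isEmpty()) {
            String url = frontier.poll();
            if (!visited.add(url)) continue;   // skip already-seen URLs

            String pageText = fetch(url);      // an explorer fetches the page
            double score = evaluator.score(pageText);
            if (score >= 0.5) {
                store.save(url, score);        // index for the updater
                // In SCS the explorers would also extract and enqueue out-links here.
            }
        }
    }

    // Placeholder fetch; a real explorer would issue an HTTP request.
    private static String fetch(String url) {
        return "stub page text about security for " + url;
    }
}

The interfaces keep the evaluator and the page store pluggable, which mirrors the modular structure the abstract claims allows SCS to scale out across machines.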
