Enhancing RFID data quality and reliability

Mahdin, Hairulnizam (2012) Enhancing RFID data quality and reliability. PhD thesis, Deakin University.


Abstract

Radio Frequency Identification (RFID) is gaining significant momentum as the preferred choice for automatic identification and data collection. RFID technology has been increasingly deployed in a wide range of applications, such as animal tracking, automatic toll collection and mass transportation. An RFID system consists of a transponder (i.e., tag) attached to the objects to be identified, an interrogator (i.e., reader) that creates an RF field for detecting radio waves, and a backend database system that maintains expanded information on the tagged objects and other associated objects. While RFID provides promising benefits in many applications, there are serious data management issues that must be overcome before these benefits can be fully realized. In this thesis, we address the RFID data quality and reliability problems. RFID data is fundamentally different from data handled by traditional relational and data warehouse technologies, and these differences pose great challenges that need to be fully considered in RFID data management systems. Moreover, because RFID captures data automatically over radio waves, a significant amount of erroneous data is still captured despite vast improvements in the quality of RFID technology. The observed read rate in real-world RFID deployments is often in the 60-70% range. Such error rates render raw RFID data essentially useless for mission-critical applications such as healthcare and inventory management systems. RFID data are also large, dynamic and time-dependent. Missed and unreliable readings are therefore very common in RFID applications, and often arise from low-cost, low-power hardware, unreliable wireless communication and faulty individual readers, which lead to frequently dropped readings. In addition, the RFID data stream is full of duplicate readings, which result in unnecessary transmissions and consume network bandwidth.

In this thesis, we study the issues contributing to the low quality and unreliability of RFID data and propose several approaches to enhance RFID data quality and reliability. RFID naturally generates a large number of duplicate readings. Duplicate readings can feed conflicting information to the system, such as a tagged object being counted twice. Removing these duplicate readings from the RFID data stream is paramount, as they contribute no new information to the system and waste system resources. We present a data filtering approach that efficiently eliminates duplicate data from RFID data streams, together with experimental results showing that the proposed approach significantly improves the quality of the RFID data processed. Another problem with RFID systems is that the captured data contains a significant percentage of errors, particularly as a result of missed reads. Unreliable readings often occur due to low-power hardware, faulty individual readers and wireless communication. We study the problem of faulty readings caused by faulty readers that continuously send readings, and develop an approach to detect and remove such faulty readings from the RFID data stream. We also develop an energy-aware RFID data filtering approach to address RFID readings that are frequently dropped due to low power.
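To make the duplicate-filtering idea concrete, the following is a minimal Python sketch of one common way to suppress duplicate reads in an RFID stream: a per-tag sliding time window. This is an illustrative assumption only, not the algorithm developed in the thesis; the class name SlidingWindowDeduplicator, the window parameter and the sample tag IDs are hypothetical.

# Hypothetical sketch: suppress repeat readings of the same tag within a
# short time window. Not the thesis's method; illustration only.
from collections import OrderedDict

class SlidingWindowDeduplicator:
    """Drops repeat readings of a tag seen within `window` seconds."""

    def __init__(self, window: float = 5.0):
        self.window = window
        self.last_seen = OrderedDict()  # tag_id -> timestamp of last accepted read

    def accept(self, tag_id: str, timestamp: float) -> bool:
        """Return True if the reading carries new information, False if duplicate."""
        # Evict entries older than the window so memory stays bounded.
        while self.last_seen:
            oldest_tag, oldest_ts = next(iter(self.last_seen.items()))
            if timestamp - oldest_ts > self.window:
                self.last_seen.pop(oldest_tag)
            else:
                break

        previous = self.last_seen.get(tag_id)
        if previous is not None and timestamp - previous <= self.window:
            return False  # duplicate: same tag re-read inside the window

        # Accept the reading and remember when it was seen.
        self.last_seen[tag_id] = timestamp
        self.last_seen.move_to_end(tag_id)
        return True

if __name__ == "__main__":
    dedup = SlidingWindowDeduplicator(window=5.0)
    stream = [("TAG-001", 0.0), ("TAG-001", 1.2), ("TAG-002", 2.0), ("TAG-001", 7.5)]
    for tag, ts in stream:
        print(tag, ts, "kept" if dedup.accept(tag, ts) else "dropped")

In this sketch the second reading of TAG-001 is dropped because it falls inside the 5-second window, while its later reading at 7.5 s is kept as new information; more scalable filters replace the per-tag dictionary with approximate structures to bound memory on high-rate streams.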

Item Type: Thesis (PhD)
Subjects: T Technology > TK Electrical engineering. Electronics. Nuclear engineering > TK6540-6571 Radio
ID Code: 4636
Deposited By: Normajihan Abd. Rahman
Deposited On: 23 Dec 2013 16:29
Last Modified: 23 Dec 2013 16:29
