Welcome To UTPedia

We would like to introduce UTPedia, a new knowledge repository: the UTP Electronic and Digital Intellectual Asset. It stores digitized versions of theses, final year project reports and past year examination questions.

Browse UTPedia content by Year, Subject, Department and Author, or locate a document using the built-in search facilities. Full-text documents are accessible to registered users, whereas public users can retrieve only bibliographic information and metadata. UTPedia connects people with the university's intellectual works from anywhere.


DATA COMPRESSION AND DATA HIDING DURING LARGE DATA INGESTION

Lai, Zhen Yean (2019) DATA COMPRESSION AND DATA HIDING DURING LARGE DATA INGESTION. IRC, Universiti Teknologi PETRONAS. (Submitted)

PDF (2319 KB)
Restricted to registered users only

Abstract

This paper explains data ingestion, the process of collecting data. Data ingestion usually occurs within an organization so that the organization can analyze the data further. A well-known file storage system for big data analysis is the Hadoop Distributed File System (HDFS). Two tools are commonly used for data ingestion into Hadoop: Apache Sqoop and Apache Flume. Apache Sqoop transfers data between Hadoop and a Relational Database Management System (RDBMS), while Apache Flume is a distributed service that collects data from a variety of sources and forwards it to Hadoop storage. The concern with these tools is that they provide no built-in data compression or data hiding during data transmission. The proposed solution is a new data ingestion method that applies Fixed Length Coding (FLC) compression together with an audio steganography technique to achieve data compression and data hiding. The methodology implements the data compression and audio steganography while the data is transmitted from the RDBMS to HDFS storage. However, one remaining weakness is the ability to recover from data loss during audio steganography. Further performance evaluation is carried out to validate the data transmission; the evaluation parameters include compression ratio, signal-to-noise ratio and information loss.
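
The abstract does not spell out how the Fixed Length Coding step works. A minimal sketch, assuming FLC here means giving every distinct symbol in the input the same code width of ceil(log2(alphabet size)) bits, could look like the following; the function names and the toy record are illustrative, not taken from the report.

```python
# Minimal Fixed Length Coding (FLC) sketch: every distinct symbol gets the same
# number of bits, chosen as ceil(log2(alphabet size)). Illustrative names only.
import math

def flc_encode(data: bytes):
    """Return (codebook, bitstring) for a fixed-length code over the symbols in data."""
    symbols = sorted(set(data))
    bits_per_symbol = max(1, math.ceil(math.log2(len(symbols))))
    codebook = {s: format(i, f"0{bits_per_symbol}b") for i, s in enumerate(symbols)}
    bitstring = "".join(codebook[b] for b in data)
    return codebook, bitstring

def flc_decode(codebook, bitstring):
    """Invert flc_encode by slicing the bitstring into fixed-width codes."""
    width = len(next(iter(codebook.values())))
    reverse = {code: s for s, code in codebook.items()}
    chunks = (bitstring[i:i + width] for i in range(0, len(bitstring), width))
    return bytes(reverse[c] for c in chunks)

if __name__ == "__main__":
    record = b"2019,Lai Zhen Yean,FYP"          # a toy row exported from an RDBMS
    book, bits = flc_encode(record)
    assert flc_decode(book, bits) == record
    ratio = (len(record) * 8) / len(bits)        # original bits / compressed bits
    print(f"compression ratio = {ratio:.2f}")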
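
Likewise, the report's exact audio steganography scheme is not stated in the abstract. The sketch below assumes a simple least-significant-bit (LSB) embedding over 16-bit PCM samples, one common approach, and also computes the signal-to-noise ratio listed among the evaluation parameters; all function names are hypothetical.

```python
# LSB audio steganography sketch over 16-bit PCM samples, plus an SNR metric.
# The report's actual hiding technique may differ; this is an assumed baseline.
import numpy as np

def embed(samples: np.ndarray, payload: bytes) -> np.ndarray:
    """Write payload bits into the least significant bit of successive samples."""
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    if bits.size > samples.size:
        raise ValueError("cover audio too short for payload")
    stego = samples.copy()
    stego[: bits.size] = (stego[: bits.size] & ~1) | bits
    return stego

def extract(stego: np.ndarray, n_bytes: int) -> bytes:
    """Read n_bytes back out of the LSBs."""
    bits = (stego[: n_bytes * 8] & 1).astype(np.uint8)
    return np.packbits(bits).tobytes()

def snr_db(cover: np.ndarray, stego: np.ndarray) -> float:
    """Signal-to-noise ratio of the cover audio vs. the embedding distortion, in dB."""
    signal = np.sum(cover.astype(np.float64) ** 2)
    noise = np.sum((cover.astype(np.float64) - stego.astype(np.float64)) ** 2)
    return float("inf") if noise == 0 else 10 * np.log10(signal / noise)

if __name__ == "__main__":
    cover = np.random.default_rng(0).integers(-2**15, 2**15, 4000).astype(np.int16)
    secret = b"row exported from RDBMS"
    stego = embed(cover, secret)
    assert extract(stego, len(secret)) == secret
    print(f"SNR = {snr_db(cover, stego):.1f} dB")
```

Read together with the abstract, the compressed bitstream from the first sketch would presumably serve as the payload embedded into the cover audio during transfer from the RDBMS, with extraction and decoding performed on the HDFS side; that interpretation is ours, not spelled out in the abstract.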

Item Type: Final Year Project
Academic Subject: Academic Department - Information Communication Technology
Subject: Q Science > Q Science (General)
Divisions: Sciences and Information Technology > Computer and Information Sciences
Depositing User: Ahmad Suhairi Mohamed Lazim
Date Deposited: 09 Sep 2021 20:08
Last Modified: 09 Sep 2021 20:08
URI: http://utpedia.utp.edu.my/id/eprint/20909
