What is Hadoop | Components of Hadoop
What is Hadoop?
Apache Hadoop is an open-source framework used to efficiently store and process large datasets ranging in size from gigabytes to petabytes. Instead of using one large computer to store and process the data, Hadoop clusters multiple computers so they can analyze massive datasets in parallel, more quickly.
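The idea of splitting one large job across many machines can be sketched in miniature with plain Python, using local worker processes as stand-ins for cluster nodes (the dataset and the `analyze` function here are illustrative, not part of Hadoop):

```python
from multiprocessing import Pool

# Toy "large dataset"; on a real cluster each node would hold one slice.
data = list(range(1_000_000))

def analyze(chunk):
    # Each worker ("node") processes only its own slice, independently.
    return sum(chunk)

if __name__ == "__main__":
    n_workers = 4
    size = len(data) // n_workers
    chunks = [data[i * size:(i + 1) * size] for i in range(n_workers)]
    with Pool(n_workers) as pool:
        partials = pool.map(analyze, chunks)  # slices analyzed in parallel
    total = sum(partials)  # combine the partial results at the end
    print(total)  # 499999500000
```

The same pattern, distributed across machines and made fault tolerant, is what Hadoop provides at scale.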
Hadoop consists of four main modules:
Hadoop Distributed File System (HDFS) – A distributed file system that runs on standard or low-end hardware. HDFS provides better data throughput than traditional file systems, in addition to high fault tolerance and native support of large datasets.
Yet Another Resource Negotiator (YARN) – Manages and monitors cluster nodes and resource usage. It schedules jobs and tasks.
MapReduce – A framework that helps programs perform parallel computation on data. The map task converts input data into intermediate key-value pairs; reduce tasks then aggregate that output to produce the final result.
Hadoop Common – Shared Java libraries and utilities used by the other Hadoop modules.
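The map-then-reduce flow can be illustrated with a toy word count in plain Python; the function names (`map_task`, `shuffle`, `reduce_task`) are illustrative stand-ins for the stages, not Hadoop API names:

```python
from collections import defaultdict
from itertools import chain

def map_task(line):
    # Map: turn a line of raw input into (key, value) pairs.
    return [(word, 1) for word in line.split()]

def shuffle(pairs):
    # Shuffle: group the values by key so each reducer sees one key's values.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_task(key, values):
    # Reduce: aggregate all values for one key into the final result.
    return key, sum(values)

lines = ["hadoop stores data", "hadoop processes data"]
mapped = chain.from_iterable(map_task(line) for line in lines)
counts = dict(reduce_task(k, v) for k, v in shuffle(mapped).items())
print(counts)  # {'hadoop': 2, 'stores': 1, 'data': 2, 'processes': 1}
```

In a real Hadoop job the map and reduce tasks run on different cluster nodes and the framework handles the shuffle, scheduling, and recovery from node failures.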
Published on 07-29-2022