Why are blocks in HDFS huge?
asked Aug 5, 2021 by JackTerrance
hadoop-objective-questions
1 Answer
answered Aug 5, 2021 by JackTerrance
By default, the size of an HDFS data block is 128 MB. The reasons for using such large blocks are:
To reduce seek cost: with large blocks, the time spent transferring the data from disk is much longer than the time spent seeking to the start of the block. As a result, a file made of multiple blocks can be read at close to the raw disk transfer rate.
To limit metadata: with small blocks, Hadoop HDFS would contain far too many blocks, and the NameNode would have to store too much metadata. Managing such a vast number of blocks and their metadata would create overhead and lead to extra network traffic.
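The default block size can be overridden per cluster via the dfs.blocksize property in hdfs-site.xml. A minimal fragment setting it explicitly to 128 MB (134217728 bytes) might look like this:

```xml
<!-- hdfs-site.xml: dfs.blocksize sets the default block size for new files. -->
<property>
  <name>dfs.blocksize</name>
  <value>134217728</value> <!-- 128 MB = 128 * 1024 * 1024 bytes -->
</property>
```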
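The seek-cost argument can be made concrete with a rough back-of-the-envelope model. The figures below (10 ms average seek, 100 MB/s sustained transfer) are illustrative assumptions, not values from the answer:

```python
# Rough cost model for reading one HDFS block from a spinning disk.
# Assumed figures (illustrative only):
SEEK_TIME_S = 0.010          # ~10 ms average seek time
TRANSFER_RATE_MB_S = 100.0   # ~100 MB/s sustained transfer rate

def seek_overhead(block_size_mb: float) -> float:
    """Fraction of the total read time spent seeking rather than transferring."""
    transfer_time_s = block_size_mb / TRANSFER_RATE_MB_S
    return SEEK_TIME_S / (SEEK_TIME_S + transfer_time_s)

if __name__ == "__main__":
    for size_mb in (4, 64, 128):
        print(f"{size_mb:4d} MB block: seek is {seek_overhead(size_mb):.1%} of read time")
```

Under these assumptions a 4 MB block spends about 20% of its read time seeking, while a 128 MB block spends under 1%, which is why large blocks keep reads near the disk transfer rate.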