Hadoop: how to create a single-node cluster using Docker

INTRODUCTION
Hadoop is a collection of open-source software utilities that uses a cluster of computers to solve problems involving massive amounts of data (Big Data).
The Apache Hadoop framework is composed of the following modules: Hadoop Common, HDFS, YARN, and MapReduce.
All of these modules are included in a single Docker image (intended for academic use only) created by sequenceiq.
INSTRUCTIONS
Requirements: Docker CE
The following steps will help you create a single-node cluster on your computer.
First, pull the image from the official repository:
docker pull sequenceiq/hadoop-docker:2.7.1
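To confirm the image was downloaded, you can list it locally (the image ID and size shown will depend on your Docker installation):
docker images sequenceiq/hadoop-docker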
Now you can create a Docker container named hadoop-local:
docker run --name hadoop-local -d -t -i \
-p 50010:50010 -p 50020:50020 -p 50070:50070 -p 50075:50075 -p 50090:50090 \
-p 8020:8020 -p 9000:9000 -p 19888:19888 -p 8030:8030 -p 8031:8031 \
-p 8032:8032 -p 8033:8033 -p 8040:8040 -p 8042:8042 -p 8088:8088 -p 49707:49707 \
-p 2122:2122 \
sequenceiq/hadoop-docker:2.7.1 /etc/bootstrap.sh -bash
The -p flags in the run command publish the ports exposed by the Hadoop services (HDFS, YARN, and MapReduce) so that they are reachable from your host, for example 50070 for the HDFS NameNode web UI and 8088 for the YARN ResourceManager web UI.
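To check that Hadoop started correctly, you can open a shell inside the container and run one of the example MapReduce jobs bundled with the distribution. This is a minimal sketch based on the usage documented for the sequenceiq image; it assumes the image sets the HADOOP_PREFIX environment variable to the Hadoop installation directory:
docker exec -it hadoop-local bash
cd $HADOOP_PREFIX
bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.1.jar grep input output 'dfs[a-z.]+'
bin/hdfs dfs -cat output/*
If everything is running, the HDFS NameNode web UI should also be reachable from your host at http://localhost:50070 and the YARN ResourceManager web UI at http://localhost:8088.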