Commit 7d33fd8

glorysdj, Le-Zheng, pinggao18, pinggao187, and hkvision authored
Migrate hyperzoo (#4958)
* add hyperzoo for k8s support (#2140), with follow-up formatting fixes
* run examples on k8s readme (#2163)
* fix jdk download issue (#2219)
* add doc for submitting jupyter notebooks and cluster serving to k8s (#2221)
* fix jdk download issue (#2223)
* bump to 0.9s (#2227)
* update jdk download url (#2259)
* update some previous docs (#2284)
* K8docsupdate (#2306): update README.md
* update s3 related links in readme and documents (#2489); modify line length limit
* update mxnet-mkl version in hyper-zoo dockerfile (#2720)
* update bigdl version (#2743)
* hyperzoo dockerfile: add cluster-serving (#2731); update jdk url
* support init_spark_on_k8s (#2813): code refactor, bug fix, docker update
* add conda to docker image (#2894)
* fix code block indents in .md files (#2978): many markdown code blocks began with stray whitespace, so users could not simply select, copy, paste, and run them (in the case of python); all blocks were re-indented (3 spaces changed to 4) and "\" added for multiline bash commands
* enable bigdl 0.12 (#3101): switch to bigdl 0.12
* Hyperzoo example ref (#3143)
* specify pip version to fix oserror 0 of proxy (#3165)
* bigdl 0.12.1 (#3155)
* bump 0.10.0-SNAPSHOT (#3237)
* update runtime image name (#3250)
* update jdk download url (#3316)
* update jdk8 url (#3411)
* update hyperzoo docker image (#3429)
* update hyperzoo image (#3457)
* fix jdk in az docker (#3478): also fix jdk for hyperzoo, the jenkins docker, and the cluster serving docker; fix readme
* update python dep to fit cnvrg (#3486)
* update ray version doc (#3568)
* fix deploy hyperzoo issue (#3574)
* add spark fix and net-tools and status check (#3742): install netstat, add status check, add spark fix for graphene
* bigdl 0.12.2 (#3780); bump to 0.11-SNAPSHOT and fix version issues except ipynb
* add multi-stage build Dockerfile (#3916): rename Dockerfile.multi to Dockerfile, move TINI_VERSION to a common arg, add tf_slim
* update hyperzoo doc and k8s doc (#3959): update k8s user guide and notes
* fix 4087 issue (#4097)
* fix 4086 and 4083 issues (#4098)
* reduce image size (#4132): delete redis stage, delete flink stage, delete conda and exclude some python packages, add copies-layer stage; update numpy version to 1.18.1
* update hyperzoo image (#4250)
* bigdl 0.13 (#4210): pyspark 2.4.6, update release PyPI script, print exception
* flip snapshot-0.12.0 and spark 2.4.6 (#4254)
* fix docker issues due to version update (#4280)
* update Dockerfile to support spark 3.1.2 && 2.4.6 (#4436)
* update hyperzoo, add lib for tf2 (#4614)
* delete tf 1.15.0 (#4719)

Co-authored-by: Le-Zheng, pinggao18, pinggao187, gaoping, Kai Huang, GavinGu07, Yifan Zhu, Song Jiaming, ardaci, Yang Wang, zzti-bsj, shaojie, Lingqi Su, and Adria777
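The image built from this commit bakes in `RUNTIME_*` defaults (service account, container image, executor sizing) that are meant to be fed into a `spark-submit` against Kubernetes. A minimal sketch of that wiring is below; the API server address and the example script path are placeholders, not values from this commit:

```shell
# Sketch: consuming the image's RUNTIME_* defaults in a k8s spark-submit.
# The master URL and application path below are assumed placeholders.
RUNTIME_SPARK_MASTER=k8s://https://127.0.0.1:8443
RUNTIME_K8S_SERVICE_ACCOUNT=spark
RUNTIME_K8S_SPARK_IMAGE=intelanalytics/hyper-zoo:0.12.0-SNAPSHOT-2.4.6
RUNTIME_EXECUTOR_INSTANCES=1

# Build the submit command; run it from a pod or host with cluster access.
SUBMIT_CMD="spark-submit \
  --master ${RUNTIME_SPARK_MASTER} \
  --deploy-mode cluster \
  --conf spark.kubernetes.authenticate.driver.serviceAccountName=${RUNTIME_K8S_SERVICE_ACCOUNT} \
  --conf spark.kubernetes.container.image=${RUNTIME_K8S_SPARK_IMAGE} \
  --conf spark.executor.instances=${RUNTIME_EXECUTOR_INSTANCES} \
  local:///opt/analytics-zoo-examples/python/your_example.py"
echo "$SUBMIT_CMD"
```

`local://` tells Spark the application already exists inside the container image, which is why the Dockerfile unzips the Analytics Zoo examples into `/opt/analytics-zoo-examples/python`.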
1 parent f3057be commit 7d33fd8

17 files changed: +2361 −0 lines

docker/hyperzoo/Dockerfile

Lines changed: 176 additions & 0 deletions
```dockerfile
ARG SPARK_VERSION=2.4.6
ARG SPARK_HOME=/opt/spark
ARG JDK_VERSION=8u192
ARG JDK_URL=your_jdk_url
ARG BIGDL_VERSION=0.13.0
ARG ANALYTICS_ZOO_VERSION=0.12.0-SNAPSHOT
ARG TINI_VERSION=v0.18.0

# stage.1 jdk & spark
FROM ubuntu:18.04 as spark
ARG SPARK_VERSION
ARG JDK_VERSION
ARG JDK_URL
ARG SPARK_HOME
ENV TINI_VERSION v0.18.0
ENV SPARK_VERSION ${SPARK_VERSION}
ENV SPARK_HOME ${SPARK_HOME}
RUN apt-get update --fix-missing && \
    apt-get install -y apt-utils vim curl nano wget unzip maven git && \
    # java
    wget $JDK_URL && \
    gunzip jdk-$JDK_VERSION-linux-x64.tar.gz && \
    tar -xf jdk-$JDK_VERSION-linux-x64.tar -C /opt && \
    rm jdk-$JDK_VERSION-linux-x64.tar && \
    mv /opt/jdk* /opt/jdk$JDK_VERSION && \
    ln -s /opt/jdk$JDK_VERSION /opt/jdk && \
    # spark
    wget https://archive.apache.org/dist/spark/spark-${SPARK_VERSION}/spark-${SPARK_VERSION}-bin-hadoop2.7.tgz && \
    tar -zxvf spark-${SPARK_VERSION}-bin-hadoop2.7.tgz && \
    mv spark-${SPARK_VERSION}-bin-hadoop2.7 /opt/spark && \
    rm spark-${SPARK_VERSION}-bin-hadoop2.7.tgz && \
    cp /opt/spark/kubernetes/dockerfiles/spark/entrypoint.sh /opt

RUN ln -fs /bin/bash /bin/sh
# swap in the k8s client jars each Spark version needs
RUN if [ $SPARK_VERSION = "3.1.2" ]; then \
        rm $SPARK_HOME/jars/okhttp-*.jar && \
        wget -P $SPARK_HOME/jars https://repo1.maven.org/maven2/com/squareup/okhttp3/okhttp/3.8.0/okhttp-3.8.0.jar; \
    elif [ $SPARK_VERSION = "2.4.6" ]; then \
        rm $SPARK_HOME/jars/kubernetes-client-*.jar && \
        wget -P $SPARK_HOME/jars https://repo1.maven.org/maven2/io/fabric8/kubernetes-client/4.4.2/kubernetes-client-4.4.2.jar; \
    fi

ADD https://github.com/krallin/tini/releases/download/${TINI_VERSION}/tini /sbin/tini

# stage.2 analytics-zoo
FROM ubuntu:18.04 as analytics-zoo
ARG SPARK_VERSION
ARG BIGDL_VERSION
ARG ANALYTICS_ZOO_VERSION

ENV SPARK_VERSION ${SPARK_VERSION}
ENV BIGDL_VERSION ${BIGDL_VERSION}
ENV ANALYTICS_ZOO_VERSION ${ANALYTICS_ZOO_VERSION}
ENV ANALYTICS_ZOO_HOME /opt/analytics-zoo-${ANALYTICS_ZOO_VERSION}

RUN apt-get update --fix-missing && \
    apt-get install -y apt-utils vim curl nano wget unzip maven git
ADD ./download-analytics-zoo.sh /opt

RUN chmod a+x /opt/download-analytics-zoo.sh && \
    mkdir -p /opt/analytics-zoo-examples/python
RUN /opt/download-analytics-zoo.sh && \
    rm analytics-zoo-bigdl*.zip && \
    unzip $ANALYTICS_ZOO_HOME/lib/*.zip 'zoo/examples/*' -d /opt/analytics-zoo-examples/python && \
    mv /opt/analytics-zoo-examples/python/zoo/examples/* /opt/analytics-zoo-examples/python && \
    rm -rf /opt/analytics-zoo-examples/python/zoo/examples

# stage.3 copies layer
FROM ubuntu:18.04 as copies-layer
ARG ANALYTICS_ZOO_VERSION

COPY --from=analytics-zoo /opt/analytics-zoo-${ANALYTICS_ZOO_VERSION} /opt/analytics-zoo-${ANALYTICS_ZOO_VERSION}
COPY --from=analytics-zoo /opt/analytics-zoo-examples/python /opt/analytics-zoo-examples/python
COPY --from=spark /opt/jdk /opt/jdk
COPY --from=spark /opt/spark /opt/spark
COPY --from=spark /opt/spark/kubernetes/dockerfiles/spark/entrypoint.sh /opt


# stage.4
FROM ubuntu:18.04
MAINTAINER The Analytics-Zoo Authors https://github.com/intel-analytics/analytics-zoo
ARG ANALYTICS_ZOO_VERSION
ARG BIGDL_VERSION
ARG SPARK_VERSION
ARG SPARK_HOME
ARG TINI_VERSION

ENV ANALYTICS_ZOO_VERSION ${ANALYTICS_ZOO_VERSION}
ENV SPARK_HOME ${SPARK_HOME}
ENV SPARK_VERSION ${SPARK_VERSION}
ENV ANALYTICS_ZOO_HOME /opt/analytics-zoo-${ANALYTICS_ZOO_VERSION}
# NOTE: FLINK_VERSION is never defined in this Dockerfile; this line is a
# leftover from the flink stage removed in "Reduce image size" (#4132)
ENV FLINK_HOME /opt/flink-${FLINK_VERSION}
ENV OMP_NUM_THREADS 4
ENV NOTEBOOK_PORT 12345
ENV NOTEBOOK_TOKEN 1234qwer
ENV RUNTIME_SPARK_MASTER local[4]
ENV RUNTIME_K8S_SERVICE_ACCOUNT spark
ENV RUNTIME_K8S_SPARK_IMAGE intelanalytics/hyper-zoo:${ANALYTICS_ZOO_VERSION}-${SPARK_VERSION}
ENV RUNTIME_DRIVER_HOST localhost
ENV RUNTIME_DRIVER_PORT 54321
ENV RUNTIME_EXECUTOR_CORES 4
ENV RUNTIME_EXECUTOR_MEMORY 20g
ENV RUNTIME_EXECUTOR_INSTANCES 1
ENV RUNTIME_TOTAL_EXECUTOR_CORES 4
ENV RUNTIME_DRIVER_CORES 4
ENV RUNTIME_DRIVER_MEMORY 10g
ENV RUNTIME_PERSISTENT_VOLUME_CLAIM myvolumeclaim
ENV SPARK_HOME /opt/spark
ENV HADOOP_CONF_DIR /opt/hadoop-conf
ENV BIGDL_VERSION ${BIGDL_VERSION}
ENV BIGDL_CLASSPATH ${ANALYTICS_ZOO_HOME}/lib/analytics-zoo-bigdl_${BIGDL_VERSION}-spark_${SPARK_VERSION}-${ANALYTICS_ZOO_VERSION}-jar-with-dependencies.jar
ENV JAVA_HOME /opt/jdk
ENV REDIS_HOME /opt/redis-5.0.5
ENV CS_HOME /opt/work/cluster-serving
ENV PYTHONPATH ${ANALYTICS_ZOO_HOME}/lib/analytics-zoo-bigdl_${BIGDL_VERSION}-spark_${SPARK_VERSION}-${ANALYTICS_ZOO_VERSION}-python-api.zip:${SPARK_HOME}/python/lib/pyspark.zip:${SPARK_HOME}/python/lib/py4j-*.zip:${CS_HOME}/serving-python.zip:/opt/models/research/slim
ENV PATH ${ANALYTICS_ZOO_HOME}/bin/cluster-serving:${JAVA_HOME}/bin:/root/miniconda3/bin:${PATH}
ENV TINI_VERSION ${TINI_VERSION}
ENV LC_ALL C.UTF-8
ENV LANG C.UTF-8


COPY --from=copies-layer /opt /opt
COPY --from=spark /sbin/tini /sbin/tini
ADD ./start-notebook-spark.sh /opt
ADD ./start-notebook-k8s.sh /opt

RUN mkdir -p /opt/analytics-zoo-examples/python && \
    mkdir -p /opt/analytics-zoo-examples/scala && \
    apt-get update --fix-missing && \
    apt-get install -y apt-utils vim curl nano wget unzip maven git && \
    apt-get install -y gcc g++ make && \
    apt-get install -y libsm6 libxext6 libxrender-dev && \
    rm /bin/sh && \
    ln -sv /bin/bash /bin/sh && \
    echo "auth required pam_wheel.so use_uid" >> /etc/pam.d/su && \
    chgrp root /etc/passwd && chmod ug+rw /etc/passwd && \
    # python
    apt-get install -y python3-minimal && \
    apt-get install -y build-essential python3 python3-setuptools python3-dev python3-pip && \
    pip3 install --no-cache-dir --upgrade pip && \
    pip install --no-cache-dir --upgrade setuptools && \
    pip install --no-cache-dir numpy==1.18.1 scipy && \
    pip install --no-cache-dir pandas==1.0.3 && \
    pip install --no-cache-dir scikit-learn matplotlib seaborn jupyter jupyterlab requests h5py && \
    ln -s /usr/bin/python3 /usr/bin/python && \
    # fix tornado await process
    pip uninstall -y -q tornado && \
    pip install --no-cache-dir tornado && \
    python3 -m ipykernel.kernelspec && \
    pip install --no-cache-dir tensorboard && \
    pip install --no-cache-dir jep && \
    pip install --no-cache-dir cloudpickle && \
    pip install --no-cache-dir opencv-python && \
    pip install --no-cache-dir pyyaml && \
    pip install --no-cache-dir redis && \
    pip install --no-cache-dir ray[tune]==1.2.0 && \
    pip install --no-cache-dir Pillow==6.1 && \
    pip install --no-cache-dir psutil aiohttp && \
    pip install --no-cache-dir py4j && \
    pip install --no-cache-dir cmake==3.16.3 && \
    pip install --no-cache-dir torch==1.7.1 torchvision==0.8.2 && \
    pip install --no-cache-dir horovod==0.19.2 && \
    # tf2
    pip install --no-cache-dir pyarrow && \
    pip install opencv-python==4.2.0.34 && \
    pip install aioredis==1.1.0 && \
    pip install tensorflow==2.4.0 && \
    # chmod
    chmod a+x /opt/start-notebook-spark.sh && \
    chmod a+x /opt/start-notebook-k8s.sh && \
    chmod +x /sbin/tini && \
    cp /sbin/tini /usr/bin/tini

WORKDIR /opt/spark/work-dir

ENTRYPOINT [ "/opt/entrypoint.sh" ]
```
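Since `JDK_URL` defaults to the placeholder `your_jdk_url`, the build only succeeds if you override it with a URL serving the `jdk-8u192-linux-x64.tar.gz` tarball. A hedged sketch of the build invocation, with the host URL as an assumed placeholder:

```shell
# Sketch: building the hyper-zoo image from docker/hyperzoo/.
# http://your-host/... is a placeholder; the tag mirrors the
# RUNTIME_K8S_SPARK_IMAGE default from stage 4 of the Dockerfile.
SPARK_VERSION=2.4.6
ANALYTICS_ZOO_VERSION=0.12.0-SNAPSHOT
TAG="intelanalytics/hyper-zoo:${ANALYTICS_ZOO_VERSION}-${SPARK_VERSION}"

BUILD_CMD="docker build \
  --build-arg SPARK_VERSION=${SPARK_VERSION} \
  --build-arg JDK_URL=http://your-host/jdk-8u192-linux-x64.tar.gz \
  -t ${TAG} ."
echo "$BUILD_CMD"   # run from docker/hyperzoo/
```

Passing `SPARK_VERSION=3.1.2` instead flips the `RUN if` branch in stage 1, which swaps the bundled okhttp jar rather than the fabric8 kubernetes-client jar.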
