wey
Nov 17, 2022 01:40
2
Submit a Spark Job From Java Code
=================================
In this post I will show you how to submit a Spark job from Java code. Typically,
we submit Spark jobs to a "Spark Cluster" (standalone Spark cluster) or to Hadoop/YARN
(MapReduce/Hadoop cluster) using the `$SPARK_HOME/bin/spark-submit` shell
script. Going through a shell script is limiting when programmers want to
submit Spark jobs from Java code (such as Java servlets or other Java code such
as REST servers).
## [Submit Spark Job to Hadoop/YARN From Java Code -- Spark 1.5.2](./how-to-submit-spark-job-to-yarn-from-java-code.md)
## [Submit Spark Job to Hadoop/YARN From Java Code -- Spark 2.0.0](https://github.com/mahmoudparsian/data-algorithms-book/tree/master/src/main/java/org/dataalgorithms/chapB13/client)
## [Submit Spark Job to Spark Cluster From Java Code -- Spark 2.0.0](https://github.com/mahmoudparsian/data-algorithms-book/tree/master/src/main/java/org/dataalgorithms/chapB13/client)
https://github.com/mahmoudparsian/data-algorithms-book/blob/master/misc/how-to-submit-spark-job-to-yarn-from-java-code.md
1 Like
nicole
Nov 17, 2022 02:30
3
You can look into SparkLauncher:
Launcher for Spark applications. Use this class to start Spark applications programmatically. The class uses a builder pattern to allow clients to configure the Spark application and launch it as a child process.
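A minimal sketch of that builder pattern, assuming a Spark 3.x cluster; every path, the master URL, and the class names below are hypothetical placeholders, not values from this thread:

```java
import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class SubmitJob {
    public static void main(String[] args) throws Exception {
        // All paths and the master URL are placeholders -- adjust for your cluster.
        SparkAppHandle handle = new SparkLauncher()
                .setSparkHome("/opt/spark")                  // local Spark distribution
                .setAppResource("/path/to/your-app.jar")     // jar containing the job
                .setMainClass("com.example.YourSparkJob")    // entry point of the job
                .setMaster("spark://spark-master:7077")      // or "yarn"
                .setDeployMode("cluster")
                .setConf(SparkLauncher.EXECUTOR_MEMORY, "2g")
                .addJar("/path/to/extra-dependency.jar")     // ship extra jars with the app
                .addAppArgs("arg1", "arg2")
                .startApplication();                         // non-blocking; returns a handle

        // Poll until the application reaches a terminal state (FINISHED, FAILED, KILLED).
        while (!handle.getState().isFinal()) {
            Thread.sleep(1000);
        }
        System.out.println("Final state: " + handle.getState());
    }
}
```

Note that SparkLauncher starts the application as a child process via the local Spark distribution, so a Spark installation (the `setSparkHome` path) must exist on the machine running this code, and the `org.apache.spark:spark-launcher` artifact must be on the classpath.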
Hi, SparkLauncher requires the service that runs the business code to have a Spark environment, right? It needs sparkHome to be specified, but the Spark service is on a remote server and there is no local installation.
nicole
Nov 17, 2022 02:39
5
vision-wx:
the service that runs the business code has a Spark environment
Your business code is packaged to run on the server anyway, so you can launch it on the server after packaging; alternatively, install a standalone Spark locally.
I just tried submitting the Spark job through the Spark REST API. Beforehand I uploaded the nebula-exchange_spark_3.0-3.3.0 jar and the MySQL driver jar to the Spark environment. The submission succeeded, but at runtime it fails with a class-not-found error.
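For reference, a REST submission to a standalone Spark master (the hidden submission server, by default on port 6066) is a POST to `/v1/submissions/create`. A sketch of the request body, assuming the jars were uploaded to the server; the paths, the Spark version, and the `Exchange` main class are my assumptions, not values confirmed in this thread:

```json
{
  "action": "CreateSubmissionRequest",
  "clientSparkVersion": "3.0.0",
  "appResource": "file:/path/on/server/nebula-exchange_spark_3.0-3.3.0.jar",
  "mainClass": "com.vesoft.nebula.exchange.Exchange",
  "appArgs": ["-c", "/path/on/server/application.conf"],
  "environmentVariables": { "SPARK_ENV_LOADED": "1" },
  "sparkProperties": {
    "spark.app.name": "nebula-exchange",
    "spark.submit.deployMode": "cluster",
    "spark.master": "spark://spark-master:6066",
    "spark.jars": "file:/path/on/server/mysql-connector-java.jar"
  }
}
```

One caveat relevant here: `file:` paths in `appResource` and `spark.jars` are resolved on whichever machine the driver and executors run, so in cluster mode the jars must exist at those paths on every node, otherwise the job is accepted but fails at runtime with a ClassNotFoundException.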
nicole
Nov 17, 2022 09:05
7
Which mode did you use to submit the Spark job? If it is cluster mode, the jars need to be uploaded to every machine.
system
Closed
Dec 17, 2022 09:06
8
This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.