Log Type: stderr

Log Upload Time: Thu Jun 09 16:37:49 +0800 2022

Log Length: 190880

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/data/data6/yarn/nm2/filecache/308/spark-2.4.0-bin-hadoop2.6.zip/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop-20220330112300-149/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/jars/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
22/06/09 15:44:10 INFO SignalUtils: Registered signal handler for TERM
22/06/09 15:44:10 INFO SignalUtils: Registered signal handler for HUP
22/06/09 15:44:10 INFO SignalUtils: Registered signal handler for INT
22/06/09 15:44:11 INFO SecurityManager: Changing view acls to: yarn,net_kg
22/06/09 15:44:11 INFO SecurityManager: Changing modify acls to: yarn,net_kg
22/06/09 15:44:11 INFO SecurityManager: Changing view acls groups to: 
22/06/09 15:44:11 INFO SecurityManager: Changing modify acls groups to: 
22/06/09 15:44:11 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(yarn, net_kg); groups with view permissions: Set(); users  with modify permissions: Set(yarn, net_kg); groups with modify permissions: Set()
22/06/09 15:44:11 WARN Configuration: __spark_hadoop_conf__.xml:an attempt to override final parameter: ipc.client.fallback-to-simple-auth-allowed;  Ignoring.
22/06/09 15:44:11 WARN Configuration: __spark_hadoop_conf__.xml:an attempt to override final parameter: ipc.client.fallback-to-simple-auth-allowed;  Ignoring.
22/06/09 15:44:11 WARN Configuration: __spark_hadoop_conf__.xml:an attempt to override final parameter: ipc.client.fallback-to-simple-auth-allowed;  Ignoring.
22/06/09 15:44:11 INFO ApplicationMaster: Preparing Local resources
22/06/09 15:44:13 WARN Configuration: __spark_hadoop_conf__.xml:an attempt to override final parameter: ipc.client.fallback-to-simple-auth-allowed;  Ignoring.
22/06/09 15:44:13 INFO ApplicationMaster: ApplicationAttemptId: appattempt_1654741161919_19163_000001
22/06/09 15:44:13 INFO ApplicationMaster: Starting the user application in a separate Thread
22/06/09 15:44:13 INFO ApplicationMaster: Waiting for spark context initialization...
22/06/09 15:44:13 INFO Configs$: DataBase Config com.vesoft.nebula.exchange.config.DataBaseConfigEntry@45b4b0cc
22/06/09 15:44:13 INFO Configs$: User Config com.vesoft.nebula.exchange.config.UserConfigEntry@3161b833
22/06/09 15:44:13 INFO Configs$: Connection Config Some(Config(SimpleConfigObject({"retry":10,"timeout":10000000})))
22/06/09 15:44:13 INFO Configs$: Execution Config com.vesoft.nebula.exchange.config.ExecutionConfigEntry@6595927e
22/06/09 15:44:13 INFO Configs$: Source Config Hive source exec: select thing_id, thing_name, thing_title, thing_namech, thing_nameen, thing_abbreviation, thing_tag, thing_alias, thing_abstract, thing_image, thing_video, thing_audio, thing_gmtcreated, thing_gmtmodified, thing_popularity, thing_prior, thing_datasource, thing_urls, thing_class, thing_imagejson, thing_embedding, thing_sourceids, thing_videocover, thing_englishname, thing_japanname, thing_koreaname, thing_category, thing_thumbnails, thing_pictures, thing_additionalinfo, thing_subtitle, thing_edition from oppo_kg_dw.dwd_kg_release_spo_thing_1_v3_55_df where  data_version='v1.1.20211205.alpha' and ds = '20220521'
22/06/09 15:44:13 INFO Configs$: Sink Config Hive source exec: select thing_id, thing_name, thing_title, thing_namech, thing_nameen, thing_abbreviation, thing_tag, thing_alias, thing_abstract, thing_image, thing_video, thing_audio, thing_gmtcreated, thing_gmtmodified, thing_popularity, thing_prior, thing_datasource, thing_urls, thing_class, thing_imagejson, thing_embedding, thing_sourceids, thing_videocover, thing_englishname, thing_japanname, thing_koreaname, thing_category, thing_thumbnails, thing_pictures, thing_additionalinfo, thing_subtitle, thing_edition from oppo_kg_dw.dwd_kg_release_spo_thing_1_v3_55_df where  data_version='v1.1.20211205.alpha' and ds = '20220521'
22/06/09 15:44:13 INFO Configs$: name Thing  batch 128
22/06/09 15:44:13 INFO Configs$: Tag Config: Tag name: Thing, source: Hive source exec: select thing_id, thing_name, thing_title, thing_namech, thing_nameen, thing_abbreviation, thing_tag, thing_alias, thing_abstract, thing_image, thing_video, thing_audio, thing_gmtcreated, thing_gmtmodified, thing_popularity, thing_prior, thing_datasource, thing_urls, thing_class, thing_imagejson, thing_embedding, thing_sourceids, thing_videocover, thing_englishname, thing_japanname, thing_koreaname, thing_category, thing_thumbnails, thing_pictures, thing_additionalinfo, thing_subtitle, thing_edition from oppo_kg_dw.dwd_kg_release_spo_thing_1_v3_55_df where  data_version='v1.1.20211205.alpha' and ds = '20220521', sink: File sink: from ./knowledge_graph_v8 to hdfs://alg-hdfs/knowledge_graph_v8/download, vertex field: thing_id, vertex policy: None, batch: 128, partition: 24.
22/06/09 15:44:13 INFO Exchange$: Config Configs(com.vesoft.nebula.exchange.config.DataBaseConfigEntry@45b4b0cc,com.vesoft.nebula.exchange.config.UserConfigEntry@3161b833,com.vesoft.nebula.exchange.config.ConnectionConfigEntry@ac57fae7,com.vesoft.nebula.exchange.config.ExecutionConfigEntry@6595927e,com.vesoft.nebula.exchange.config.ErrorConfigEntry@ea90487b,com.vesoft.nebula.exchange.config.RateConfigEntry@ae47d065,com.vesoft.nebula.exchange.config.SslConfigEntry@745bf6f4,,List(Tag name: Thing, source: Hive source exec: select thing_id, thing_name, thing_title, thing_namech, thing_nameen, thing_abbreviation, thing_tag, thing_alias, thing_abstract, thing_image, thing_video, thing_audio, thing_gmtcreated, thing_gmtmodified, thing_popularity, thing_prior, thing_datasource, thing_urls, thing_class, thing_imagejson, thing_embedding, thing_sourceids, thing_videocover, thing_englishname, thing_japanname, thing_koreaname, thing_category, thing_thumbnails, thing_pictures, thing_additionalinfo, thing_subtitle, thing_edition from oppo_kg_dw.dwd_kg_release_spo_thing_1_v3_55_df where  data_version='v1.1.20211205.alpha' and ds = '20220521', sink: File sink: from ./knowledge_graph_v8 to hdfs://alg-hdfs/knowledge_graph_v8/download, vertex field: thing_id, vertex policy: None, batch: 128, partition: 24.),List(),None)
22/06/09 15:44:13 INFO Exchange$: you don't config hive source, so using hive tied with spark.
22/06/09 15:44:14 INFO SparkContext: Running Spark version 2.4.0
22/06/09 15:44:14 WARN SparkConf: Total executor cores: 16 is not divisible by cores per executor: 24, the left cores: 16 will not be allocated
22/06/09 15:44:14 INFO SparkContext: Submitted application: 1654760263556_nebula2.5.1-import-sst-Thing.conf
22/06/09 15:44:14 INFO SecurityManager: Changing view acls to: yarn,net_kg
22/06/09 15:44:14 INFO SecurityManager: Changing modify acls to: yarn,net_kg
22/06/09 15:44:14 INFO SecurityManager: Changing view acls groups to: 
22/06/09 15:44:14 INFO SecurityManager: Changing modify acls groups to: 
22/06/09 15:44:14 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(yarn, net_kg); groups with view permissions: Set(); users  with modify permissions: Set(yarn, net_kg); groups with modify permissions: Set()
22/06/09 15:44:14 WARN Configuration: __spark_hadoop_conf__.xml:an attempt to override final parameter: ipc.client.fallback-to-simple-auth-allowed;  Ignoring.
22/06/09 15:44:14 INFO Utils: Successfully started service 'sparkDriver' on port 17911.
22/06/09 15:44:14 INFO SparkEnv: Registering MapOutputTracker
22/06/09 15:44:14 INFO SparkEnv: Registering BlockManagerMaster
22/06/09 15:44:14 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
22/06/09 15:44:14 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
22/06/09 15:44:14 INFO DiskBlockManager: Created local directory at /data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/blockmgr-0f53da54-213c-4a92-b43c-ffd504848865
22/06/09 15:44:14 INFO DiskBlockManager: Created local directory at /data/data13/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/blockmgr-d03b249f-49ef-4ac4-bd11-0a7f680d6fbe
22/06/09 15:44:14 INFO DiskBlockManager: Created local directory at /data/data7/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/blockmgr-7921874e-0eff-42c2-936b-a601b8606f2a
22/06/09 15:44:14 INFO DiskBlockManager: Created local directory at /data/data17/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/blockmgr-399d3513-81bb-46a5-bb6a-d8df4f688616
22/06/09 15:44:14 INFO DiskBlockManager: Created local directory at /data/data24/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/blockmgr-2dd27b38-7271-4006-8eee-c7a14988ef48
22/06/09 15:44:14 INFO DiskBlockManager: Created local directory at /data/data6/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/blockmgr-9b8df439-eedb-44bb-a9ca-34a2aacea10b
22/06/09 15:44:14 INFO DiskBlockManager: Created local directory at /data/data16/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/blockmgr-1e16989a-3a9c-4ef4-957e-4e2211b32291
22/06/09 15:44:14 INFO DiskBlockManager: Created local directory at /data/data5/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/blockmgr-dbf25c5d-6c3b-4bd1-abe5-5fc8fe17fbe3
22/06/09 15:44:14 INFO DiskBlockManager: Created local directory at /data/data3/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/blockmgr-e2c1cac1-7c05-4e72-ae95-b5ce691c17dc
22/06/09 15:44:14 INFO DiskBlockManager: Created local directory at /data/data21/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/blockmgr-aa8aeda8-de36-43a6-ba96-70239df63191
22/06/09 15:44:14 INFO DiskBlockManager: Created local directory at /data/data4/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/blockmgr-38f9ad20-b3a4-483b-982b-3d1f7b68e4f2
22/06/09 15:44:14 INFO DiskBlockManager: Created local directory at /data/data20/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/blockmgr-b475a0c9-a2c0-4bee-a596-0e85e3308a30
22/06/09 15:44:14 INFO DiskBlockManager: Created local directory at /data/data8/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/blockmgr-65a680c9-ecc9-4e90-82a6-36a055157be6
22/06/09 15:44:14 INFO DiskBlockManager: Created local directory at /data/data19/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/blockmgr-e13a71e2-64af-419a-9b93-b1a584059f72
22/06/09 15:44:14 INFO DiskBlockManager: Created local directory at /data/data1/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/blockmgr-b4d01c40-f3d4-4d3f-8d21-94f360f95150
22/06/09 15:44:14 INFO DiskBlockManager: Created local directory at /data/data2/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/blockmgr-967d84f6-8904-4fda-b731-1a9b746cf456
22/06/09 15:44:14 INFO DiskBlockManager: Created local directory at /data/data11/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/blockmgr-0eb470e9-0141-4ad4-8346-0d28e7b2983d
22/06/09 15:44:14 INFO DiskBlockManager: Created local directory at /data/data15/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/blockmgr-a5db8ce7-2e0a-45ae-b2d1-dc881cb468cd
22/06/09 15:44:14 INFO DiskBlockManager: Created local directory at /data/data10/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/blockmgr-982588a6-ab95-4c19-a543-6e8d14a69473
22/06/09 15:44:14 INFO DiskBlockManager: Created local directory at /data/data23/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/blockmgr-af4198c3-8ef6-45d0-8b28-ce7165ae21ce
22/06/09 15:44:14 INFO DiskBlockManager: Created local directory at /data/data9/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/blockmgr-69782a7c-7cfc-4dfe-bb7b-ed21d50d3787
22/06/09 15:44:14 INFO DiskBlockManager: Created local directory at /data/data18/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/blockmgr-fb55f62b-3f80-48a7-a0f1-dbc55bd73801
22/06/09 15:44:14 INFO DiskBlockManager: Created local directory at /data/data14/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/blockmgr-1f74275e-a559-4495-909b-93e8160091dd
22/06/09 15:44:14 INFO DiskBlockManager: Created local directory at /data/data12/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/blockmgr-670496a7-1b65-46c9-b81e-f1f79678bfd8
22/06/09 15:44:14 INFO MemoryStore: MemoryStore started with capacity 4.6 GB
22/06/09 15:44:14 WARN Configuration: __spark_hadoop_conf__.xml:an attempt to override final parameter: ipc.client.fallback-to-simple-auth-allowed;  Ignoring.
22/06/09 15:44:14 INFO SparkEnv: Registering OutputCommitCoordinator
22/06/09 15:44:15 INFO JettyUtils: Adding filter org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter to /jobs, /jobs/json, /jobs/job, /jobs/job/json, /stages, /stages/json, /stages/stage, /stages/stage/json, /stages/pool, /stages/pool/json, /storage, /storage/json, /storage/rdd, /storage/rdd/json, /environment, /environment/json, /executors, /executors/json, /executors/threadDump, /executors/threadDump/json, /static, /, /api, /jobs/job/kill, /stages/stage/kill.
22/06/09 15:44:15 INFO Utils: Successfully started service 'SparkUI' on port 17329.
22/06/09 15:44:15 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://cdh-bjht-3878:17329
22/06/09 15:44:15 INFO YarnClusterScheduler: Created YarnClusterScheduler
22/06/09 15:44:15 INFO SchedulerExtensionServices: Starting Yarn extension services with app application_1654741161919_19163 and attemptId Some(appattempt_1654741161919_19163_000001)
22/06/09 15:44:15 INFO YarnClusterScheduler: Starting speculative execution thread
22/06/09 15:44:15 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 18445.
22/06/09 15:44:15 INFO NettyBlockTransferService: Server created on cdh-bjht-3878:18445
22/06/09 15:44:15 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
22/06/09 15:44:15 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, cdh-bjht-3878, 18445, None)
22/06/09 15:44:15 INFO BlockManagerMasterEndpoint: Registering block manager cdh-bjht-3878:18445 with 4.6 GB RAM, BlockManagerId(driver, cdh-bjht-3878, 18445, None)
22/06/09 15:44:15 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, cdh-bjht-3878, 18445, None)
22/06/09 15:44:15 INFO BlockManager: external shuffle service port = 7337
22/06/09 15:44:15 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, cdh-bjht-3878, 18445, None)
22/06/09 15:44:15 INFO JettyUtils: Adding filter org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter to /metrics/json.
22/06/09 15:44:15 WARN Configuration: __spark_hadoop_conf__.xml:an attempt to override final parameter: ipc.client.fallback-to-simple-auth-allowed;  Ignoring.
22/06/09 15:44:15 INFO ZooKeeper: Client environment:zookeeper.version=3.4.6-1569965, built on 02/20/2014 09:09 GMT
22/06/09 15:44:15 INFO ZooKeeper: Client environment:host.name=cdh-bjht-3878
22/06/09 15:44:15 INFO ZooKeeper: Client environment:java.version=1.8.0_192
22/06/09 15:44:15 INFO ZooKeeper: Client environment:java.vendor=Oracle Corporation
22/06/09 15:44:15 INFO ZooKeeper: Client environment:java.home=/usr/local/paas-agent/Oracle-Linux-x86_64-1.8.0_192/jre
22/06/09 15:44:15 INFO ZooKeeper: Client environment:java.class.path=/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_conf__:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/commons-logging-1.1.3.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/jackson-xc-1.9.13.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/jtransforms-2.4.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/xercesImpl-2.9.1.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/spark-catalyst_2.11-2.4.0-tests.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/spark-core_2.11-2.4.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/machinist_2.11-0.6.1.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/hppc-0.7.2.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/hadoop-yarn-server-common-2.6.0-cdh5.14.4-20200402.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/breeze
-macros_2.11-0.13.2.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/hive-cli-1.2.1.spark2-20200623.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/libfb303-0.9.3.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/gson-2.2.4.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/htrace-core4-4.0.1-incubating.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/jackson-databind-2.6.7.1.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/arrow-format-0.10.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/jersey-container-servlet-2.22.2.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/calcite-linq4j-1.2.0-incubating.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/antlr-2.7.7.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/javax.servlet-api-3.1.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/hadoop-client-2.6.0-cdh5.14.4-20200402.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_
19163_01_000001/__spark_libs__/xz-1.5.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/hadoop-auth-2.6.0-cdh5.14.4-20200402.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/chill_2.11-0.9.3.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/calcite-core-1.2.0-incubating.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/hadoop-annotations-2.6.0-cdh5.14.4-20200402.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/shapeless_2.11-2.3.2.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/spire-macros_2.11-0.13.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/avro-mapred-1.8.2-hadoop2.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/chill-java-0.9.3.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/hadoop-mapreduce-client-common-2.6.0-cdh5.14.4-20200402.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/jersey-guava-2.22.2.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/datanucleus-core-3.2.10.jar:/data/data22/yarn/nm2/usercache/net_kg/appcach
e/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/calcite-avatica-1.2.0-incubating.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/javax.ws.rs-api-2.0.1.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/apacheds-kerberos-codec-2.0.0-M15.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/jdo-api-3.0.1.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/protobuf-java-2.5.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/hive-exec-1.2.1.spark2-20200623.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/hk2-locator-2.4.0-b34.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/spark-sketch_2.11-2.4.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/hadoop-aws-2.6.0-cdh5.14.4-20200402.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/guava-14.0.1.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/commons-net-3.1.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/jodd-core-3.5.2.jar:/data/data22/yarn/nm2/us
ercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/commons-configuration-1.6.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/jaxb-api-2.2.11.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/datanucleus-rdbms-3.2.9.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/log4j-1.2.17.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/bonecp-0.8.0.RELEASE.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/javolution-5.5.1.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/apacheds-i18n-2.0.0-M15.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/curator-client-2.6.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/ivy-2.4.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/jackson-module-paranamer-2.7.9.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/scala-compiler-2.11.12.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/spark-hive_2.11-2.4.0.jar:/data/data22/yarn/nm2/usercache/net_
kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/spark-hive-thriftserver_2.11-2.4.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/hadoop-mapreduce-client-core-2.6.0-cdh5.14.4-20200402.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/orc-core-1.5.5-nohive.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/javassist-3.18.1-GA.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/slf4j-log4j12-1.7.16.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/spark-graphx_2.11-2.4.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/spark-unsafe_2.11-2.4.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/commons-dbcp-1.4.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/orc-shims-1.5.5.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/commons-digester-1.8.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/netty-3.9.9.Final.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/metrics-jvm-3.1.5.jar:/data/
data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/hadoop-common-2.6.0-cdh5.14.4-20200402.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/spark-network-shuffle_2.11-2.4.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/javax.inject-2.4.0-b34.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/avro-1.8.2.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/spark-compaction-1.0.5.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/jackson-module-scala_2.11-2.6.7.1.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/antlr4-runtime-4.7.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/commons-beanutils-1.7.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/curator-framework-2.6.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/oro-2.0.8.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/datanucleus-api-jdo-3.2.6.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/antl
r-runtime-3.4.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/JavaEWAH-0.3.2.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/aopalliance-repackaged-2.4.0-b34.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/parquet-encoding-1.10.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/activation-1.1.1.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/core-1.1.2.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/spark-catalyst_2.11-2.4.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/hadoop-yarn-client-2.6.0-cdh5.14.4-20200402.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/kryo-shaded-4.0.2.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/jta-1.1.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/jsr305-1.3.9.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/pyrolite-4.13.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/hive-metastore-1.2.1.sp
ark2-20200623.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/json4s-core_2.11-3.5.3.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/hadoop-yarn-common-2.6.0-cdh5.14.4-20200402.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/commons-lang3-3.5.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/spark-sqlConformanceCheck-plugin-1.0.3.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/breeze_2.11-0.13.2.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/spark-authorizer-shim-1.0-SNAPSHOT.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/janino-3.0.9.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/hadoop-mapreduce-client-app-2.6.0-cdh5.14.4-20200402.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/commons-pool-1.5.4.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/flatbuffers-1.2.0-3f79e055.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/stream-2.7.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/cont
ainer_e4056_1654741161919_19163_01_000001/__spark_libs__/jackson-mapper-asl-1.9.13.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/arrow-memory-0.10.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/bdc-plugin-classloader-1.0.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/aircompressor-0.10.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/hadoop-ocsa-2.6.0-cdh5.14.4.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/RoaringBitmap-0.5.11.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/metrics-graphite-3.1.5.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/scala-xml_2.11-1.0.5.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/hk2-api-2.4.0-b34.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/parquet-common-1.10.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/py4j-0.10.7.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/hadoop-mapreduce-client-jobclient-2.6.0-cdh5.14.4-20200402.jar:/data/data22/yarn/nm2/usercache/net_kg/appcach
e/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/commons-lang-2.6.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/commons-compiler-3.0.9.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/scala-parser-combinators_2.11-1.1.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/macro-compat_2.11-1.1.1.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/lz4-java-1.4.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/osgi-resource-locator-1.0.1.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/aws-java-sdk-bundle-1.11.134.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/commons-math3-3.4.1.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/jersey-client-2.22.2.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/ST4-4.0.4.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/jackson-jaxrs-1.9.13.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/spark-kvstore_2.11-2.4.0.jar:/data/data22/yarn/nm2/usercache/net_k
g/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/scala-library-2.11.12.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/parquet-hadoop-bundle-1.6.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/aopalliance-1.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/parquet-format-2.4.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/super-csv-2.2.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/metrics-core-3.1.5.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/netty-all-4.1.17.Final.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/spark-tags_2.11-2.4.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/api-util-1.0.0-M20.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/arrow-vector-0.10.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/json4s-jackson_2.11-3.5.3.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/hive-beeline-1.2.1.spark2-20200623.jar:/data/data22/yarn/nm2/usercach
e/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/javax.annotation-api-1.2.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/commons-collections-3.2.2.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/spire_2.11-0.13.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/jersey-container-servlet-core-2.22.2.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/hive-jdbc-1.2.1.spark2-20200623.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/spark-mllib_2.11-2.4.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/jersey-media-jaxb-2.22.2.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/guice-servlet-3.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/commons-io-2.4.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/httpcore-4.4.10.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/commons-beanutils-core-1.8.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/opencsv-2.3.jar:/data/data22/y
arn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/metrics-json-3.1.5.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/slf4j-api-1.7.16.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/hadoop-hdfs-2.6.0-cdh5.14.4-20200402.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/minlog-1.3.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/spark-sql_2.11-2.4.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/commons-cli-1.2.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/avro-ipc-1.8.2.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/xbean-asm6-shaded-4.8.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/apache-log4j-extras-1.2.17.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/spark-mllib-local_2.11-2.4.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/jersey-common-2.22.2.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/stax-api-1.0.1.jar:/data/data22/yarn/nm2/
usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/parquet-hadoop-1.10.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/eigenbase-properties-1.1.5.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/libthrift-0.9.3.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/snappy-0.2.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/zstd-jni-1.3.2-2.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/hudi-spark-bundle_2.11-0.6.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/jline-2.14.6.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/scala-reflect-2.11.12.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/spark-launcher_2.11-2.4.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/hadoop-mapreduce-client-shuffle-2.6.0-cdh5.14.4-20200402.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/parquet-jackson-1.10.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/hadoop-yarn-api-2.6.0-c
dh5.14.4-20200402.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/jersey-server-2.22.2.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/jetty-util-6.1.26.cloudera.4.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/jetty-6.1.26.cloudera.4.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/jcl-over-slf4j-1.7.16.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/hk2-utils-2.4.0-b34.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/leveldbjni-all-1.8.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/validation-api-1.1.0.Final.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/httpclient-4.5.6.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/parquet-column-1.10.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/jpam-1.1.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/zookeeper-3.4.6.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/commons-codec-1.
10.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/commons-httpclient-3.1.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/univocity-parsers-2.7.3.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/jackson-core-asl-1.9.13.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/joda-time-2.9.3.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/json4s-ast_2.11-3.5.3.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/paranamer-2.8.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/hadoop-yarn-server-web-proxy-2.6.0-cdh5.14.4-20200402.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/guice-3.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/jul-to-slf4j-1.7.16.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/spark-network-common_2.11-2.4.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/jackson-core-2.6.7.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__
/snappy-java-1.1.7.1.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/spark-yarn_2.11-2.4.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/derby-10.12.1.1.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/spark-repl_2.11-2.4.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/javax.inject-1.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/api-asn1-api-1.0.0-M20.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/objenesis-2.5.1.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/compress-lzf-1.0.3.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/curator-recipes-2.6.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/arpack_combined_all-0.1.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/commons-compress-1.8.1.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/orc-mapreduce-1.5.5-nohive.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/stringtemp
late-3.2.1.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/spark-avro_2.11-2.4.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/xmlenc-0.52.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/json4s-scalap_2.11-3.5.3.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/commons-crypto-1.0.0.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/jackson-annotations-2.6.7.jar:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_libs__/spark-streaming_2.11-2.4.0.jar:/etc/hadoop/conf.cloudera.yarn:/usr/local/yarn/nodemanager/etc/hadoop:/usr/local/yarn/nodemanager/share/hadoop/common/hadoop-nfs-2.6.0-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/common/hadoop-common-2.6.0-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/common/hadoop-common-2.6.0-cdh5.14.4-tests.jar:/usr/local/yarn/nodemanager/share/hadoop/common/hadoop-lzo-0.4.15-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/commons-logging-1.1.3.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/mockito-all-1.8.5.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/commons-el-1.0.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/curator-recipes-2.7.1.jar:/usr/local/yarn/nodemanager/share/hadoop/common
/lib/log4j-1.2.17.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/junit-4.11.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/jetty-util-6.1.26.cloudera.4.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/httpcore-4.2.5.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/jetty-6.1.26.cloudera.4.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/netty-3.10.5.Final.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/httpclient-4.2.5.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/asm-3.2.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/commons-httpclient-3.1.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/paranamer-2.3.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/commons-compress-1.4.1.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/hadoop-annotations-2.6.0-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/commons-cli-1.2.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/commons-collections-3.2.2.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/stax-api-1.0-2.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/xmlenc-0.52.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/commons-lang-2.6.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/hamcrest-core-1.3.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/curator-client-2.7.1.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/zookeeper-3.4.5-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/commons-io-2.4.jar:/usr/local/yar
n/nodemanager/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/commons-digester-1.8.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/commons-net-3.1.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/avro-1.7.6-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/jersey-json-1.9.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/jsp-api-2.1.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/commons-configuration-1.6.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/jersey-server-1.9.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/commons-beanutils-1.9.2.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/commons-math3-3.1.1.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/gson-2.2.4.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/jsch-0.1.42.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/curator-framework-2.7.1.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/hadoop-lzo-0.4.15-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/hadoop-auth-2.6.0-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/servlet-api-2.5.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/logredactor-1.0.3.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/commons-codec-1.4.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/htrace-core4-4.0.1-incubating.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/jets3t-0.9.0.jar:/usr/local/yarn/nodemanager/share/h
adoop/common/lib/guava-11.0.2.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/jettison-1.1.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/activation-1.1.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/jersey-core-1.9.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/xz-1.0.jar:/usr/local/yarn/nodemanager/share/hadoop/common/lib/jsr305-3.0.0.jar:/usr/local/yarn/nodemanager/share/hadoop/hdfs/hadoop-hdfs-nfs-2.6.0-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/hdfs/hadoop-hdfs-2.6.0-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/hdfs/hadoop-hdfs-2.6.0-cdh5.14.4-tests.jar:/usr/local/yarn/nodemanager/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/usr/local/yarn/nodemanager/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/local/yarn/nodemanager/share/hadoop/hdfs/lib/commons-el-1.0.jar:/usr/local/yarn/nodemanager/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/usr/local/yarn/nodemanager/share/hadoop/hdfs/lib/jetty-util-6.1.26.cloudera.4.jar:/usr/local/yarn/nodemanager/share/hadoop/hdfs/lib/jetty-6.1.26.cloudera.4.jar:/usr/local/yarn/nodemanager/share/hadoop/hdfs/lib/netty-3.10.5.Final.jar:/usr/local/yarn/nodemanager/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/usr/local/yarn/nodemanager/share/hadoop/hdfs/lib/asm-3.2.jar:/usr/local/yarn/nodemanager/share/hadoop/hdfs/lib/hdfs-auth-shim-1.0.0-SNAPSHOT.jar:/usr/local/yarn/nodemanager/share/hadoop/hdfs/lib/bdc-plugin-classloader-1.0.0.jar:/usr/local/yarn/nodemanager/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/usr/local/yarn/nodemanager/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/usr/local/yarn/nodemanager/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/usr/local/yarn/nodemanager/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/usr/local/yarn/nodemanager/share/hadoop/hdfs/lib/commons-io-2.4.jar:/usr/local/yarn/nodemanager/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/usr/local/yarn/nodemanager/share/hadoop/hdfs/lib/common-auth-shim-1.0.0-SNAPSHOT.jar:/usr/local/yarn/nodemanager/share/hadoop/hdfs/lib/jas
per-runtime-5.5.23.jar:/usr/local/yarn/nodemanager/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/usr/local/yarn/nodemanager/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/usr/local/yarn/nodemanager/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/usr/local/yarn/nodemanager/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/usr/local/yarn/nodemanager/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/yarn/nodemanager/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/usr/local/yarn/nodemanager/share/hadoop/hdfs/lib/htrace-core4-4.0.1-incubating.jar:/usr/local/yarn/nodemanager/share/hadoop/hdfs/lib/guava-11.0.2.jar:/usr/local/yarn/nodemanager/share/hadoop/hdfs/lib/leveldbjni-all-1.8.jar:/usr/local/yarn/nodemanager/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/usr/local/yarn/nodemanager/share/hadoop/hdfs/lib/jsr305-3.0.0.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.6.0-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/hadoop-yarn-registry-2.6.0-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/hadoop-yarn-common-2.6.0-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/spark-2.4.0-yarn-shuffle.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/hadoop-yarn-api-2.6.0-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/hadoop-ocsa-2.6.0-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/hadoop-yarn-server-common-2.6.0-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.6.0-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/spark_yarn_shuffle-shade-3.1.2.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.6.0-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/aws-java-sdk-bundle-1.11.134.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/hadoop-yarn-aggregate-2.6.0-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/hadoop-yarn-server-tests-2.6.0-cdh5.14.4.jar:/usr/local/yarn/nodemanager/sha
re/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.6.0-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.6.0-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.6.0-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/hadoop-yarn-client-2.6.0-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/jersey-client-1.9.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/curator-recipes-2.7.1.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/log4j-1.2.17.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/jetty-util-6.1.26.cloudera.4.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/jetty-6.1.26.cloudera.4.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/asm-3.2.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/bdc-plugin-classloader-1.0.0.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/commons-cli-1.2.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/commons-collections-3.2.2.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/guice-3.0.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/commons-lang-2.6.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/curator-client-2.7.1.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/zookeeper-3.4.5-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/commons-io-2.4.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/jersey-json-1.9.jar:/usr/local/ya
rn/nodemanager/share/hadoop/yarn/lib/common-auth-shim-1.0.0-SNAPSHOT.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/jersey-server-1.9.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/curator-x-discovery-2.7.1.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/curator-framework-2.7.1.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/yarn-auth-shim-1.0.0-SNAPSHOT.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/jline-2.11.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/servlet-api-2.5.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/javax.inject-1.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/commons-codec-1.4.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/guava-11.0.2.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/jettison-1.1.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/activation-1.1.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/jersey-core-1.9.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/xz-1.0.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/jsr305-3.0.0.jar:/usr/local/yarn/nodemanager/share/hadoop/yarn/lib/aopalliance-1.0.jar:/usr/local/yarn/nodemanager/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.6.0-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-cdh5.14.4-tests.jar:/usr/local/yarn/nodemanager/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.6.0-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.6.0-cdh5.14.4.jar:/usr/loc
al/yarn/nodemanager/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.6.0-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/mapreduce/hadoop-mapreduce-client-nativetask-2.6.0-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.6.0-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.6.0-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/local/yarn/nodemanager/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/usr/local/yarn/nodemanager/share/hadoop/mapreduce/lib/junit-4.11.jar:/usr/local/yarn/nodemanager/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/usr/local/yarn/nodemanager/share/hadoop/mapreduce/lib/netty-3.10.5.Final.jar:/usr/local/yarn/nodemanager/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/usr/local/yarn/nodemanager/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/local/yarn/nodemanager/share/hadoop/mapreduce/lib/asm-3.2.jar:/usr/local/yarn/nodemanager/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/usr/local/yarn/nodemanager/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/usr/local/yarn/nodemanager/share/hadoop/mapreduce/lib/hadoop-annotations-2.6.0-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/mapreduce/lib/guice-3.0.jar:/usr/local/yarn/nodemanager/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/usr/local/yarn/nodemanager/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/usr/local/yarn/nodemanager/share/hadoop/mapreduce/lib/avro-1.7.6-cdh5.14.4.jar:/usr/local/yarn/nodemanager/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/usr/local/yarn/nodemanager/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/usr/local/yarn/nodemanager/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/us
r/local/yarn/nodemanager/share/hadoop/mapreduce/lib/javax.inject-1.jar:/usr/local/yarn/nodemanager/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/usr/local/yarn/nodemanager/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/usr/local/yarn/nodemanager/share/hadoop/mapreduce/lib/xz-1.0.jar:/usr/local/yarn/nodemanager/share/hadoop/mapreduce/lib/aopalliance-1.0.jar::/opt/cloudera/parcels/GPLEXTRAS/lib/hadoop/lib/hadoop-lzo-0.4.15-cdh5.14.4.jar:/opt/cloudera/parcels/GPLEXTRAS/lib/hadoop/lib/hadoop-lzo.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/java-xmlbuilder-0.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/api-asn1-api-1.0.0-M20.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/mockito-all-1.8.5.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/commons-el-1.0.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/curator-recipes-2.7.1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/zookeeper.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/log4j-1.2.17.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/junit-4.11.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/jetty-util-6.1.26.cloudera.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/httpcore-4.2.5.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/avro.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/jetty-6.1.
26.cloudera.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/netty-3.10.5.Final.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/api-util-1.0.0-M20.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/snappy-java-1.0.4.1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/httpclient-4.2.5.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/slf4j-api-1.7.5.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/asm-3.2.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/azure-data-lake-store-sdk-2.2.9.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/commons-httpclient-3.1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/paranamer-2.3.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/commons-compress-1.4.1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/stax-api-1.0-2.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/hadoop-ocsa-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/xmlenc-0.52.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/hamcrest-core-1.3.jar:/opt/cloudera/parce
ls/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/curator-client-2.7.1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/jaxb-api-2.2.2.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/commons-io-2.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/commons-digester-1.8.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/commons-net-3.1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/jersey-json-1.9.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/jsp-api-2.1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/jasper-runtime-5.5.23.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/commons-configuration-1.6.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/jersey-server-1.9.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/commons-beanutils-1.9.2.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/commons-math3-3.1.1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/gson-2.2.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/hue-plugins-3.9.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/jsch-0.1.42.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/curator-framework-2.7.1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p
0.3/lib/hadoop/libexec/../../hadoop/lib/slf4j-log4j12.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/commons-beanutils-core-1.8.0.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/aws-java-sdk-bundle-1.11.134.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/jasper-compiler-5.5.23.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/servlet-api-2.5.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/logredactor-1.0.3.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/commons-codec-1.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/jackson-jaxrs-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/jackson-xc-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/htrace-core4-4.0.1-incubating.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/jets3t-0.9.0.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/guava-11.0.2.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/jettison-1.1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/activation-1.1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/jersey-core-1.9.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/xz-1.0.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-cascading.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-pig.jar:/opt
/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-jackson.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//hadoop-auth.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//hadoop-nfs-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//hadoop-annotations.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-hadoop-bundle.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-test-hadoop2.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//hadoop-nfs.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//hadoop-common-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-format-sources.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//hadoop-azure-datalake-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-tools.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//hadoop-annotations-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-thrift.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-format.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-common.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-pig-bundle.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//hadoop-azure-datalake.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-column.jar:/opt/cloudera/parcels/CDH-5.14.4-
1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//hadoop-common.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//hadoop-common-2.6.0-cdh5.14.4-tests.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//hadoop-common-tests.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-format-javadoc.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-scrooge_2.10.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-encoding.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//hadoop-auth-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-hadoop.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//hadoop-aws-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-avro.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-generator.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-scala_2.10.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//hadoop-aws.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//parquet-protobuf.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/./:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/commons-el-1.0.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/log4j-1.2.17.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-h
dfs/lib/jetty-util-6.1.26.cloudera.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/jetty-6.1.26.cloudera.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/netty-3.10.5.Final.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/asm-3.2.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/xml-apis-1.3.04.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/xmlenc-0.52.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/commons-io-2.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/jackson-mapper-asl-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/jackson-core-asl-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/jsp-api-2.1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/jasper-runtime-5.5.23.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/jersey-server-1.9.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/servlet-api-2.5.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14
.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/commons-codec-1.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/htrace-core4-4.0.1-incubating.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/guava-11.0.2.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/jersey-core-1.9.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/.//hadoop-hdfs-nfs-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/.//hadoop-hdfs-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/.//hadoop-hdfs.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/.//hadoop-hdfs-tests.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/.//hadoop-hdfs-2.6.0-cdh5.14.4-tests.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/jersey-client-1.9.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/zookeeper.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/log4j-1.2.17.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/jetty-util-6.1.26.cloudera.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/j
ersey-guice-1.9.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/jetty-6.1.26.cloudera.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/spark-2.4.0-yarn-shuffle.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/asm-3.2.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/commons-compress-1.4.1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/stax-api-1.0-2.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/guice-3.0.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/jaxb-api-2.2.2.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/commons-io-2.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/jackson-mapper-asl-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/jackson-core-asl-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/jersey-json-1.9.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/jersey-server-1.9.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/l
ib/hadoop/libexec/../../hadoop-yarn/lib/guice-servlet-3.0.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/jline-2.11.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/servlet-api-2.5.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/javax.inject-1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/commons-codec-1.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/jackson-jaxrs-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/jackson-xc-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/guava-11.0.2.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/jettison-1.1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/leveldbjni-all-1.8.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/activation-1.1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/jersey-core-1.9.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/xz-1.0.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/jsr305-3.0.0.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/aopalliance-1.0.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-server-resourcemanager-2.6.0-cdh5.14
.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-registry-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-common-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-server-common.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-api-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-registry.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-common.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-server-common-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-server-nodemanager-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//spark_yarn_shuffle-shade-3.1.2.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-applications-distributedshell-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-client.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-server-tests.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoo
p-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-server-tests-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-server-web-proxy-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-client-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//hadoop-yarn-api.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/lib/junit-4.11.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/lib/avro.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/lib/netty-3.10.5.Final.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/lib/asm-3.2.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/lib/guice-3.0.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/lib/hamcrest-cor
e-1.3.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/lib/jackson-core-asl-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/lib/javax.inject-1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/lib/xz-1.0.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-azure-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//java-xmlbuilder-0.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//api-asn1-api-1.0.0-M20.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//commons-logging-1.1.3.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//okio-1.4.0.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//mockito-all-1.8.5.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-archive-logs-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-mapreduce-c
lient-jobclient.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-datajoin-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//commons-el-1.0.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//apacheds-i18n-2.0.0-M15.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//curator-recipes-2.7.1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//zookeeper.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//log4j-1.2.17.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//junit-4.11.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-ant-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-auth.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//microsoft-windowsazure-storage-sdk-0.6.0.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//jetty-util-6.1.26.cloudera.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//httpcore-4.2.5.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-rumen-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//avro.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-sls-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-streaming.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-gridmix-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-2.6.0-cdh5.14.4-tests.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/
lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-ant.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//jetty-6.1.26.cloudera.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//api-util-1.0.0-M20.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//jackson-annotations-2.2.3.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//protobuf-java-2.5.0.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-archive-logs.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//snappy-java-1.0.4.1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//httpclient-4.2.5.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-extras.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//asm-3.2.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-sls.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//commons-httpclient-3.1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//paranamer-2.3.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//commons-compress-1.4.1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//jackson-core-2.2.3.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-streaming-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//commons-cli-1.2.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14
.4.p0.3/lib/hadoop-mapreduce/.//commons-collections-3.2.2.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//stax-api-1.0-2.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//xmlenc-0.52.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//jackson-databind-2.2.3.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//commons-lang-2.6.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-gridmix.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hamcrest-core-1.3.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-distcp.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//curator-client-2.7.1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//apacheds-kerberos-codec-2.0.0-M15.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-archives.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//jaxb-api-2.2.2.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//commons-io-2.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//jackson-mapper-asl-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//commons-digester-1.8.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//commons-net-3.1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//okhttp-2.4.0.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//jackson-core-asl-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/
lib/hadoop-mapreduce/.//jersey-json-1.9.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-distcp-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//jsp-api-2.1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-rumen.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//jasper-runtime-5.5.23.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//commons-configuration-1.6.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//jaxb-impl-2.2.3-1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//jersey-server-1.9.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//commons-beanutils-1.9.2.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//commons-math3-3.1.1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//gson-2.2.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//metrics-core-3.0.2.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//jsch-0.1.42.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//curator-framework-2.7.1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//commons-beanutils-core-1.8.0.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-auth-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/opt/cloudera/parcels/
CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-archives-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//jasper-compiler-5.5.23.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//servlet-api-2.5.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//commons-codec-1.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//jackson-jaxrs-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//jackson-xc-1.8.8.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//htrace-core4-4.0.1-incubating.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-azure.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//jets3t-0.9.0.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//guava-11.0.2.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-datajoin.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-openstack.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//jettison-1.1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/
.//activation-1.1.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-extras-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//hadoop-openstack-2.6.0-cdh5.14.4.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//jersey-core-1.9.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//xz-1.0.jar:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//jsr305-3.0.0.jar:/opt/cloudera/parcels/GPLEXTRAS-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/lib/hadoop-lzo-0.4.15-cdh5.14.4.jar:/opt/cloudera/parcels/GPLEXTRAS-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/lib/hadoop-lzo.jar:/usr/local/share/spark/bin/../native/lib/*:/usr/local/share/spark/bin/../native/lib/native/*:::/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__spark_conf__/__hadoop_conf__
22/06/09 15:44:15 INFO ZooKeeper: Client environment:java.library.path=/opt/cloudera/parcels/GPLEXTRAS-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/lib/native:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/lib/native:/opt/cloudera/parcels/GPLEXTRAS-5.11.2-1.cdh5.11.2.p0.4/lib/hadoop/lib/native:/opt/cloudera/parcels/CDH-5.11.2-1.cdh5.11.2.p0.4/lib/hadoop/lib/native::/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
22/06/09 15:44:15 INFO ZooKeeper: Client environment:java.io.tmpdir=/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/tmp
22/06/09 15:44:15 INFO ZooKeeper: Client environment:java.compiler=<NA>
22/06/09 15:44:15 INFO ZooKeeper: Client environment:os.name=Linux
22/06/09 15:44:15 INFO ZooKeeper: Client environment:os.arch=amd64
22/06/09 15:44:15 INFO ZooKeeper: Client environment:os.version=3.10.0-957.el7.x86_64
22/06/09 15:44:15 INFO ZooKeeper: Client environment:user.name=yarn
22/06/09 15:44:15 INFO ZooKeeper: Client environment:user.home=/var/lib/hadoop-yarn
22/06/09 15:44:15 INFO ZooKeeper: Client environment:user.dir=/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001
22/06/09 15:44:15 INFO ZooKeeper: Initiating client connection, connectString=bjht1287:2181,bjht1330:2181,bjht1370:2181,bjht1393:2181,bjht1453:2181 sessionTimeout=60000 watcher=org.apache.hadoop.conf.DistributedConfigHelper$ZooKeeperHolder$DummyWatcher@5b5fdf54
22/06/09 15:44:15 INFO ClientCnxn: Opening socket connection to server bjht1330/10.35.54.25:2181. Will not attempt to authenticate using SASL (unknown error)
22/06/09 15:44:15 INFO ClientCnxn: Socket connection established to bjht1330/10.35.54.25:2181, initiating session
22/06/09 15:44:15 INFO ClientCnxn: Session establishment complete on server bjht1330/10.35.54.25:2181, sessionid = 0xff8118bfdf3c261f, negotiated timeout = 60000
22/06/09 15:44:15 INFO DistributedConfigHelper: not exists  all special local resources,change mode from LOCAL to MIX,path:/tmp/dist-config/logs-hdfs
22/06/09 15:44:15 WARN DistributedConfigHelper: zk resouces timestamp is defective,missing:[core-site]
22/06/09 15:44:15 INFO DistributedConfigHelper: fetch local resource:hdfs-site.xml
22/06/09 15:44:15 INFO DistributedConfigHelper: load resources:[hdfs-site, core-site],load mode:MIX
22/06/09 15:44:15 INFO ZooKeeper: Session: 0xff8118bfdf3c261f closed
22/06/09 15:44:15 INFO ClientCnxn: EventThread shut down
22/06/09 15:44:15 WARN ClientContext: Existing client context 'default' does not match requested configuration.  Existing: shortCircuitStreamsCacheSize = 256, shortCircuitStreamsCacheExpiryMs = 300000, shortCircuitMmapCacheSize = 256, shortCircuitMmapCacheExpiryMs = 3600000, shortCircuitMmapCacheRetryTimeout = 300000, shortCircuitCacheStaleThresholdMs = 1800000, socketCacheCapacity = 16, socketCacheExpiry = 3000, shortCircuitLocalReads = true, useLegacyBlockReaderLocal = false, domainSocketDataTraffic = false, shortCircuitSharedMemoryWatcherInterruptCheckMs = 60000, keyProviderCacheExpiryMs = 864000000, domainSocketDisableIntervalSeconds = 1, Requested: shortCircuitStreamsCacheSize = 256, shortCircuitStreamsCacheExpiryMs = 300000, shortCircuitMmapCacheSize = 256, shortCircuitMmapCacheExpiryMs = 3600000, shortCircuitMmapCacheRetryTimeout = 300000, shortCircuitCacheStaleThresholdMs = 1800000, socketCacheCapacity = 16, socketCacheExpiry = 3000, shortCircuitLocalReads = false, useLegacyBlockReaderLocal = false, domainSocketDataTraffic = false, shortCircuitSharedMemoryWatcherInterruptCheckMs = 60000, keyProviderCacheExpiryMs = 864000000, domainSocketDisableIntervalSeconds = 1
22/06/09 15:44:15 INFO EventLoggingListener: Logging events to hdfs://logs-hdfs/app/spark/jhs_2/v_2_x/application_1654741161919_19163_1
22/06/09 15:44:16 WARN Configuration: __spark_hadoop_conf__.xml:an attempt to override final parameter: ipc.client.fallback-to-simple-auth-allowed;  Ignoring.
22/06/09 15:44:16 INFO YarnRMClient: Registering the ApplicationMaster
22/06/09 15:44:16 INFO ApplicationMaster: 
===============================================================================
YARN executor launch context:
  env:
    CLASSPATH -> {{PWD}}<CPS>{{PWD}}/__spark_conf__<CPS>{{PWD}}/__spark_libs__/*<CPS>$HADOOP_CLIENT_CONF_DIR<CPS>$HADOOP_CONF_DIR<CPS>$HADOOP_COMMON_HOME/*<CPS>$HADOOP_COMMON_HOME/lib/*<CPS>$HADOOP_HDFS_HOME/*<CPS>$HADOOP_HDFS_HOME/lib/*<CPS>$HADOOP_YARN_HOME/*<CPS>$HADOOP_YARN_HOME/lib/*<CPS>$HADOOP_MAPRED_HOME/*<CPS>$HADOOP_MAPRED_HOME/lib/*<CPS>$MR2_CLASSPATH<CPS>/opt/cloudera/parcels/GPLEXTRAS/lib/hadoop/lib/*<CPS>/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/*:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//*:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/./:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/*:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/.//*:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/*:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//*:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/lib/*:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//*:/opt/cloudera/parcels/GPLEXTRAS-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/lib/*:/usr/local/share/spark/bin/../native/lib/*:/usr/local/share/spark/bin/../native/lib/native/*::<CPS>{{PWD}}/__spark_conf__/__hadoop_conf__
    SPARK_DIST_CLASSPATH -> /opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/lib/*:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop/.//*:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/./:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/lib/*:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-hdfs/.//*:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/lib/*:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/libexec/../../hadoop-yarn/.//*:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/lib/*:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop-mapreduce/.//*:/opt/cloudera/parcels/GPLEXTRAS-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/lib/*:/usr/local/share/spark/bin/../native/lib/*:/usr/local/share/spark/bin/../native/lib/native/*::
    SPARK_YARN_STAGING_DIR -> hdfs://alg-hdfs/user/net_kg/.sparkStaging/application_1654741161919_19163
    SPARK_USER -> net_kg

  command:
    LD_LIBRARY_PATH=\"/opt/cloudera/parcels/GPLEXTRAS-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/lib/native:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/lib/native:/opt/cloudera/parcels/GPLEXTRAS-5.11.2-1.cdh5.11.2.p0.4/lib/hadoop/lib/native:/opt/cloudera/parcels/CDH-5.11.2-1.cdh5.11.2.p0.4/lib/hadoop/lib/native:$LD_LIBRARY_PATH\" \ 
      {{JAVA_HOME}}/bin/java \ 
      -server \ 
      -Xmx8192m \ 
      '-XX:MaxDirectMemorySize=7372m' \ 
      -Djava.io.tmpdir={{PWD}}/tmp \ 
      '-Dspark.network.timeout=300s' \ 
      '-Dspark.driver.port=17911' \ 
      '-Dspark.port.maxRetries=2' \ 
      '-Dspark.ui.port=0' \ 
      -Dspark.yarn.app.container.log.dir=<LOG_DIR> \ 
      -XX:OnOutOfMemoryError='kill %p' \ 
      org.apache.spark.executor.CoarseGrainedExecutorBackend \ 
      --driver-url \ 
      spark://CoarseGrainedScheduler@cdh-bjht-3878:17911 \ 
      --executor-id \ 
      <executorId> \ 
      --hostname \ 
      <hostname> \ 
      --cores \ 
      24 \ 
      --app-id \ 
      application_1654741161919_19163 \ 
      --user-class-path \ 
      file:$PWD/__app__.jar \ 
      1><LOG_DIR>/stdout \ 
      2><LOG_DIR>/stderr

  resources:
    __app__.jar -> resource { scheme: "hdfs" host: "alg-hdfs" port: -1 file: "/user/net_kg/.sparkStaging/application_1654741161919_19163/nebula-exchange-2.6.3-sst.jar" } size: 205183999 timestamp: 1654760527140 type: FILE visibility: PRIVATE
    __spark_libs__ -> resource { scheme: "hdfs" host: "alg-hdfs" port: -1 file: "/app/spark/archive/spark-2.4.0-bin-hadoop2.6.zip" } size: 478437403 timestamp: 1609322203093 type: ARCHIVE visibility: PUBLIC
    Thing.conf -> resource { scheme: "hdfs" host: "alg-hdfs" port: -1 file: "/user/net_kg/.sparkStaging/application_1654741161919_19163/Thing.conf" } size: 3430 timestamp: 1654760527231 type: FILE visibility: PRIVATE
    __spark_conf__ -> resource { scheme: "hdfs" host: "alg-hdfs" port: -1 file: "/user/net_kg/.sparkStaging/application_1654741161919_19163/__spark_conf__.zip" } size: 366316 timestamp: 1654760527459 type: ARCHIVE visibility: PRIVATE

===============================================================================
22/06/09 15:44:16 INFO YarnAllocator: Will request 24 executor container(s), each with 24 core(s) and 16384 MB memory (including 8192 MB of overhead)
22/06/09 15:44:16 INFO YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(spark://YarnAM@cdh-bjht-3878:17911)
22/06/09 15:44:16 INFO YarnAllocator: Submitted 24 unlocalized container requests.
22/06/09 15:44:16 INFO ApplicationMaster: Started progress reporter thread with (heartbeat : 3000, initial allocation : 200) intervals
22/06/09 15:44:45 INFO YarnClusterSchedulerBackend: SchedulerBackend is ready for scheduling beginning after waiting maxRegisteredResourcesWaitingTime: 30000(ms)
22/06/09 15:44:45 INFO YarnClusterScheduler: YarnClusterScheduler.postStartHook done
22/06/09 15:44:46 INFO Exchange$: Processing Tag Thing
22/06/09 15:44:46 INFO Exchange$: field keys: thing_name, thing_title, thing_namech, thing_nameen, thing_abbreviation, thing_tag, thing_alias, thing_abstract, thing_image, thing_video, thing_audio, thing_gmtcreated, thing_gmtmodified, thing_popularity, thing_prior, thing_datasource, thing_urls, thing_class, thing_imagejson, thing_embedding, thing_sourceids, thing_videocover, thing_englishname, thing_japanname, thing_koreaname, thing_category, thing_thumbnails, thing_pictures, thing_additionalinfo, thing_subtitle, thing_edition
22/06/09 15:44:46 INFO Exchange$: nebula keys: Thing_name, Thing_title, Thing_nameCh, Thing_nameEn, Thing_abbreviation, Thing_tag, Thing_alias, Thing_abstract, Thing_image, Thing_video, Thing_audio, Thing_gmtCreated, Thing_gmtModified, Thing_popularity, Thing_prior, Thing_dataSource, Thing_urls, Thing_class, Thing_imageJson, Thing_embedding, Thing_sourceIds, Thing_videoCover, Thing_englishName, Thing_japanName, Thing_koreaName, Thing_category, Thing_thumbnails, Thing_pictures, Thing_additionalInfo, Thing_subTitle, Thing_edition
22/06/09 15:44:46 INFO Exchange$: Loading from Hive and exec select thing_id, thing_name, thing_title, thing_namech, thing_nameen, thing_abbreviation, thing_tag, thing_alias, thing_abstract, thing_image, thing_video, thing_audio, thing_gmtcreated, thing_gmtmodified, thing_popularity, thing_prior, thing_datasource, thing_urls, thing_class, thing_imagejson, thing_embedding, thing_sourceids, thing_videocover, thing_englishname, thing_japanname, thing_koreaname, thing_category, thing_thumbnails, thing_pictures, thing_additionalinfo, thing_subtitle, thing_edition from oppo_kg_dw.dwd_kg_release_spo_thing_1_v3_55_df where  data_version='v1.1.20211205.alpha' and ds = '20220521'
22/06/09 15:44:46 INFO SharedState: loading hive config file: file:/data/data2/yarn/nm2/usercache/net_kg/filecache/11/__spark_conf__.zip/__hadoop_conf__/hive-site.xml
22/06/09 15:44:46 WARN Configuration: __spark_hadoop_conf__.xml:an attempt to override final parameter: ipc.client.fallback-to-simple-auth-allowed;  Ignoring.
22/06/09 15:44:46 INFO SharedState: spark.sql.warehouse.dir is not set, but hive.metastore.warehouse.dir is set. Setting spark.sql.warehouse.dir to the value of hive.metastore.warehouse.dir ('/user/hive/dc-warehouse').
22/06/09 15:44:46 INFO SharedState: Warehouse path is '/user/hive/dc-warehouse'.
22/06/09 15:44:46 INFO JettyUtils: Adding filter org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter to /SQL.
22/06/09 15:44:46 INFO JettyUtils: Adding filter org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter to /SQL/json.
22/06/09 15:44:46 INFO JettyUtils: Adding filter org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter to /SQL/execution.
22/06/09 15:44:46 INFO JettyUtils: Adding filter org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter to /SQL/execution/json.
22/06/09 15:44:46 INFO JettyUtils: Adding filter org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter to /static/sql.
22/06/09 15:44:47 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
22/06/09 15:44:47 WARN AuthorizerCheckRule: application id is application_1654741161919_19163
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/data/data6/yarn/nm2/filecache/308/spark-2.4.0-bin-hadoop2.6.zip/bdc-authorizer-plugin-impl/logback-classic-1.2.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/data/data6/yarn/nm2/filecache/308/spark-2.4.0-bin-hadoop2.6.zip/bdc-authorizer-plugin-impl/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/data/data6/yarn/nm2/filecache/308/spark-2.4.0-bin-hadoop2.6.zip/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop-20220330112300-149/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/jars/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/__app__.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [ch.qos.logback.classic.util.ContextSelectorStaticBinder]
22/06/09 15:44:48 INFO LoggerFactory: using logger: org.apache.dubbo.common.logger.log4j.Log4jLoggerAdapter
22/06/09 15:44:48 INFO ServiceAnnotationBeanPostProcessor:  [DUBBO] BeanNameGenerator bean can't be found in BeanFactory with name [org.springframework.context.annotation.internalConfigurationBeanNameGenerator], dubbo version: 2.7.0, current host: 10.34.78.20
22/06/09 15:44:48 INFO ServiceAnnotationBeanPostProcessor:  [DUBBO] BeanNameGenerator will be a instance of org.springframework.context.annotation.AnnotationBeanNameGenerator , it maybe a potential problem on bean name generation., dubbo version: 2.7.0, current host: 10.34.78.20
22/06/09 15:44:48 WARN ServiceAnnotationBeanPostProcessor:  [DUBBO] No Spring Bean annotating Dubbo's @Service was found under package[com.oppo.bdp.unified.authority.sdk.service], dubbo version: 2.7.0, current host: 10.34.78.20
22/06/09 15:44:48 WARN SpringExtensionFactory:  [DUBBO] No spring extension (bean) named:level, try to find an extension (bean) of type org.apache.dubbo.common.logger.Level, dubbo version: 2.7.0, current host: 10.34.78.20
22/06/09 15:44:48 WARN SpringExtensionFactory:  [DUBBO] No spring extension (bean) named:level, type:org.apache.dubbo.common.logger.Level found, stop get bean., dubbo version: 2.7.0, current host: 10.34.78.20
22/06/09 15:44:48 WARN SpringExtensionFactory:  [DUBBO] No spring extension (bean) named:file, try to find an extension (bean) of type java.io.File, dubbo version: 2.7.0, current host: 10.34.78.20
22/06/09 15:44:48 WARN SpringExtensionFactory:  [DUBBO] No spring extension (bean) named:file, type:java.io.File found, stop get bean., dubbo version: 2.7.0, current host: 10.34.78.20
22/06/09 15:44:50 INFO HiveUtils: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
22/06/09 15:44:51 INFO metastore: Trying to connect to metastore with URI thrift://bj2665:9083
22/06/09 15:44:51 INFO metastore: Connected to metastore.
22/06/09 15:44:57 INFO SessionState: Created local directory: /data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/tmp/yarn
22/06/09 15:44:57 INFO SessionState: Created local directory: /data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/tmp/048a3660-3923-498f-84e7-324ee290376f_resources
22/06/09 15:44:57 INFO SessionState: Created HDFS directory: /tmp/hive/net_kg/048a3660-3923-498f-84e7-324ee290376f
22/06/09 15:44:57 INFO SessionState: Created local directory: /data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/container_e4056_1654741161919_19163_01_000001/tmp/yarn/048a3660-3923-498f-84e7-324ee290376f
22/06/09 15:44:57 INFO SessionState: Created HDFS directory: /tmp/hive/net_kg/048a3660-3923-498f-84e7-324ee290376f/_tmp_space.db
22/06/09 15:44:57 INFO HiveClientImpl: Warehouse location for Hive client (version 1.2.2) is /user/hive/dc-warehouse
22/06/09 15:44:57 INFO AuthorizerCheckRule: User net_kg; OperationType: QUERY; Input: List(Object [type=TABLE_OR_VIEW, name=oppo_kg_dw.dwd_kg_release_spo_thing_1_v3_55_df[]]); Output: List()
22/06/09 15:44:57 INFO AuthRpdClient: dbName is oppo_kg_dw; tbName is dwd_kg_release_spo_thing_1_v3_55_df; url is /hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df; fs.defaultFs is china1
22/06/09 15:44:57 INFO AuthRpdClient: urls is ArrayBuffer(hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select)
22/06/09 15:44:57 INFO AuthRpdClient: urls is ArrayBuffer()
22/06/09 15:44:57 INFO AuthRpdClient: urls is ArrayBuffer()
22/06/09 15:44:57 INFO AuthRpdClient: dbName is oppo_kg_dw; tbName is dwd_kg_release_spo_thing_1_v3_55_df; url is /hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df; fs.defaultFs is china1
22/06/09 15:44:57 INFO AuthRpdClient: urls is ArrayBuffer(hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin)
22/06/09 15:44:57 INFO AuthorizerCheckRule: checkOperation rpdList is ArrayBuffer(hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select); rpdAdminList is ArrayBuffer(hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin)
22/06/09 15:44:57 INFO AuthRpdClient: enter into rpdCheckPrivileges
22/06/09 15:44:57 INFO AuthRpdClient: rpdList is hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select; rpdAdminList is hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin
22/06/09 15:44:57 INFO AuthRpdClient: sysId is spark; user is net_kg; sign is 5feaaa8536aefddd0341534919a12af0
22/06/09 15:44:57 INFO Version: HV000001: Hibernate Validator 6.0.16.Final
22/06/09 15:44:58 INFO AuthRpdClient: authenticate is BaseResponse{code=1, traceId='65cfe4ba828546748a5e23a14730e431', desc='success', data=[AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select', result=false, msg='failure', hitCache=false}]}
22/06/09 15:44:58 INFO AuthRpdClient: rpdAdminList is hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin
22/06/09 15:44:58 INFO AuthRpdClient: adminAuthenticate is BaseResponse{code=1, traceId='1f3a7674c90b454bb356b327f970e00d', desc='success', data=[AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin', result=true, msg='success', hitCache=false}]}
22/06/09 15:44:58 INFO AuthRpdClient: genRpdListsFromAuthorizeResults rpds is ArrayBuffer(Rpd{rpdUri=hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select, rpdString='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select', rpdList=null, authorityBit='select', extraParams={}})
22/06/09 15:44:58 INFO AuthRpdClient: genSucceedFullPathListsFromAuthorizeResults data is List(AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin', result=true, msg='success', hitCache=false})
22/06/09 15:44:58 INFO AuthRpdClient: paths is ArrayBuffer(/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df)
22/06/09 15:44:58 INFO AuthRpdClient: data is List(AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select', result=false, msg='failure', hitCache=false}); adminData is ArrayBuffer(AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin', result=true, msg='success', hitCache=false}); succeedPaths is ArrayBuffer(/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df)
22/06/09 15:44:58 INFO AuthRpdClient: data indices is Range(0); authorizeResultEntity is AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select', result=false, msg='failure', hitCache=false}
22/06/09 15:44:59 INFO AuthorizerCheckRule: User net_kg; OperationType: QUERY; Input: List(Object [type=TABLE_OR_VIEW, name=oppo_kg_dw.dwd_kg_release_spo_thing_1_v3_55_df[]]); Output: List()
22/06/09 15:44:59 INFO AuthRpdClient: dbName is oppo_kg_dw; tbName is dwd_kg_release_spo_thing_1_v3_55_df; url is /hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df; fs.defaultFs is china1
22/06/09 15:44:59 INFO AuthRpdClient: urls is ArrayBuffer(hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select)
22/06/09 15:44:59 INFO AuthRpdClient: urls is ArrayBuffer()
22/06/09 15:44:59 INFO AuthRpdClient: urls is ArrayBuffer()
22/06/09 15:44:59 INFO AuthRpdClient: dbName is oppo_kg_dw; tbName is dwd_kg_release_spo_thing_1_v3_55_df; url is /hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df; fs.defaultFs is china1
22/06/09 15:44:59 INFO AuthRpdClient: urls is ArrayBuffer(hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin)
22/06/09 15:44:59 INFO AuthorizerCheckRule: checkOperation rpdList is ArrayBuffer(hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select); rpdAdminList is ArrayBuffer(hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin)
22/06/09 15:44:59 INFO AuthRpdClient: enter into rpdCheckPrivileges
22/06/09 15:44:59 INFO AuthRpdClient: rpdList is hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select; rpdAdminList is hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin
22/06/09 15:44:59 INFO AuthRpdClient: sysId is spark; user is net_kg; sign is 2be48790a081828ad2e2333a86096e56
22/06/09 15:44:59 INFO AuthRpdClient: authenticate is BaseResponse{code=1, traceId='03e7ca62dfb440309d6f12c9c95e1455', desc='success', data=[AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select', result=false, msg='failure', hitCache=false}]}
22/06/09 15:44:59 INFO AuthRpdClient: rpdAdminList is hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin
22/06/09 15:44:59 INFO AuthRpdClient: adminAuthenticate is BaseResponse{code=1, traceId='261f66ec18ec429da995f86f644feada', desc='success', data=[AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin', result=true, msg='success', hitCache=false}]}
22/06/09 15:44:59 INFO AuthRpdClient: genRpdListsFromAuthorizeResults rpds is ArrayBuffer(Rpd{rpdUri=hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select, rpdString='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select', rpdList=null, authorityBit='select', extraParams={}})
22/06/09 15:44:59 INFO AuthRpdClient: genSucceedFullPathListsFromAuthorizeResults data is List(AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin', result=true, msg='success', hitCache=false})
22/06/09 15:44:59 INFO AuthRpdClient: paths is ArrayBuffer(/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df)
22/06/09 15:44:59 INFO AuthRpdClient: data is List(AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select', result=false, msg='failure', hitCache=false}); adminData is ArrayBuffer(AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin', result=true, msg='success', hitCache=false}); succeedPaths is ArrayBuffer(/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df)
22/06/09 15:44:59 INFO AuthRpdClient: data indices is Range(0); authorizeResultEntity is AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select', result=false, msg='failure', hitCache=false}
22/06/09 15:44:59 INFO AuthorizerCheckRule: User net_kg; OperationType: QUERY; Input: List(Object [type=TABLE_OR_VIEW, name=oppo_kg_dw.dwd_kg_release_spo_thing_1_v3_55_df[]]); Output: List()
22/06/09 15:44:59 INFO AuthRpdClient: dbName is oppo_kg_dw; tbName is dwd_kg_release_spo_thing_1_v3_55_df; url is /hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df; fs.defaultFs is china1
22/06/09 15:44:59 INFO AuthRpdClient: urls is ArrayBuffer(hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select)
22/06/09 15:44:59 INFO AuthRpdClient: urls is ArrayBuffer()
22/06/09 15:44:59 INFO AuthRpdClient: urls is ArrayBuffer()
22/06/09 15:44:59 INFO AuthRpdClient: dbName is oppo_kg_dw; tbName is dwd_kg_release_spo_thing_1_v3_55_df; url is /hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df; fs.defaultFs is china1
22/06/09 15:44:59 INFO AuthRpdClient: urls is ArrayBuffer(hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin)
22/06/09 15:44:59 INFO AuthorizerCheckRule: checkOperation rpdList is ArrayBuffer(hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select); rpdAdminList is ArrayBuffer(hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin)
22/06/09 15:44:59 INFO AuthRpdClient: enter into rpdCheckPrivileges
22/06/09 15:44:59 INFO AuthRpdClient: rpdList is hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select; rpdAdminList is hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin
22/06/09 15:44:59 INFO AuthRpdClient: sysId is spark; user is net_kg; sign is 2be48790a081828ad2e2333a86096e56
22/06/09 15:44:59 INFO AuthRpdClient: authenticate is BaseResponse{code=1, traceId='d9d51bc35c93423cb87bc3fc8077aca5', desc='成功', data=[AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select', result=false, msg='失败', hitCache=false}]}
22/06/09 15:44:59 INFO AuthRpdClient: rpdAdminList is hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin
22/06/09 15:45:00 INFO AuthRpdClient: adminAuthenticate is BaseResponse{code=1, traceId='a2087aa49de34bacb728ddd8297b17e0', desc='成功', data=[AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin', result=true, msg='成功', hitCache=false}]}
22/06/09 15:45:00 INFO AuthRpdClient: genRpdListsFromAuthorizeResults rpds is ArrayBuffer(Rpd{rpdUri=hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select, rpdString='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select', rpdList=null, authorityBit='select', extraParams={}})
22/06/09 15:45:00 INFO AuthRpdClient: genSucceedFullPathListsFromAuthorizeResults data is List(AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin', result=true, msg='成功', hitCache=false})
22/06/09 15:45:00 INFO AuthRpdClient: paths is ArrayBuffer(/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df)
22/06/09 15:45:00 INFO AuthRpdClient: data is List(AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select', result=false, msg='失败', hitCache=false}); adminData is ArrayBuffer(AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin', result=true, msg='成功', hitCache=false}); succeedPaths is ArrayBuffer(/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df)
22/06/09 15:45:00 INFO AuthRpdClient: data indices is Range(0); authorizeResultEntity is AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select', result=false, msg='失败', hitCache=false}
22/06/09 15:45:00 INFO AuthorizerCheckRule: User net_kg; OperationType: QUERY; Input: List(Object [type=TABLE_OR_VIEW, name=oppo_kg_dw.dwd_kg_release_spo_thing_1_v3_55_df[]]); Output: List()
22/06/09 15:45:00 INFO AuthRpdClient: dbName is oppo_kg_dw; tbName is dwd_kg_release_spo_thing_1_v3_55_df; url is /hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df; fs.defaultFs is china1
22/06/09 15:45:00 INFO AuthRpdClient: urls is ArrayBuffer(hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select)
22/06/09 15:45:00 INFO AuthRpdClient: urls is ArrayBuffer()
22/06/09 15:45:00 INFO AuthRpdClient: urls is ArrayBuffer()
22/06/09 15:45:00 INFO AuthRpdClient: dbName is oppo_kg_dw; tbName is dwd_kg_release_spo_thing_1_v3_55_df; url is /hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df; fs.defaultFs is china1
22/06/09 15:45:00 INFO AuthRpdClient: urls is ArrayBuffer(hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin)
22/06/09 15:45:00 INFO AuthorizerCheckRule: checkOperation rpdList is ArrayBuffer(hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select); rpdAdminList is ArrayBuffer(hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin)
22/06/09 15:45:00 INFO AuthRpdClient: enter into rpdCheckPrivileges
22/06/09 15:45:00 INFO AuthRpdClient: rpdList is hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select; rpdAdminList is hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin
22/06/09 15:45:00 INFO AuthRpdClient: sysId is spark; user is net_kg; sign is 749275ef6ed475bffe2b79282abb4da4
22/06/09 15:45:00 INFO AuthRpdClient: authenticate is BaseResponse{code=1, traceId='15e774aa63744e58b1fc09ff47a276c0', desc='成功', data=[AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select', result=false, msg='失败', hitCache=false}]}
22/06/09 15:45:00 INFO AuthRpdClient: rpdAdminList is hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin
22/06/09 15:45:00 INFO AuthRpdClient: adminAuthenticate is BaseResponse{code=1, traceId='2b69d41be5b44a49926f174f47ce1484', desc='成功', data=[AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin', result=true, msg='成功', hitCache=false}]}
22/06/09 15:45:00 INFO AuthRpdClient: genRpdListsFromAuthorizeResults rpds is ArrayBuffer(Rpd{rpdUri=hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select, rpdString='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select', rpdList=null, authorityBit='select', extraParams={}})
22/06/09 15:45:00 INFO AuthRpdClient: genSucceedFullPathListsFromAuthorizeResults data is List(AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin', result=true, msg='成功', hitCache=false})
22/06/09 15:45:00 INFO AuthRpdClient: paths is ArrayBuffer(/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df)
22/06/09 15:45:00 INFO AuthRpdClient: data is List(AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select', result=false, msg='失败', hitCache=false}); adminData is ArrayBuffer(AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin', result=true, msg='成功', hitCache=false}); succeedPaths is ArrayBuffer(/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df)
22/06/09 15:45:00 INFO AuthRpdClient: data indices is Range(0); authorizeResultEntity is AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select', result=false, msg='失败', hitCache=false}
22/06/09 15:45:00 INFO AuthorizerCheckRule: User net_kg; OperationType: QUERY; Input: List(Object [type=TABLE_OR_VIEW, name=oppo_kg_dw.dwd_kg_release_spo_thing_1_v3_55_df[]]); Output: List()
22/06/09 15:45:00 INFO AuthRpdClient: dbName is oppo_kg_dw; tbName is dwd_kg_release_spo_thing_1_v3_55_df; url is /hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df; fs.defaultFs is china1
22/06/09 15:45:00 INFO AuthRpdClient: urls is ArrayBuffer(hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select)
22/06/09 15:45:00 INFO AuthRpdClient: urls is ArrayBuffer()
22/06/09 15:45:00 INFO AuthRpdClient: urls is ArrayBuffer()
22/06/09 15:45:00 INFO AuthRpdClient: dbName is oppo_kg_dw; tbName is dwd_kg_release_spo_thing_1_v3_55_df; url is /hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df; fs.defaultFs is china1
22/06/09 15:45:00 INFO AuthRpdClient: urls is ArrayBuffer(hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin)
22/06/09 15:45:00 INFO AuthorizerCheckRule: checkOperation rpdList is ArrayBuffer(hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select); rpdAdminList is ArrayBuffer(hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin)
22/06/09 15:45:00 INFO AuthRpdClient: enter into rpdCheckPrivileges
22/06/09 15:45:00 INFO AuthRpdClient: rpdList is hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select; rpdAdminList is hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin
22/06/09 15:45:00 INFO AuthRpdClient: sysId is spark; user is net_kg; sign is 749275ef6ed475bffe2b79282abb4da4
22/06/09 15:45:00 INFO AuthRpdClient: authenticate is BaseResponse{code=1, traceId='01bc73fc0c3a4526878f7ee377b94a05', desc='成功', data=[AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select', result=false, msg='失败', hitCache=false}]}
22/06/09 15:45:00 INFO AuthRpdClient: rpdAdminList is hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin
22/06/09 15:45:00 INFO AuthRpdClient: adminAuthenticate is BaseResponse{code=1, traceId='c1f7233e7d024ef3a8ba51f0fd2807e0', desc='成功', data=[AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin', result=true, msg='成功', hitCache=false}]}
22/06/09 15:45:00 INFO AuthRpdClient: genRpdListsFromAuthorizeResults rpds is ArrayBuffer(Rpd{rpdUri=hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select, rpdString='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select', rpdList=null, authorityBit='select', extraParams={}})
22/06/09 15:45:00 INFO AuthRpdClient: genSucceedFullPathListsFromAuthorizeResults data is List(AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin', result=true, msg='成功', hitCache=false})
22/06/09 15:45:00 INFO AuthRpdClient: paths is ArrayBuffer(/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df)
22/06/09 15:45:00 INFO AuthRpdClient: data is List(AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select', result=false, msg='失败', hitCache=false}); adminData is ArrayBuffer(AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin', result=true, msg='成功', hitCache=false}); succeedPaths is ArrayBuffer(/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df)
22/06/09 15:45:00 INFO AuthRpdClient: data indices is Range(0); authorizeResultEntity is AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select', result=false, msg='失败', hitCache=false}
22/06/09 15:45:00 INFO AuthorizerCheckRule: User net_kg; OperationType: QUERY; Input: List(Object [type=TABLE_OR_VIEW, name=oppo_kg_dw.dwd_kg_release_spo_thing_1_v3_55_df[]]); Output: List()
22/06/09 15:45:00 INFO AuthRpdClient: dbName is oppo_kg_dw; tbName is dwd_kg_release_spo_thing_1_v3_55_df; url is /hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df; fs.defaultFs is china1
22/06/09 15:45:00 INFO AuthRpdClient: urls is ArrayBuffer(hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select)
22/06/09 15:45:00 INFO AuthRpdClient: urls is ArrayBuffer()
22/06/09 15:45:00 INFO AuthRpdClient: urls is ArrayBuffer()
22/06/09 15:45:00 INFO AuthRpdClient: dbName is oppo_kg_dw; tbName is dwd_kg_release_spo_thing_1_v3_55_df; url is /hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df; fs.defaultFs is china1
22/06/09 15:45:00 INFO AuthRpdClient: urls is ArrayBuffer(hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin)
22/06/09 15:45:00 INFO AuthorizerCheckRule: checkOperation rpdList is ArrayBuffer(hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select); rpdAdminList is ArrayBuffer(hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin)
22/06/09 15:45:00 INFO AuthRpdClient: enter into rpdCheckPrivileges
22/06/09 15:45:00 INFO AuthRpdClient: rpdList is hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select; rpdAdminList is hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin
22/06/09 15:45:00 INFO AuthRpdClient: sysId is spark; user is net_kg; sign is 749275ef6ed475bffe2b79282abb4da4
22/06/09 15:45:00 INFO AuthRpdClient: authenticate is BaseResponse{code=1, traceId='dd35845c9ad34a81a393ee39fa364744', desc='成功', data=[AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select', result=false, msg='失败', hitCache=false}]}
22/06/09 15:45:00 INFO AuthRpdClient: rpdAdminList is hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin
22/06/09 15:45:00 INFO AuthRpdClient: adminAuthenticate is BaseResponse{code=1, traceId='d8f78f41724a43bcac3d073d9fec54d7', desc='成功', data=[AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin', result=true, msg='成功', hitCache=false}]}
22/06/09 15:45:00 INFO AuthRpdClient: genRpdListsFromAuthorizeResults rpds is ArrayBuffer(Rpd{rpdUri=hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select, rpdString='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select', rpdList=null, authorityBit='select', extraParams={}})
22/06/09 15:45:00 INFO AuthRpdClient: genSucceedFullPathListsFromAuthorizeResults data is List(AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin', result=true, msg='成功', hitCache=false})
22/06/09 15:45:00 INFO AuthRpdClient: paths is ArrayBuffer(/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df)
22/06/09 15:45:00 INFO AuthRpdClient: data is List(AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select', result=false, msg='失败', hitCache=false}); adminData is ArrayBuffer(AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin', result=true, msg='成功', hitCache=false}); succeedPaths is ArrayBuffer(/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df)
22/06/09 15:45:00 INFO AuthRpdClient: data indices is Range(0); authorizeResultEntity is AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select', result=false, msg='失败', hitCache=false}
22/06/09 15:45:00 INFO AuthorizerCheckRule: User net_kg; OperationType: QUERY; Input: List(Object [type=TABLE_OR_VIEW, name=oppo_kg_dw.dwd_kg_release_spo_thing_1_v3_55_df[]]); Output: List()
22/06/09 15:45:00 INFO AuthRpdClient: dbName is oppo_kg_dw; tbName is dwd_kg_release_spo_thing_1_v3_55_df; url is /hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df; fs.defaultFs is china1
22/06/09 15:45:00 INFO AuthRpdClient: urls is ArrayBuffer(hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select)
22/06/09 15:45:00 INFO AuthRpdClient: urls is ArrayBuffer()
22/06/09 15:45:00 INFO AuthRpdClient: urls is ArrayBuffer()
22/06/09 15:45:00 INFO AuthRpdClient: dbName is oppo_kg_dw; tbName is dwd_kg_release_spo_thing_1_v3_55_df; url is /hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df; fs.defaultFs is china1
22/06/09 15:45:00 INFO AuthRpdClient: urls is ArrayBuffer(hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin)
22/06/09 15:45:00 INFO AuthorizerCheckRule: checkOperation rpdList is ArrayBuffer(hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select); rpdAdminList is ArrayBuffer(hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin)
22/06/09 15:45:00 INFO AuthRpdClient: enter into rpdCheckPrivileges
22/06/09 15:45:00 INFO AuthRpdClient: rpdList is hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select; rpdAdminList is hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin
22/06/09 15:45:00 INFO AuthRpdClient: sysId is spark; user is net_kg; sign is 749275ef6ed475bffe2b79282abb4da4
22/06/09 15:45:00 INFO AuthRpdClient: authenticate is BaseResponse{code=1, traceId='97b838a8127a4ef6b83356ef327720f5', desc='成功', data=[AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select', result=false, msg='失败', hitCache=false}]}
22/06/09 15:45:00 INFO AuthRpdClient: rpdAdminList is hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin
22/06/09 15:45:00 INFO AuthRpdClient: adminAuthenticate is BaseResponse{code=1, traceId='947a90a066174f14a74d61d0b10e6b26', desc='成功', data=[AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin', result=true, msg='成功', hitCache=false}]}
22/06/09 15:45:00 INFO AuthRpdClient: genRpdListsFromAuthorizeResults rpds is ArrayBuffer(Rpd{rpdUri=hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select, rpdString='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select', rpdList=null, authorityBit='select', extraParams={}})
22/06/09 15:45:00 INFO AuthRpdClient: genSucceedFullPathListsFromAuthorizeResults data is List(AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin', result=true, msg='成功', hitCache=false})
22/06/09 15:45:00 INFO AuthRpdClient: paths is ArrayBuffer(/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df)
22/06/09 15:45:00 INFO AuthRpdClient: data is List(AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select', result=false, msg='失败', hitCache=false}); adminData is ArrayBuffer(AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=admin', result=true, msg='成功', hitCache=false}); succeedPaths is ArrayBuffer(/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df)
22/06/09 15:45:00 INFO AuthRpdClient: data indices is Range(0); authorizeResultEntity is AuthorizeResultEntity{rpd='hive://net_kg:user@china1/hive/oppo_kg_dw.db/dwd_kg_release_spo_thing_1_v3_55_df?option=select', result=false, msg='失败', hitCache=false}
22/06/09 15:45:00 WARN Utils: Truncated the string representation of a plan since it was too large. This behavior can be adjusted by setting 'spark.debug.maxToStringFields' in SparkEnv.conf.
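Editor's note: the truncation warning above is cosmetic, but if the full query plan is wanted for debugging, the limit it names can be raised at submit time. A minimal sketch — the property name is taken verbatim from the warning; the value 100 is an arbitrary example (the Spark 2.4 default is 25):

```shell
# Raise the field limit Spark uses when stringifying plans (Spark 2.4).
# 100 is a placeholder value, not a recommendation.
spark-submit \
  --conf spark.debug.maxToStringFields=100 \
  # ... remaining submit arguments unchanged
```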
22/06/09 15:45:04 INFO FileSourceStrategy: Pruning directories with: isnotnull(data_version#35),isnotnull(ds#34),(data_version#35 = v1.1.20211205.alpha),(ds#34 = 20220521)
22/06/09 15:45:04 INFO FileSourceStrategy: Post-Scan Filters: 
22/06/09 15:45:04 INFO FileSourceStrategy: Output Data Schema: struct<Thing_id: bigint, Thing_name: string, Thing_title: string, Thing_nameCh: string, Thing_nameEn: string ... 30 more fields>
22/06/09 15:45:04 INFO FileSourceScanExec: Pushed Filters: 
22/06/09 15:45:04 INFO PrunedInMemoryFileIndex: Selected 1 partitions out of 1, pruned 0.0% partitions.
22/06/09 15:45:04 WARN DFSClient: Slow ReadProcessor read fields took 48720ms (threshold=30000ms); ack: seqno: 3 reply: 0 reply: 0 reply: 0 downstreamAckTimeNanos: 749755, targets: [DatanodeInfoWithStorage[10.34.114.28:50010,DS-3f094c2b-e1c8-408a-ac78-d94eef78533a,DISK], DatanodeInfoWithStorage[10.34.114.32:50010,DS-6fff28e7-f043-476b-a28c-d63d5584bf95,DISK], DatanodeInfoWithStorage[10.34.82.33:50010,DS-3ef5ba49-87c9-440c-9fe2-7dfbd515c393,DISK]]
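Editor's note: the Slow ReadProcessor warning above means the HDFS write pipeline took ~49 s to acknowledge, well past the client's 30 s warning threshold; the three DataNodes forming the pipeline are listed in the message. A hedged sketch of a follow-up, assuming shell access to the cluster (the threshold key named below is the standard HDFS client setting, assumed to apply to this cluster's Hadoop build):

```shell
# Check health and load of the DataNodes named in the warning
# (10.34.114.28, 10.34.114.32, 10.34.82.33).
hdfs dfsadmin -report

# The 30000 ms threshold matches the HDFS client default for
# dfs.client.slow.io.warning.threshold.ms; raising that value only
# silences the warning, it does not fix slow disks or network on the nodes.
```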
22/06/09 15:45:05 INFO CodeGenerator: Code generated in 231.139703 ms
22/06/09 15:45:05 INFO CodeGenerator: Code generated in 15.971303 ms
22/06/09 15:45:05 INFO CodeGenerator: Code generated in 14.53474 ms
22/06/09 15:45:05 INFO CodeGenerator: Code generated in 54.226809 ms
22/06/09 15:45:05 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 597.6 KB, free 4.6 GB)
22/06/09 15:45:05 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 55.6 KB, free 4.6 GB)
22/06/09 15:45:05 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on cdh-bjht-3878:18445 (size: 55.6 KB, free: 4.6 GB)
22/06/09 15:45:05 INFO SparkContext: Created broadcast 0 from foreachPartition at VerticesProcessor.scala:195
22/06/09 15:45:05 INFO FileSourceScanExec: Planning scan with bin packing, max size: 268435456 bytes, open cost is considered as scanning 4194304 bytes.
22/06/09 15:45:05 INFO DAGScheduler: Registering RDD 3 (foreachPartition at VerticesProcessor.scala:195)
22/06/09 15:45:05 INFO DAGScheduler: Registering RDD 7 (foreachPartition at VerticesProcessor.scala:195)
22/06/09 15:45:05 INFO DAGScheduler: Got map stage job 0 (foreachPartition at VerticesProcessor.scala:195) with 24 output partitions
22/06/09 15:45:05 INFO DAGScheduler: Final stage: ShuffleMapStage 1 (foreachPartition at VerticesProcessor.scala:195)
22/06/09 15:45:05 INFO DAGScheduler: Parents of final stage: List(ShuffleMapStage 0)
22/06/09 15:45:05 INFO DAGScheduler: Missing parents: List(ShuffleMapStage 0)
22/06/09 15:45:05 INFO DAGScheduler: Submitting ShuffleMapStage 0 (MapPartitionsRDD[3] at foreachPartition at VerticesProcessor.scala:195), which has no missing parents
22/06/09 15:45:05 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 29.1 KB, free 4.6 GB)
22/06/09 15:45:05 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 9.5 KB, free 4.6 GB)
22/06/09 15:45:05 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on cdh-bjht-3878:18445 (size: 9.5 KB, free: 4.6 GB)
22/06/09 15:45:05 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1161
22/06/09 15:45:05 INFO DAGScheduler: Submitting 67 missing tasks from ShuffleMapStage 0 (MapPartitionsRDD[3] at foreachPartition at VerticesProcessor.scala:195) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14))
22/06/09 15:45:05 INFO YarnClusterScheduler: Adding task set 0.0 with 67 tasks
22/06/09 15:45:35 WARN YarnClusterScheduler: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
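Editor's note: this warning means the 67-task stage waited 30 s without being granted any executors, and ~6 s later the ApplicationMaster received SIGTERM (see the ERROR line below) — so the job was most likely killed externally (queue preemption, an operator, or a workflow timeout) rather than failing on its own. A sketch of how the resource side could be checked, assuming YARN CLI access; all values below are placeholders, not taken from this job:

```shell
# Is the application holding any containers, and does its queue have headroom?
yarn application -list -appStates RUNNING,ACCEPTED
yarn node -list      # per-node running-container counts; memory detail is in the RM UI

# If the request simply exceeds queue capacity, shrink it at submit time, e.g.:
spark-submit --num-executors 10 --executor-memory 4g --executor-cores 2 # ...
```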
22/06/09 15:45:41 ERROR ApplicationMaster: RECEIVED SIGNAL TERM
22/06/09 15:45:41 INFO SparkContext: Invoking stop() from shutdown hook
22/06/09 15:45:41 WARN DFSClient: Slow ReadProcessor read fields took 35358ms (threshold=30000ms); ack: seqno: 6 reply: 0 reply: 0 reply: 0 downstreamAckTimeNanos: 530093, targets: [DatanodeInfoWithStorage[10.34.114.28:50010,DS-3f094c2b-e1c8-408a-ac78-d94eef78533a,DISK], DatanodeInfoWithStorage[10.34.114.32:50010,DS-6fff28e7-f043-476b-a28c-d63d5584bf95,DISK], DatanodeInfoWithStorage[10.34.82.33:50010,DS-3ef5ba49-87c9-440c-9fe2-7dfbd515c393,DISK]]
22/06/09 15:45:41 INFO SparkUI: Stopped Spark web UI at http://cdh-bjht-3878:17329
22/06/09 15:45:41 INFO DAGScheduler: ShuffleMapStage 0 (foreachPartition at VerticesProcessor.scala:195) failed in 35.360 s due to Stage cancelled because SparkContext was shut down
22/06/09 15:45:41 ERROR ApplicationMaster: User class threw exception: org.apache.spark.SparkException: Exception thrown in awaitResult: 
org.apache.spark.SparkException: Exception thrown in awaitResult: 
	at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:226)
	at org.apache.spark.FutureAction$class.get(FutureAction.scala:108)
	at org.apache.spark.SimpleFutureAction.get(FutureAction.scala:162)
	at org.apache.spark.sql.execution.exchange.ExchangeCoordinator.doEstimationIfNecessary(ExchangeCoordinator.scala:228)
	at org.apache.spark.sql.execution.exchange.ExchangeCoordinator.postShuffleRDD(ExchangeCoordinator.scala:264)
	at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$doExecute$1.apply(ShuffleExchangeExec.scala:124)
	at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$doExecute$1.apply(ShuffleExchangeExec.scala:119)
	at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:52)
	at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.doExecute(ShuffleExchangeExec.scala:119)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.InputAdapter.inputRDDs(WholeStageCodegenExec.scala:374)
	at org.apache.spark.sql.execution.SortExec.inputRDDs(SortExec.scala:121)
	at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:610)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.aggregate.SortAggregateExec$$anonfun$doExecute$1.apply(SortAggregateExec.scala:77)
	at org.apache.spark.sql.execution.aggregate.SortAggregateExec$$anonfun$doExecute$1.apply(SortAggregateExec.scala:75)
	at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:52)
	at org.apache.spark.sql.execution.aggregate.SortAggregateExec.doExecute(SortAggregateExec.scala:75)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.DeserializeToObjectExec.doExecute(objects.scala:89)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.MapPartitionsExec.doExecute(objects.scala:185)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.InputAdapter.inputRDDs(WholeStageCodegenExec.scala:374)
	at org.apache.spark.sql.execution.SerializeFromObjectExec.inputRDDs(objects.scala:110)
	at org.apache.spark.sql.execution.ProjectExec.inputRDDs(basicPhysicalOperators.scala:41)
	at org.apache.spark.sql.execution.SortExec.inputRDDs(SortExec.scala:121)
	at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:610)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.DeserializeToObjectExec.doExecute(objects.scala:89)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
	at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
	at org.apache.spark.sql.Dataset.rdd$lzycompute(Dataset.scala:3038)
	at org.apache.spark.sql.Dataset.rdd(Dataset.scala:3036)
	at org.apache.spark.sql.Dataset$$anonfun$foreachPartition$1.apply$mcV$sp(Dataset.scala:2736)
	at org.apache.spark.sql.Dataset$$anonfun$foreachPartition$1.apply(Dataset.scala:2736)
	at org.apache.spark.sql.Dataset$$anonfun$foreachPartition$1.apply(Dataset.scala:2736)
	at org.apache.spark.sql.Dataset$$anonfun$withNewRDDExecutionId$1.apply(Dataset.scala:3350)
	at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
	at org.apache.spark.sql.Dataset.withNewRDDExecutionId(Dataset.scala:3346)
	at org.apache.spark.sql.Dataset.foreachPartition(Dataset.scala:2735)
	at com.vesoft.nebula.exchange.processor.VerticesProcessor.process(VerticesProcessor.scala:195)
	at com.vesoft.nebula.exchange.Exchange$$anonfun$main$2.apply(Exchange.scala:150)
	at com.vesoft.nebula.exchange.Exchange$$anonfun$main$2.apply(Exchange.scala:126)
	at scala.collection.immutable.List.foreach(List.scala:392)
	at com.vesoft.nebula.exchange.Exchange$.main(Exchange.scala:126)
	at com.vesoft.nebula.exchange.Exchange.main(Exchange.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:678)
Caused by: org.apache.spark.SparkException: Job 0 cancelled because SparkContext was shut down
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:932)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:930)
	at scala.collection.mutable.HashSet.foreach(HashSet.scala:78)
	at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:930)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:2126)
	at org.apache.spark.util.EventLoop.stop(EventLoop.scala:84)
	at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:2039)
	at org.apache.spark.SparkContext$$anonfun$stop$6.apply$mcV$sp(SparkContext.scala:1949)
	at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1340)
	at org.apache.spark.SparkContext.stop(SparkContext.scala:1948)
	at org.apache.spark.SparkContext$$anonfun$2.apply$mcV$sp(SparkContext.scala:575)
	at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:216)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1945)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
	at scala.util.Try$.apply(Try.scala:192)
	at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
	at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
22/06/09 15:45:41 INFO YarnAllocator: Driver requested a total number of 0 executor(s).
22/06/09 15:45:41 INFO ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: org.apache.spark.SparkException: Exception thrown in awaitResult: 
	at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:226)
	at org.apache.spark.FutureAction$class.get(FutureAction.scala:108)
	at org.apache.spark.SimpleFutureAction.get(FutureAction.scala:162)
	at org.apache.spark.sql.execution.exchange.ExchangeCoordinator.doEstimationIfNecessary(ExchangeCoordinator.scala:228)
	at org.apache.spark.sql.execution.exchange.ExchangeCoordinator.postShuffleRDD(ExchangeCoordinator.scala:264)
	at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$doExecute$1.apply(ShuffleExchangeExec.scala:124)
	at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$doExecute$1.apply(ShuffleExchangeExec.scala:119)
	at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:52)
	at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.doExecute(ShuffleExchangeExec.scala:119)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.InputAdapter.inputRDDs(WholeStageCodegenExec.scala:374)
	at org.apache.spark.sql.execution.SortExec.inputRDDs(SortExec.scala:121)
	at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:610)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.aggregate.SortAggregateExec$$anonfun$doExecute$1.apply(SortAggregateExec.scala:77)
	at org.apache.spark.sql.execution.aggregate.SortAggregateExec$$anonfun$doExecute$1.apply(SortAggregateExec.scala:75)
	at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:52)
	at org.apache.spark.sql.execution.aggregate.SortAggregateExec.doExecute(SortAggregateExec.scala:75)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.DeserializeToObjectExec.doExecute(objects.scala:89)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.MapPartitionsExec.doExecute(objects.scala:185)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.InputAdapter.inputRDDs(WholeStageCodegenExec.scala:374)
	at org.apache.spark.sql.execution.SerializeFromObjectExec.inputRDDs(objects.scala:110)
	at org.apache.spark.sql.execution.ProjectExec.inputRDDs(basicPhysicalOperators.scala:41)
	at org.apache.spark.sql.execution.SortExec.inputRDDs(SortExec.scala:121)
	at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:610)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.DeserializeToObjectExec.doExecute(objects.scala:89)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
	at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
	at org.apache.spark.sql.Dataset.rdd$lzycompute(Dataset.scala:3038)
	at org.apache.spark.sql.Dataset.rdd(Dataset.scala:3036)
	at org.apache.spark.sql.Dataset$$anonfun$foreachPartition$1.apply$mcV$sp(Dataset.scala:2736)
	at org.apache.spark.sql.Dataset$$anonfun$foreachPartition$1.apply(Dataset.scala:2736)
	at org.apache.spark.sql.Dataset$$anonfun$foreachPartition$1.apply(Dataset.scala:2736)
	at org.apache.spark.sql.Dataset$$anonfun$withNewRDDExecutionId$1.apply(Dataset.scala:3350)
	at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
	at org.apache.spark.sql.Dataset.withNewRDDExecutionId(Dataset.scala:3346)
	at org.apache.spark.sql.Dataset.foreachPartition(Dataset.scala:2735)
	at com.vesoft.nebula.exchange.processor.VerticesProcessor.process(VerticesProcessor.scala:195)
	at com.vesoft.nebula.exchange.Exchange$$anonfun$main$2.apply(Exchange.scala:150)
	at com.vesoft.nebula.exchange.Exchange$$anonfun$main$2.apply(Exchange.scala:126)
	at scala.collection.immutable.List.foreach(List.scala:392)
	at com.vesoft.nebula.exchange.Exchange$.main(Exchange.scala:126)
	at com.vesoft.nebula.exchange.Exchange.main(Exchange.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:678)
Caused by: org.apache.spark.SparkException: Job 0 cancelled because SparkContext was shut down
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:932)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:930)
	at scala.collection.mutable.HashSet.foreach(HashSet.scala:78)
	at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:930)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:2126)
	at org.apache.spark.util.EventLoop.stop(EventLoop.scala:84)
	at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:2039)
	at org.apache.spark.SparkContext$$anonfun$stop$6.apply$mcV$sp(SparkContext.scala:1949)
	at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1340)
	at org.apache.spark.SparkContext.stop(SparkContext.scala:1948)
	at org.apache.spark.SparkContext$$anonfun$2.apply$mcV$sp(SparkContext.scala:575)
	at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:216)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1945)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
	at scala.util.Try$.apply(Try.scala:192)
	at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
	at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
)
22/06/09 15:45:41 INFO YarnClusterSchedulerBackend: Shutting down all executors
22/06/09 15:45:41 INFO YarnSchedulerBackend$YarnDriverEndpoint: Asking each executor to shut down
22/06/09 15:45:41 INFO SchedulerExtensionServices: Stopping SchedulerExtensionServices
(serviceOption=None,
 services=List(),
 started=false)
22/06/09 15:45:41 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
22/06/09 15:45:41 INFO MemoryStore: MemoryStore cleared
22/06/09 15:45:41 INFO BlockManager: BlockManager stopped
22/06/09 15:45:41 INFO BlockManagerMaster: BlockManagerMaster stopped
22/06/09 15:45:41 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
22/06/09 15:45:41 INFO SparkContext: Successfully stopped SparkContext
22/06/09 15:45:41 INFO ApplicationMaster: Unregistering ApplicationMaster with FAILED (diag message: User class threw exception: org.apache.spark.SparkException: Exception thrown in awaitResult: 
	at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:226)
	at org.apache.spark.FutureAction$class.get(FutureAction.scala:108)
	at org.apache.spark.SimpleFutureAction.get(FutureAction.scala:162)
	at org.apache.spark.sql.execution.exchange.ExchangeCoordinator.doEstimationIfNecessary(ExchangeCoordinator.scala:228)
	at org.apache.spark.sql.execution.exchange.ExchangeCoordinator.postShuffleRDD(ExchangeCoordinator.scala:264)
	at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$doExecute$1.apply(ShuffleExchangeExec.scala:124)
	at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$doExecute$1.apply(ShuffleExchangeExec.scala:119)
	at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:52)
	at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.doExecute(ShuffleExchangeExec.scala:119)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.InputAdapter.inputRDDs(WholeStageCodegenExec.scala:374)
	at org.apache.spark.sql.execution.SortExec.inputRDDs(SortExec.scala:121)
	at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:610)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.aggregate.SortAggregateExec$$anonfun$doExecute$1.apply(SortAggregateExec.scala:77)
	at org.apache.spark.sql.execution.aggregate.SortAggregateExec$$anonfun$doExecute$1.apply(SortAggregateExec.scala:75)
	at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:52)
	at org.apache.spark.sql.execution.aggregate.SortAggregateExec.doExecute(SortAggregateExec.scala:75)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.DeserializeToObjectExec.doExecute(objects.scala:89)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.MapPartitionsExec.doExecute(objects.scala:185)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.InputAdapter.inputRDDs(WholeStageCodegenExec.scala:374)
	at org.apache.spark.sql.execution.SerializeFromObjectExec.inputRDDs(objects.scala:110)
	at org.apache.spark.sql.execution.ProjectExec.inputRDDs(basicPhysicalOperators.scala:41)
	at org.apache.spark.sql.execution.SortExec.inputRDDs(SortExec.scala:121)
	at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:610)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.DeserializeToObjectExec.doExecute(objects.scala:89)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
	at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
	at org.apache.spark.sql.Dataset.rdd$lzycompute(Dataset.scala:3038)
	at org.apache.spark.sql.Dataset.rdd(Dataset.scala:3036)
	at org.apache.spark.sql.Dataset$$anonfun$foreachPartition$1.apply$mcV$sp(Dataset.scala:2736)
	at org.apache.spark.sql.Dataset$$anonfun$foreachPartition$1.apply(Dataset.scala:2736)
	at org.apache.spark.sql.Dataset$$anonfun$foreachPartition$1.apply(Dataset.scala:2736)
	at org.apache.spark.sql.Dataset$$anonfun$withNewRDDExecutionId$1.apply(Dataset.scala:3350)
	at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
	at org.apache.spark.sql.Dataset.withNewRDDExecutionId(Dataset.scala:3346)
	at org.apache.spark.sql.Dataset.foreachPartition(Dataset.scala:2735)
	at com.vesoft.nebula.exchange.processor.VerticesProcessor.process(VerticesProcessor.scala:195)
	at com.vesoft.nebula.exchange.Exchange$$anonfun$main$2.apply(Exchange.scala:150)
	at com.vesoft.nebula.exchange.Exchange$$anonfun$main$2.apply(Exchange.scala:126)
	at scala.collection.immutable.List.foreach(List.scala:392)
	at com.vesoft.nebula.exchange.Exchange$.main(Exchange.scala:126)
	at com.vesoft.nebula.exchange.Exchange.main(Exchange.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:678)
Caused by: org.apache.spark.SparkException: Job 0 cancelled because SparkContext was shut down
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:932)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:930)
	at scala.collection.mutable.HashSet.foreach(HashSet.scala:78)
	at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:930)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:2126)
	at org.apache.spark.util.EventLoop.stop(EventLoop.scala:84)
	at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:2039)
	at org.apache.spark.SparkContext$$anonfun$stop$6.apply$mcV$sp(SparkContext.scala:1949)
	at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1340)
	at org.apache.spark.SparkContext.stop(SparkContext.scala:1948)
	at org.apache.spark.SparkContext$$anonfun$2.apply$mcV$sp(SparkContext.scala:575)
	at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:216)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1945)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
	at scala.util.Try$.apply(Try.scala:192)
	at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
	at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
)
22/06/09 15:45:41 INFO AMRMClientImpl: Waiting for application to be successfully unregistered.
22/06/09 15:45:41 INFO ApplicationMaster: Deleting staging directory hdfs://alg-hdfs/user/net_kg/.sparkStaging/application_1654741161919_19163
22/06/09 15:45:41 INFO ShutdownHookManager: Shutdown hook called
22/06/09 15:45:41 INFO ShutdownHookManager: Deleting directory /data/data22/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/spark-e0c6730a-d971-4240-93f5-9254e2e2e3b6
22/06/09 15:45:41 INFO ShutdownHookManager: Deleting directory /data/data19/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/spark-c97b7af8-4811-48a4-a353-322f82f82e8a
22/06/09 15:45:41 INFO ShutdownHookManager: Deleting directory /data/data6/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/spark-ec618ea1-e899-44a8-98f2-95f998ee7552
22/06/09 15:45:41 INFO ShutdownHookManager: Deleting directory /data/data24/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/spark-3c0c20df-a212-45bd-9c57-bc367a71ebc9
22/06/09 15:45:41 INFO ShutdownHookManager: Deleting directory /data/data10/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/spark-26181966-7ba1-42e8-94d5-dd89495afce2
22/06/09 15:45:41 INFO ShutdownHookManager: Deleting directory /data/data13/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/spark-ddf5522f-cd98-4f5e-815e-9f2e0282d6e9
22/06/09 15:45:41 INFO ShutdownHookManager: Deleting directory /data/data3/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/spark-9c93e170-649e-49a9-8475-5dccff969f70
22/06/09 15:45:41 INFO ShutdownHookManager: Deleting directory /data/data11/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/spark-52d6896b-e40f-4d86-a7d2-f52a1597d44d
22/06/09 15:45:41 INFO ShutdownHookManager: Deleting directory /data/data4/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/spark-87c2c07c-5f7d-4301-8c6e-ec1f2dfda919
22/06/09 15:45:41 INFO ShutdownHookManager: Deleting directory /data/data17/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/spark-848d801c-701e-4acd-9d63-ebf96b50d3de
22/06/09 15:45:41 INFO ShutdownHookManager: Deleting directory /data/data16/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/spark-c81e47e8-e364-4aea-b982-b3ac67ed93df
22/06/09 15:45:41 INFO ShutdownHookManager: Deleting directory /data/data9/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/spark-c935239a-8ae9-413a-95d0-38a5e8d82596
22/06/09 15:45:41 INFO ShutdownHookManager: Deleting directory /data/data14/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/spark-c48ee6e7-50f6-44cf-a96e-2da65a3d8bf6
22/06/09 15:45:41 INFO ShutdownHookManager: Deleting directory /data/data20/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/spark-9fd06c4c-6366-47d4-ab76-8b6a894edff8
22/06/09 15:45:41 INFO ShutdownHookManager: Deleting directory /data/data23/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/spark-4b16bdae-29c4-4d65-823d-a787c9b3db27
22/06/09 15:45:41 INFO ShutdownHookManager: Deleting directory /data/data8/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/spark-56e63519-b7ce-41b2-8ef8-4e4b955cfd23
22/06/09 15:45:41 INFO ShutdownHookManager: Deleting directory /data/data21/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/spark-7b682a1d-68bb-4c06-ab6b-cf6c1965df58
22/06/09 15:45:42 INFO ShutdownHookManager: Deleting directory /data/data15/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/spark-3eb67a0e-aa78-4c56-9c05-1e86348a5d77
22/06/09 15:45:42 INFO ShutdownHookManager: Deleting directory /data/data7/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/spark-9622fcdb-a6a7-4d2c-8ee0-b0304b94d0ad
22/06/09 15:45:42 INFO ShutdownHookManager: Deleting directory /data/data5/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/spark-5e90ed13-6b73-44a4-859d-d889d5d2dd07
22/06/09 15:45:42 INFO ShutdownHookManager: Deleting directory /data/data1/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/spark-46b5b6cd-b7ae-4953-aeba-93d6e43c8c1e
22/06/09 15:45:42 INFO ShutdownHookManager: Deleting directory /data/data12/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/spark-0570dc4a-ab8b-4d44-ab0d-fdbd31700f00
22/06/09 15:45:42 INFO ShutdownHookManager: Deleting directory /data/data18/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/spark-14f1294b-3d43-49fb-a48f-5a33cec6db9a
22/06/09 15:45:42 INFO ShutdownHookManager: Deleting directory /data/data2/yarn/nm2/usercache/net_kg/appcache/application_1654741161919_19163/spark-907dcc4a-edb4-4e1b-a45e-5c96538cdc3e
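When triaging a driver log like the one above, the outer `Exception thrown in awaitResult:` wrapper is repeated several times (user-class failure, final app status, AM unregistration), but the actionable line is the innermost `Caused by:` entry — here, `Job 0 cancelled because SparkContext was shut down`. A minimal sketch of pulling that root cause out of a log file (the sample log content and the `/tmp` path below are hypothetical stand-ins, not part of this log):

```shell
# Write a tiny sample log mirroring the failure pattern above, then
# extract the last (innermost) "Caused by:" line, which names the root cause.
cat > /tmp/app_sample.log <<'EOF'
ERROR ApplicationMaster: User class threw exception: org.apache.spark.SparkException: Exception thrown in awaitResult:
Caused by: org.apache.spark.SparkException: Job 0 cancelled because SparkContext was shut down
EOF

grep 'Caused by:' /tmp/app_sample.log | tail -n 1
```

On a real cluster the full aggregated log would typically be fetched first with `yarn logs -applicationId <appId>` and the same filter applied to its output.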