Exchange 2.5.0 throws ClassNotFoundException: java.sql.Date

  • Nebula version: 2.5
  • Deployment (distributed / standalone / Docker / DBaaS): distributed
  • Production environment: no
  • Hardware
    • Disk (SSD recommended)
    • CPU and memory
  • Problem description
    Spark version: spark-2.4.8-bin-hadoop2.7
    spark-defaults.conf sets `spark.sql.hive.metastore.jars maven`

I am trying to read a Hive table with Exchange. Instead of connecting to MySQL directly for the metadata, I access the metastore through its Thrift server. The log shows that the metastore connection succeeds, but then a ClassNotFoundException: java.sql.Date is thrown.
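For reference, the Thrift-based metastore access described above is normally configured in Spark's hive-site.xml rather than via a JDBC connection to the MySQL metadata store. A minimal sketch (the host and port are placeholders, not the actual cluster values; the log below shows the real URI being picked up):

```xml
<!-- hive-site.xml: point Spark's Hive client at the metastore Thrift server -->
<!-- instead of a direct MySQL/JDBC metadata connection. Values are examples. -->
<configuration>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://metastore-host:9083</value>
  </property>
</configuration>
```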

I also tried changing scala.version to 2.12, but the v2.5 branch does not compile with it.
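For context, the spark-defaults.conf setting mentioned above tells Spark to resolve the Hive 1.2.1 metastore client jars from Maven at startup, which is what produces the long Ivy resolution section in the log below. A sketch of the relevant lines; the path-based alternative at the end is only an assumption worth trying (it keeps the isolated Hive client classloader on a known local classpath), not a confirmed fix:

```properties
# Current setting: download Hive metastore client jars from Maven at runtime.
spark.sql.hive.metastore.version  1.2.1
spark.sql.hive.metastore.jars     maven

# Hypothetical alternative: point at pre-downloaded jars instead of Maven.
# spark.sql.hive.metastore.jars   /opt/hive-1.2.1-jars/*
```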

**Log:**

[zhanqing@gs-server-7729 nebula-exchange]$ spark-submit --master "local" \
> --principal zhanqing@GRIDSUM.COM \
> --keytab ./zhanqing.keytab \
> --class com.vesoft.nebula.exchange.Exchange  nebula-exchange-2.5.0.jar -c application.conf -h
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/opt/spark-2.4.8-bin-hadoop2.7/jars/spark-unsafe_2.11-2.4.8.jar) to method java.nio.Bits.unaligned()
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
log4j:WARN No appenders could be found for logger (com.vesoft.nebula.exchange.config.Configs$).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
21/09/14 17:05:51 INFO SparkContext: Running Spark version 2.4.8
21/09/14 17:05:51 INFO SparkContext: Submitted application: com.vesoft.nebula.exchange.Exchange
21/09/14 17:05:52 INFO SecurityManager: Changing view acls to: zhanqing
21/09/14 17:05:52 INFO SecurityManager: Changing modify acls to: zhanqing
21/09/14 17:05:52 INFO SecurityManager: Changing view acls groups to:
21/09/14 17:05:52 INFO SecurityManager: Changing modify acls groups to:
21/09/14 17:05:52 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(zhanqing); groups with view permissions: Set(); users  with modify permissions: Set(zhanqing); groups with modify permissions: Set()
21/09/14 17:05:52 INFO Utils: Successfully started service 'sparkDriver' on port 54444.
21/09/14 17:05:52 INFO SparkEnv: Registering MapOutputTracker
21/09/14 17:05:52 INFO SparkEnv: Registering BlockManagerMaster
21/09/14 17:05:52 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
21/09/14 17:05:52 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
21/09/14 17:05:52 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-bc667bb5-7b89-411f-9097-a9f4c301347a
21/09/14 17:05:52 INFO MemoryStore: MemoryStore started with capacity 434.4 MB
21/09/14 17:05:52 INFO SparkEnv: Registering OutputCommitCoordinator
21/09/14 17:05:52 INFO Utils: Successfully started service 'SparkUI' on port 4040.
21/09/14 17:05:52 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://Master:4040
21/09/14 17:05:52 INFO SparkContext: Added JAR file:/opt/nebula-exchange/nebula-exchange-2.5.0.jar at spark://Master:54444/jars/nebula-exchange-2.5.0.jar with timestamp 1631610352850
21/09/14 17:05:52 INFO Executor: Starting executor ID driver on host localhost
21/09/14 17:05:53 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 46802.
21/09/14 17:05:53 INFO NettyBlockTransferService: Server created on Master:46802
21/09/14 17:05:53 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
21/09/14 17:05:53 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, Master, 46802, None)
21/09/14 17:05:53 INFO BlockManagerMasterEndpoint: Registering block manager Master:46802 with 434.4 MB RAM, BlockManagerId(driver, Master, 46802, None)
21/09/14 17:05:53 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, Master, 46802, None)
21/09/14 17:05:53 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, Master, 46802, None)
21/09/14 17:05:53 INFO Exchange$: Processing Tag user
21/09/14 17:05:53 INFO Exchange$: field keys: is_mobile
21/09/14 17:05:53 INFO Exchange$: nebula keys: is_mobile
21/09/14 17:05:53 INFO Exchange$: Loading from Hive and exec select unique_identity, if(max(is_mobile),'true','false') as is_mobile from ad.fact_ad_report where day=20200601 and profile_group=9 and  profile_id=409 and
      client_hour_of_day >= 0 and client_hour_of_day < 1   group by unique_identity limit 1000
21/09/14 17:05:53 INFO SharedState: loading hive config file: file:/opt/spark-2.4.8-bin-hadoop2.7/conf/hive-site.xml
21/09/14 17:05:53 INFO SharedState: Setting hive.metastore.warehouse.dir ('/user/hive/warehouse') to the value of spark.sql.warehouse.dir ('hdfs://10.202.42.216:8020/user/hive/warehouse/').
21/09/14 17:05:53 INFO SharedState: Warehouse path is 'hdfs://10.202.42.216:8020/user/hive/warehouse/'.
21/09/14 17:05:54 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
21/09/14 17:05:54 INFO HiveUtils: Initializing HiveMetastoreConnection version 1.2.1 using maven.
https://maven-central.storage-download.googleapis.com/repos/central/data/ added as a remote repository with the name: repo-1
Ivy Default Cache set to: /home/zhanqing@gridsum.com/.ivy2/cache
The jars for the packages stored in: /home/zhanqing@gridsum.com/.ivy2/jars
:: loading settings :: url = jar:file:/opt/spark-2.4.8-bin-hadoop2.7/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
org.apache.hive#hive-metastore added as a dependency
org.apache.hive#hive-exec added as a dependency
org.apache.hive#hive-common added as a dependency
org.apache.hive#hive-serde added as a dependency
com.google.guava#guava added as a dependency
org.apache.hadoop#hadoop-client added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent-9fc83ed8-9b1f-4d0d-ad73-47105ccdc4cd;1.0
        confs: [default]
        found org.apache.hive#hive-metastore;1.2.2 in central
        found org.apache.hive#hive-serde;1.2.2 in central
        found org.apache.hive#hive-common;1.2.2 in central
        found org.apache.hive#hive-shims;1.2.2 in central
        found org.apache.hive.shims#hive-shims-common;1.2.2 in central
        found commons-logging#commons-logging;1.1.3 in central
        found log4j#log4j;1.2.16 in central
        found log4j#apache-log4j-extras;1.2.17 in central
        found com.google.guava#guava;14.0.1 in central
        found commons-lang#commons-lang;2.6 in central
        found org.apache.thrift#libthrift;0.9.2 in central
        found org.slf4j#slf4j-api;1.7.5 in central
        found org.apache.httpcomponents#httpclient;4.4 in central
        found org.apache.httpcomponents#httpcore;4.4 in central
        found commons-codec#commons-codec;1.4 in central
        found org.apache.zookeeper#zookeeper;3.4.6 in central
        found org.slf4j#slf4j-log4j12;1.7.5 in central
        found jline#jline;2.12 in central
        found io.netty#netty;3.7.0.Final in central
        found org.apache.hive.shims#hive-shims-0.20S;1.2.2 in central
        found org.apache.hive.shims#hive-shims-0.23;1.2.2 in central
        found org.apache.hadoop#hadoop-yarn-server-resourcemanager;2.6.0 in central
        found org.apache.hadoop#hadoop-annotations;2.6.0 in central
        found com.google.inject.extensions#guice-servlet;3.0 in central
        found com.google.inject#guice;3.0 in central
        found javax.inject#javax.inject;1 in central
        found aopalliance#aopalliance;1.0 in central
        found org.sonatype.sisu.inject#cglib;2.2.1-v20090111 in central
        found asm#asm;3.2 in central
        found com.google.protobuf#protobuf-java;2.5.0 in central
        found commons-io#commons-io;2.4 in central
        found com.sun.jersey#jersey-json;1.14 in central
        found org.codehaus.jettison#jettison;1.1 in central
        found com.sun.xml.bind#jaxb-impl;2.2.3-1 in central
        found javax.xml.bind#jaxb-api;2.2.2 in central
        found javax.xml.stream#stax-api;1.0-2 in central
        found javax.activation#activation;1.1 in central
        found org.codehaus.jackson#jackson-core-asl;1.9.2 in central
        found org.codehaus.jackson#jackson-mapper-asl;1.9.2 in central
        found org.codehaus.jackson#jackson-jaxrs;1.9.2 in central
        found org.codehaus.jackson#jackson-xc;1.9.2 in central
        found com.sun.jersey.contribs#jersey-guice;1.9 in central
        found org.apache.hadoop#hadoop-yarn-common;2.6.0 in central
        found org.apache.hadoop#hadoop-yarn-api;2.6.0 in central
        found org.apache.commons#commons-compress;1.4.1 in central
        found org.tukaani#xz;1.0 in central
        found org.mortbay.jetty#jetty-util;6.1.26 in central
        found com.sun.jersey#jersey-core;1.14 in central
        found com.sun.jersey#jersey-client;1.9 in central
        found commons-cli#commons-cli;1.2 in central
        found com.sun.jersey#jersey-server;1.14 in central
        found org.apache.hadoop#hadoop-yarn-server-common;2.6.0 in central
        found org.fusesource.leveldbjni#leveldbjni-all;1.8 in central
        found org.apache.hadoop#hadoop-yarn-server-applicationhistoryservice;2.6.0 in central
        found commons-collections#commons-collections;3.2.2 in central
        found org.apache.hadoop#hadoop-yarn-server-web-proxy;2.6.0 in central
        found commons-httpclient#commons-httpclient;3.0.1 in central
        found junit#junit;4.11 in central
        found org.hamcrest#hamcrest-core;1.3 in central
        found org.mortbay.jetty#jetty;6.1.26 in central
        found org.apache.hive.shims#hive-shims-scheduler;1.2.2 in central
        found joda-time#joda-time;2.5 in central
        found org.apache.ant#ant;1.9.1 in central
        found org.apache.ant#ant-launcher;1.9.1 in central
        found org.json#json;20090211 in central
        found com.google.code.findbugs#jsr305;3.0.0 in central
        found org.apache.avro#avro;1.7.5 in central
        found com.thoughtworks.paranamer#paranamer;2.3 in central
        found org.xerial.snappy#snappy-java;1.0.5 in central
        found net.sf.opencsv#opencsv;2.3 in central
        found com.twitter#parquet-hadoop-bundle;1.6.0 in central
        found com.jolbox#bonecp;0.8.0.RELEASE in central
        found org.apache.derby#derby;10.10.2.0 in central
        found org.datanucleus#datanucleus-api-jdo;3.2.6 in central
        found org.datanucleus#datanucleus-core;3.2.10 in central
        found org.datanucleus#datanucleus-rdbms;3.2.9 in central
        found commons-pool#commons-pool;1.5.4 in central
        found commons-dbcp#commons-dbcp;1.4 in central
        found javax.jdo#jdo-api;3.0.1 in central
        found javax.transaction#jta;1.1 in central
        found org.antlr#antlr-runtime;3.4 in central
        found org.antlr#stringtemplate;3.2.1 in central
        found antlr#antlr;2.7.7 in central
        found org.apache.thrift#libfb303;0.9.2 in central
        found org.apache.hive#hive-exec;1.2.2 in central
        found org.apache.hive#hive-ant;1.2.2 in central
        found org.apache.velocity#velocity;1.5 in central
        found oro#oro;2.0.8 in central
        found org.antlr#ST4;4.0.4 in central
        found org.apache.ivy#ivy;2.4.0 in central
        found org.codehaus.groovy#groovy-all;2.1.6 in central
        found org.apache.calcite#calcite-core;1.2.0-incubating in central
        found org.apache.calcite#calcite-avatica;1.2.0-incubating in central
        found org.apache.calcite#calcite-linq4j;1.2.0-incubating in central
        found net.hydromatic#eigenbase-properties;1.1.5 in central
        found org.codehaus.janino#janino;2.7.6 in central
        found org.codehaus.janino#commons-compiler;2.7.6 in central
        found stax#stax-api;1.0.1 in central
        found org.apache.hadoop#hadoop-client;2.7.3 in central
        found org.apache.hadoop#hadoop-common;2.7.3 in central
        found org.apache.hadoop#hadoop-annotations;2.7.3 in central
        found org.apache.commons#commons-math3;3.1.1 in central
        found xmlenc#xmlenc;0.52 in central
        found commons-httpclient#commons-httpclient;3.1 in central
        found commons-net#commons-net;3.1 in central
        found log4j#log4j;1.2.17 in central
        found commons-configuration#commons-configuration;1.6 in central
        found commons-digester#commons-digester;1.8 in central
        found commons-beanutils#commons-beanutils;1.7.0 in central
        found commons-beanutils#commons-beanutils-core;1.8.0 in central
        found org.slf4j#slf4j-api;1.7.10 in central
        found org.codehaus.jackson#jackson-core-asl;1.9.13 in central
        found org.codehaus.jackson#jackson-mapper-asl;1.9.13 in central
        found com.google.code.gson#gson;2.2.4 in central
        found org.apache.hadoop#hadoop-auth;2.7.3 in central
        found org.apache.directory.server#apacheds-kerberos-codec;2.0.0-M15 in central
        found org.apache.directory.server#apacheds-i18n;2.0.0-M15 in central
        found org.apache.directory.api#api-asn1-api;1.0.0-M20 in central
        found org.apache.directory.api#api-util;1.0.0-M20 in central
        found org.apache.htrace#htrace-core;3.1.0-incubating in central
        found javax.servlet.jsp#jsp-api;2.1 in central
        found org.slf4j#slf4j-log4j12;1.7.10 in central
        found org.apache.hadoop#hadoop-hdfs;2.7.3 in central
        found io.netty#netty-all;4.0.23.Final in central
        found xerces#xercesImpl;2.9.1 in central
        found xml-apis#xml-apis;1.3.04 in central
        found org.apache.hadoop#hadoop-mapreduce-client-app;2.7.3 in central
        found org.apache.hadoop#hadoop-mapreduce-client-common;2.7.3 in central
        found org.apache.hadoop#hadoop-yarn-common;2.7.3 in central
        found org.apache.hadoop#hadoop-yarn-api;2.7.3 in central
        found org.codehaus.jackson#jackson-jaxrs;1.9.13 in central
        found org.codehaus.jackson#jackson-xc;1.9.13 in central
        found org.apache.hadoop#hadoop-yarn-client;2.7.3 in central
        found org.apache.hadoop#hadoop-mapreduce-client-core;2.7.3 in central
        found org.apache.hadoop#hadoop-yarn-server-common;2.7.3 in central
        found org.apache.hadoop#hadoop-mapreduce-client-shuffle;2.7.3 in central
        found org.apache.hadoop#hadoop-mapreduce-client-jobclient;2.7.3 in central
:: resolution report :: resolve 2447ms :: artifacts dl 47ms
        :: modules in use:
        antlr#antlr;2.7.7 from central in [default]
        aopalliance#aopalliance;1.0 from central in [default]
        asm#asm;3.2 from central in [default]
        com.google.code.findbugs#jsr305;3.0.0 from central in [default]
        com.google.code.gson#gson;2.2.4 from central in [default]
        com.google.guava#guava;14.0.1 from central in [default]
        com.google.inject#guice;3.0 from central in [default]
        com.google.inject.extensions#guice-servlet;3.0 from central in [default]
        com.google.protobuf#protobuf-java;2.5.0 from central in [default]
        com.jolbox#bonecp;0.8.0.RELEASE from central in [default]
        com.sun.jersey#jersey-client;1.9 from central in [default]
        com.sun.jersey#jersey-core;1.14 from central in [default]
        com.sun.jersey#jersey-json;1.14 from central in [default]
        com.sun.jersey#jersey-server;1.14 from central in [default]
        com.sun.jersey.contribs#jersey-guice;1.9 from central in [default]
        com.sun.xml.bind#jaxb-impl;2.2.3-1 from central in [default]
        com.thoughtworks.paranamer#paranamer;2.3 from central in [default]
        com.twitter#parquet-hadoop-bundle;1.6.0 from central in [default]
        commons-beanutils#commons-beanutils;1.7.0 from central in [default]
        commons-beanutils#commons-beanutils-core;1.8.0 from central in [default]
        commons-cli#commons-cli;1.2 from central in [default]
        commons-codec#commons-codec;1.4 from central in [default]
        commons-collections#commons-collections;3.2.2 from central in [default]
        commons-configuration#commons-configuration;1.6 from central in [default]
        commons-dbcp#commons-dbcp;1.4 from central in [default]
        commons-digester#commons-digester;1.8 from central in [default]
        commons-httpclient#commons-httpclient;3.1 from central in [default]
        commons-io#commons-io;2.4 from central in [default]
        commons-lang#commons-lang;2.6 from central in [default]
        commons-logging#commons-logging;1.1.3 from central in [default]
        commons-net#commons-net;3.1 from central in [default]
        commons-pool#commons-pool;1.5.4 from central in [default]
        io.netty#netty;3.7.0.Final from central in [default]
        io.netty#netty-all;4.0.23.Final from central in [default]
        javax.activation#activation;1.1 from central in [default]
        javax.inject#javax.inject;1 from central in [default]
        javax.jdo#jdo-api;3.0.1 from central in [default]
        javax.servlet.jsp#jsp-api;2.1 from central in [default]
        javax.transaction#jta;1.1 from central in [default]
        javax.xml.bind#jaxb-api;2.2.2 from central in [default]
        javax.xml.stream#stax-api;1.0-2 from central in [default]
        jline#jline;2.12 from central in [default]
        joda-time#joda-time;2.5 from central in [default]
        log4j#apache-log4j-extras;1.2.17 from central in [default]
        log4j#log4j;1.2.17 from central in [default]
        net.hydromatic#eigenbase-properties;1.1.5 from central in [default]
        net.sf.opencsv#opencsv;2.3 from central in [default]
        org.antlr#ST4;4.0.4 from central in [default]
        org.antlr#antlr-runtime;3.4 from central in [default]
        org.antlr#stringtemplate;3.2.1 from central in [default]
        org.apache.ant#ant;1.9.1 from central in [default]
        org.apache.ant#ant-launcher;1.9.1 from central in [default]
        org.apache.avro#avro;1.7.5 from central in [default]
        org.apache.calcite#calcite-avatica;1.2.0-incubating from central in [default]
        org.apache.calcite#calcite-core;1.2.0-incubating from central in [default]
        org.apache.calcite#calcite-linq4j;1.2.0-incubating from central in [default]
        org.apache.commons#commons-compress;1.4.1 from central in [default]
        org.apache.commons#commons-math3;3.1.1 from central in [default]
        org.apache.derby#derby;10.10.2.0 from central in [default]
        org.apache.directory.api#api-asn1-api;1.0.0-M20 from central in [default]
        org.apache.directory.api#api-util;1.0.0-M20 from central in [default]
        org.apache.directory.server#apacheds-i18n;2.0.0-M15 from central in [default]
        org.apache.directory.server#apacheds-kerberos-codec;2.0.0-M15 from central in [default]
        org.apache.hadoop#hadoop-annotations;2.7.3 from central in [default]
        org.apache.hadoop#hadoop-auth;2.7.3 from central in [default]
        org.apache.hadoop#hadoop-client;2.7.3 from central in [default]
        org.apache.hadoop#hadoop-common;2.7.3 from central in [default]
        org.apache.hadoop#hadoop-hdfs;2.7.3 from central in [default]
        org.apache.hadoop#hadoop-mapreduce-client-app;2.7.3 from central in [default]
        org.apache.hadoop#hadoop-mapreduce-client-common;2.7.3 from central in [default]
        org.apache.hadoop#hadoop-mapreduce-client-core;2.7.3 from central in [default]
        org.apache.hadoop#hadoop-mapreduce-client-jobclient;2.7.3 from central in [default]
        org.apache.hadoop#hadoop-mapreduce-client-shuffle;2.7.3 from central in [default]
        org.apache.hadoop#hadoop-yarn-api;2.7.3 from central in [default]
        org.apache.hadoop#hadoop-yarn-client;2.7.3 from central in [default]
        org.apache.hadoop#hadoop-yarn-common;2.7.3 from central in [default]
        org.apache.hadoop#hadoop-yarn-server-applicationhistoryservice;2.6.0 from central in [default]
        org.apache.hadoop#hadoop-yarn-server-common;2.7.3 from central in [default]
        org.apache.hadoop#hadoop-yarn-server-resourcemanager;2.6.0 from central in [default]
        org.apache.hadoop#hadoop-yarn-server-web-proxy;2.6.0 from central in [default]
        org.apache.hive#hive-ant;1.2.2 from central in [default]
        org.apache.hive#hive-common;1.2.2 from central in [default]
        org.apache.hive#hive-exec;1.2.2 from central in [default]
        org.apache.hive#hive-metastore;1.2.2 from central in [default]
        org.apache.hive#hive-serde;1.2.2 from central in [default]
        org.apache.hive#hive-shims;1.2.2 from central in [default]
        org.apache.hive.shims#hive-shims-0.20S;1.2.2 from central in [default]
        org.apache.hive.shims#hive-shims-0.23;1.2.2 from central in [default]
        org.apache.hive.shims#hive-shims-common;1.2.2 from central in [default]
        org.apache.hive.shims#hive-shims-scheduler;1.2.2 from central in [default]
        org.apache.htrace#htrace-core;3.1.0-incubating from central in [default]
        org.apache.httpcomponents#httpclient;4.4 from central in [default]
        org.apache.httpcomponents#httpcore;4.4 from central in [default]
        org.apache.ivy#ivy;2.4.0 from central in [default]
        org.apache.thrift#libfb303;0.9.2 from central in [default]
        org.apache.thrift#libthrift;0.9.2 from central in [default]
        org.apache.velocity#velocity;1.5 from central in [default]
        org.apache.zookeeper#zookeeper;3.4.6 from central in [default]
        org.codehaus.groovy#groovy-all;2.1.6 from central in [default]
        org.codehaus.jackson#jackson-core-asl;1.9.13 from central in [default]
        org.codehaus.jackson#jackson-jaxrs;1.9.13 from central in [default]
        org.codehaus.jackson#jackson-mapper-asl;1.9.13 from central in [default]
        org.codehaus.jackson#jackson-xc;1.9.13 from central in [default]
        org.codehaus.janino#commons-compiler;2.7.6 from central in [default]
        org.codehaus.janino#janino;2.7.6 from central in [default]
        org.codehaus.jettison#jettison;1.1 from central in [default]
        org.datanucleus#datanucleus-api-jdo;3.2.6 from central in [default]
        org.datanucleus#datanucleus-core;3.2.10 from central in [default]
        org.datanucleus#datanucleus-rdbms;3.2.9 from central in [default]
        org.fusesource.leveldbjni#leveldbjni-all;1.8 from central in [default]
        org.json#json;20090211 from central in [default]
        org.mortbay.jetty#jetty;6.1.26 from central in [default]
        org.mortbay.jetty#jetty-util;6.1.26 from central in [default]
        org.slf4j#slf4j-api;1.7.10 from central in [default]
        org.slf4j#slf4j-log4j12;1.7.10 from central in [default]
        org.sonatype.sisu.inject#cglib;2.2.1-v20090111 from central in [default]
        org.tukaani#xz;1.0 from central in [default]
        org.xerial.snappy#snappy-java;1.0.5 from central in [default]
        oro#oro;2.0.8 from central in [default]
        stax#stax-api;1.0.1 from central in [default]
        xerces#xercesImpl;2.9.1 from central in [default]
        xml-apis#xml-apis;1.3.04 from central in [default]
        xmlenc#xmlenc;0.52 from central in [default]
        :: evicted modules:
        log4j#log4j;1.2.16 by [log4j#log4j;1.2.17] in [default]
        org.slf4j#slf4j-api;1.7.5 by [org.slf4j#slf4j-api;1.7.10] in [default]
        org.slf4j#slf4j-log4j12;1.7.5 by [org.slf4j#slf4j-log4j12;1.7.10] in [default]
        org.apache.hadoop#hadoop-annotations;2.6.0 by [org.apache.hadoop#hadoop-annotations;2.7.3] in [default]
        org.codehaus.jackson#jackson-core-asl;1.9.2 by [org.codehaus.jackson#jackson-core-asl;1.9.13] in [default]
        org.codehaus.jackson#jackson-mapper-asl;1.9.2 by [org.codehaus.jackson#jackson-mapper-asl;1.9.13] in [default]
        org.codehaus.jackson#jackson-jaxrs;1.9.2 by [org.codehaus.jackson#jackson-jaxrs;1.9.13] in [default]
        org.codehaus.jackson#jackson-xc;1.9.2 by [org.codehaus.jackson#jackson-xc;1.9.13] in [default]
        org.apache.hadoop#hadoop-yarn-common;2.6.0 by [org.apache.hadoop#hadoop-yarn-common;2.7.3] in [default]
        org.apache.hadoop#hadoop-yarn-api;2.6.0 by [org.apache.hadoop#hadoop-yarn-api;2.7.3] in [default]
        org.apache.hadoop#hadoop-yarn-server-common;2.6.0 by [org.apache.hadoop#hadoop-yarn-server-common;2.7.3] in [default]
        commons-httpclient#commons-httpclient;3.0.1 by [commons-httpclient#commons-httpclient;3.1] in [default]
        junit#junit;4.11 transitively in [default]
        org.hamcrest#hamcrest-core;1.3 transitively in [default]
        com.google.code.findbugs#jsr305;1.3.9 by [com.google.code.findbugs#jsr305;3.0.0] in [default]
        com.google.guava#guava;11.0.2 by [com.google.guava#guava;14.0.1] in [default]
        org.apache.avro#avro;1.7.4 by [org.apache.avro#avro;1.7.5] in [default]
        org.apache.httpcomponents#httpclient;4.2.5 by [org.apache.httpcomponents#httpclient;4.4] in [default]
        io.netty#netty;3.6.2.Final by [io.netty#netty;3.7.0.Final] in [default]
        com.sun.jersey#jersey-core;1.9 by [com.sun.jersey#jersey-core;1.14] in [default]
        com.sun.jersey#jersey-server;1.9 by [com.sun.jersey#jersey-server;1.14] in [default]
        com.sun.jersey#jersey-json;1.9 by [com.sun.jersey#jersey-json;1.14] in [default]
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |  145  |   0   |   0   |   22  ||  123  |   0   |
        ---------------------------------------------------------------------
:: retrieving :: org.apache.spark#spark-submit-parent-9fc83ed8-9b1f-4d0d-ad73-47105ccdc4cd
        confs: [default]
        0 artifacts copied, 123 already retrieved (0kB/38ms)
21/09/14 17:05:57 INFO IsolatedClientLoader: Downloaded metastore jars to /tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9
21/09/14 17:05:57 INFO metastore: Trying to connect to metastore with URI thrift://gs-server-1046:9083
21/09/14 17:05:58 INFO metastore: Connected to metastore.
Exception in thread "main" java.lang.ClassNotFoundException: java.lang.NoClassDefFoundError: java/sql/Date when creating Hive client using classpath: file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/com.google.protobuf_protobuf-java-2.5.0.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/commons-net_commons-net-3.1.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/aopalliance_aopalliance-1.0.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.codehaus.jackson_jackson-mapper-asl-1.9.13.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.datanucleus_datanucleus-rdbms-3.2.9.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/com.twitter_parquet-hadoop-bundle-1.6.0.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.thrift_libfb303-0.9.2.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.hadoop_hadoop-common-2.7.3.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.ant_ant-1.9.1.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.antlr_ST4-4.0.4.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.commons_commons-compress-1.4.1.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/joda-time_joda-time-2.5.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.slf4j_slf4j-api-1.7.10.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.antlr_antlr-runtime-3.4.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.hadoop_hadoop-mapreduce-client-shuffle-2.7.3.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/commons-dbcp_commons-dbcp-1.4.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/jline_jline-2.12.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/oro_oro-2.0.8.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/net.sf.opencsv_opencsv-2.3.jar, 
file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.zookeeper_zookeeper-3.4.6.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.httpcomponents_httpcore-4.4.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/xerces_xercesImpl-2.9.1.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/com.sun.xml.bind_jaxb-impl-2.2.3-1.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/com.sun.jersey_jersey-json-1.14.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.sonatype.sisu.inject_cglib-2.2.1-v20090111.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.thrift_libthrift-0.9.2.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.httpcomponents_httpclient-4.4.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/stax_stax-api-1.0.1.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.hadoop_hadoop-auth-2.7.3.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.hadoop_hadoop-yarn-server-common-2.7.3.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.hive_hive-serde-1.2.2.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/commons-lang_commons-lang-2.6.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.codehaus.jackson_jackson-xc-1.9.13.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/log4j_log4j-1.2.17.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/commons-logging_commons-logging-1.1.3.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.mortbay.jetty_jetty-6.1.26.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/javax.transaction_jta-1.1.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.json_json-20090211.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.ivy_ivy-2.4.0.jar, 
file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.hadoop_hadoop-mapreduce-client-jobclient-2.7.3.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/com.thoughtworks.paranamer_paranamer-2.3.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.hive_hive-exec-1.2.2.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.hive_hive-metastore-1.2.2.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/javax.inject_javax.inject-1.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.calcite_calcite-core-1.2.0-incubating.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.hive_hive-common-1.2.2.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.hadoop_hadoop-mapreduce-client-core-2.7.3.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/commons-codec_commons-codec-1.4.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.hive_hive-shims-1.2.2.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.avro_avro-1.7.5.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.commons_commons-math3-3.1.1.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/com.google.guava_guava-14.0.1.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.hive.shims_hive-shims-scheduler-1.2.2.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.hadoop_hadoop-client-2.7.3.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.codehaus.jackson_jackson-core-asl-1.9.13.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.antlr_stringtemplate-3.2.1.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/com.google.inject_guice-3.0.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/xml-apis_xml-apis-1.3.04.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/antlr_antlr-2.7.7.jar, 
file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/javax.xml.bind_jaxb-api-2.2.2.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.hive.shims_hive-shims-0.23-1.2.2.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.codehaus.janino_commons-compiler-2.7.6.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/javax.servlet.jsp_jsp-api-2.1.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.calcite_calcite-avatica-1.2.0-incubating.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/javax.activation_activation-1.1.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.hadoop_hadoop-yarn-common-2.7.3.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/com.google.code.findbugs_jsr305-3.0.0.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.hive.shims_hive-shims-0.20S-1.2.2.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/commons-cli_commons-cli-1.2.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/com.google.inject.extensions_guice-servlet-3.0.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.derby_derby-10.10.2.0.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/commons-pool_commons-pool-1.5.4.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.directory.api_api-util-1.0.0-M20.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.directory.server_apacheds-i18n-2.0.0-M15.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.hadoop_hadoop-yarn-client-2.7.3.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.hadoop_hadoop-yarn-api-2.7.3.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/commons-beanutils_commons-beanutils-1.7.0.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/commons-beanutils_commons-beanutils-core-1.8.0.jar, 
file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.directory.server_apacheds-kerberos-codec-2.0.0-M15.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/commons-collections_commons-collections-3.2.2.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.tukaani_xz-1.0.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.hadoop_hadoop-yarn-server-web-proxy-2.6.0.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.htrace_htrace-core-3.1.0-incubating.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/com.sun.jersey_jersey-client-1.9.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.ant_ant-launcher-1.9.1.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/com.jolbox_bonecp-0.8.0.RELEASE.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.codehaus.jettison_jettison-1.1.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/commons-io_commons-io-2.4.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.hive_hive-ant-1.2.2.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.calcite_calcite-linq4j-1.2.0-incubating.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/com.sun.jersey_jersey-core-1.14.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.codehaus.janino_janino-2.7.6.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.hadoop_hadoop-yarn-server-applicationhistoryservice-2.6.0.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/commons-digester_commons-digester-1.8.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/com.google.code.gson_gson-2.2.4.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/asm_asm-3.2.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/commons-configuration_commons-configuration-1.6.jar, 
file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/log4j_apache-log4j-extras-1.2.17.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/io.netty_netty-3.7.0.Final.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.hadoop_hadoop-mapreduce-client-common-2.7.3.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.fusesource.leveldbjni_leveldbjni-all-1.8.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/javax.jdo_jdo-api-3.0.1.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.mortbay.jetty_jetty-util-6.1.26.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.hive.shims_hive-shims-common-1.2.2.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/io.netty_netty-all-4.0.23.Final.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.directory.api_api-asn1-api-1.0.0-M20.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.codehaus.groovy_groovy-all-2.1.6.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.slf4j_slf4j-log4j12-1.7.10.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/com.sun.jersey_jersey-server-1.14.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/xmlenc_xmlenc-0.52.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.hadoop_hadoop-yarn-server-resourcemanager-2.6.0.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.velocity_velocity-1.5.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/javax.xml.stream_stax-api-1.0-2.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.datanucleus_datanucleus-core-3.2.10.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.datanucleus_datanucleus-api-jdo-3.2.6.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/commons-httpclient_commons-httpclient-3.1.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.hadoop_hadoop-annotations-2.7.3.jar, 
file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.hadoop_hadoop-mapreduce-client-app-2.7.3.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/com.sun.jersey.contribs_jersey-guice-1.9.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/net.hydromatic_eigenbase-properties-1.1.5.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.xerial.snappy_snappy-java-1.0.5.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.apache.hadoop_hadoop-hdfs-2.7.3.jar, file:/tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9/org.codehaus.jackson_jackson-jaxrs-1.9.13.jar
Please make sure that jars for your version of hive and hadoop are included in the paths passed to spark.sql.hive.metastore.jars.
        at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:277)
        at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:392)
        at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:294)
        at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
        at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
        at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:215)
        at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
        at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
        at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
        at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:214)
        at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:116)
        at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:104)
        at org.apache.spark.sql.internal.SharedState.globalTempViewManager$lzycompute(SharedState.scala:143)
        at org.apache.spark.sql.internal.SharedState.globalTempViewManager(SharedState.scala:138)
        at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anonfun$2.apply(HiveSessionStateBuilder.scala:55)
        at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anonfun$2.apply(HiveSessionStateBuilder.scala:55)
        at org.apache.spark.sql.catalyst.catalog.SessionCatalog.globalTempViewManager$lzycompute(SessionCatalog.scala:91)
        at org.apache.spark.sql.catalyst.catalog.SessionCatalog.globalTempViewManager(SessionCatalog.scala:91)
        at org.apache.spark.sql.catalyst.catalog.SessionCatalog.isTemporaryTable(SessionCatalog.scala:736)
        at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.isRunningDirectlyOnFiles(Analyzer.scala:814)
        at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.resolveRelation(Analyzer.scala:748)
        at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$8.applyOrElse(Analyzer.scala:780)
        at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$8.applyOrElse(Analyzer.scala:773)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1$$anonfun$apply$1.apply(AnalysisHelper.scala:90)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1$$anonfun$apply$1.apply(AnalysisHelper.scala:90)
        at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:69)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1.apply(AnalysisHelper.scala:89)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1.apply(AnalysisHelper.scala:86)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:194)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$class.resolveOperatorsUp(AnalysisHelper.scala:86)
        at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsUp(LogicalPlan.scala:29)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1$$anonfun$1.apply(AnalysisHelper.scala:87)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1$$anonfun$1.apply(AnalysisHelper.scala:87)
        at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:328)
        at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:186)
        at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:326)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1.apply(AnalysisHelper.scala:87)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1.apply(AnalysisHelper.scala:86)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:194)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$class.resolveOperatorsUp(AnalysisHelper.scala:86)
        at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsUp(LogicalPlan.scala:29)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1$$anonfun$1.apply(AnalysisHelper.scala:87)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1$$anonfun$1.apply(AnalysisHelper.scala:87)
        at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:328)
        at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:186)
        at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:326)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1.apply(AnalysisHelper.scala:87)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1.apply(AnalysisHelper.scala:86)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:194)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$class.resolveOperatorsUp(AnalysisHelper.scala:86)
        at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsUp(LogicalPlan.scala:29)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1$$anonfun$1.apply(AnalysisHelper.scala:87)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1$$anonfun$1.apply(AnalysisHelper.scala:87)
        at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:328)
        at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:186)
        at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:326)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1.apply(AnalysisHelper.scala:87)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1.apply(AnalysisHelper.scala:86)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:194)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$class.resolveOperatorsUp(AnalysisHelper.scala:86)
        at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsUp(LogicalPlan.scala:29)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1$$anonfun$1.apply(AnalysisHelper.scala:87)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1$$anonfun$1.apply(AnalysisHelper.scala:87)
        at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:328)
        at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:186)
        at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:326)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1.apply(AnalysisHelper.scala:87)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1.apply(AnalysisHelper.scala:86)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:194)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$class.resolveOperatorsUp(AnalysisHelper.scala:86)
        at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsUp(LogicalPlan.scala:29)
        at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.apply(Analyzer.scala:773)
        at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.apply(Analyzer.scala:719)
        at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1$$anonfun$apply$1.apply(RuleExecutor.scala:87)
        at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1$$anonfun$apply$1.apply(RuleExecutor.scala:84)
        at scala.collection.LinearSeqOptimized$class.foldLeft(LinearSeqOptimized.scala:124)
        at scala.collection.immutable.List.foldLeft(List.scala:84)
        at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1.apply(RuleExecutor.scala:84)
        at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1.apply(RuleExecutor.scala:76)
        at scala.collection.immutable.List.foreach(List.scala:392)
        at org.apache.spark.sql.catalyst.rules.RuleExecutor.execute(RuleExecutor.scala:76)
        at org.apache.spark.sql.catalyst.analysis.Analyzer.org$apache$spark$sql$catalyst$analysis$Analyzer$$executeSameContext(Analyzer.scala:127)
        at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:121)
        at org.apache.spark.sql.catalyst.analysis.Analyzer$$anonfun$executeAndCheck$1.apply(Analyzer.scala:106)
        at org.apache.spark.sql.catalyst.analysis.Analyzer$$anonfun$executeAndCheck$1.apply(Analyzer.scala:105)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:201)
        at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:105)
        at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:58)
        at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:56)
        at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:48)
        at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:78)
        at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:643)
        at com.vesoft.nebula.exchange.reader.HiveReader.read(ServerBaseReader.scala:71)
        at com.vesoft.nebula.exchange.Exchange$.com$vesoft$nebula$exchange$Exchange$$createDataSource(Exchange.scala:271)
        at com.vesoft.nebula.exchange.Exchange$$anonfun$main$2.apply(Exchange.scala:136)
        at com.vesoft.nebula.exchange.Exchange$$anonfun$main$2.apply(Exchange.scala:128)
        at scala.collection.immutable.List.foreach(List.scala:392)
        at com.vesoft.nebula.exchange.Exchange$.main(Exchange.scala:128)
        at com.vesoft.nebula.exchange.Exchange.main(Exchange.scala)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.base/java.lang.reflect.Method.invoke(Method.java:566)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:855)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:930)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:939)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.reflect.InvocationTargetException
        at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
        at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:271)
        ... 110 more
Caused by: java.lang.NoClassDefFoundError: java/sql/Date
        at java.base/java.lang.Class.getDeclaredConstructors0(Native Method)
        at java.base/java.lang.Class.privateGetDeclaredConstructors(Class.java:3138)
        at java.base/java.lang.Class.getConstructor0(Class.java:3343)
        at java.base/java.lang.Class.getDeclaredConstructor(Class.java:2554)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:128)
        at org.apache.hadoop.hive.ql.exec.Registry.registerGenericUDF(Registry.java:145)
        at org.apache.hadoop.hive.ql.exec.FunctionRegistry.<clinit>(FunctionRegistry.java:178)
        at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:178)
        at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
        at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:186)
        at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:119)
        ... 115 more
Caused by: java.lang.ClassNotFoundException: java.sql.Date
        at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:471)
        at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:588)
        at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1.doLoadClass(IsolatedClientLoader.scala:226)
        at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1.loadClass(IsolatedClientLoader.scala:215)
        at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
        ... 127 more
21/09/14 17:05:58 INFO SparkContext: Invoking stop() from shutdown hook
21/09/14 17:05:58 INFO SparkUI: Stopped Spark web UI at http://Master:4040
21/09/14 17:05:58 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
21/09/14 17:05:58 INFO MemoryStore: MemoryStore cleared
21/09/14 17:05:58 INFO BlockManager: BlockManager stopped
21/09/14 17:05:58 INFO BlockManagerMaster: BlockManagerMaster stopped
21/09/14 17:05:58 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
21/09/14 17:05:58 INFO SparkContext: Successfully stopped SparkContext
21/09/14 17:05:58 INFO ShutdownHookManager: Shutdown hook called
21/09/14 17:05:58 INFO ShutdownHookManager: Deleting directory /tmp/hive-v1_2-11d69579-75b1-485a-95a2-2e35a47e55c9
21/09/14 17:05:58 INFO ShutdownHookManager: Deleting directory /tmp/spark-d9de9256-8226-45fd-9486-c2cd894d764b
21/09/14 17:05:58 INFO ShutdownHookManager: Deleting directory /tmp/spark-ca10f32c-2e10-4a8d-a489-f8db5b514c30

报错信息好像是某个 jar 没有引用到,你贴一下配置文件呢

Besides the two config files below, are any others needed?

**spark-defaults.conf:**

spark.master Master
spark.sql.hive.metastore.jars  maven


**application.conf**
{
  spark: {
    app: {
      name: Nebula Exchange 2.5
    }

    master:10.202.41.74

    driver: {
      cores: 1
      maxResultSize: 1G
    }

    executor: {
        memory:1G
    }

    cores:{
      max: 16
    }
  }


  nebula: {
    address:{
      graph:["10.202.41.81:9669","10.202.41.74:9669","10.202.40.174:9669"]
      meta:["10.202.41.81:9559","10.202.41.74:9559","10.202.40.174:9559"]
    }
    user: root
    pswd: nebula
    space: ad_graph


    connection {
      timeout: 30000
    }

    error: {
      max: 32
      output: /tmp/errors
    }

    rate: {
      limit: 1024
      timeout: 1000
    }
  }


  tags: [
    {
      name: user
      type: {
        source: hive
        sink: client
      }
      exec: """select unique_identity, if(max(is_mobile),'true','false') as is_mobile from ad.fact_ad_report where day=20200601 and profile_group=9 and  profile_id=409 and 
      client_hour_of_day >= 0 and client_hour_of_day < 1   group by unique_identity limit 1000"""
      fields: [is_mobile]
      nebula.fields: [is_mobile]
      vertex: {
        field: unique_identity
      }
      batch: 256
      partition: 32
    }
  ]
}

After I set spark.sql.hive.metastore.version to 1.1.0 in spark-defaults.conf, that exception went away, but a different one appeared. From what I found, Spark 2.4 bundles ASM 6, which does not support Java 11; installing Spark 3.x should resolve it.
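For reference, the workaround above amounts to pinning the metastore client version in spark-defaults.conf; the 1.1.0 value below is simply the one reported to work in this thread, so match it to your own Hive deployment:

```properties
# spark-defaults.conf — pin the Hive metastore client version so Spark
# resolves matching client jars via Maven instead of using its built-in
# Hive 1.2 client.
spark.sql.hive.metastore.version  1.1.0
spark.sql.hive.metastore.jars     maven
```

The same settings can also be passed per job with `spark-submit --conf spark.sql.hive.metastore.version=1.1.0 --conf spark.sql.hive.metastore.jars=maven ...`, which avoids editing the cluster-wide defaults.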

My questions are:

  1. I see that pom.xml sets -target:jvm-1.8 for maven-scala-plugin, and my JAVA_HOME points to a 1.8 JDK, yet the build output is Java 11 bytecode. Where can I force the build down to 1.8?

  2. If I upgrade Spark to 3.x, will Exchange 2.5 still run correctly?


Exception in thread "main" java.lang.IllegalArgumentException: Unsupported class file major version 55
at org.apache.xbean.asm6.ClassReader.&lt;init&gt;(ClassReader.java:166)
at org.apache.xbean.asm6.ClassReader.&lt;init&gt;(ClassReader.java:148)
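"Unsupported class file major version 55" means ASM was handed bytecode compiled for Java 11 (class file major version 52 is Java 8, 55 is Java 11). The version is stored in bytes 6–7 of every .class file; the sketch below fabricates an 8-byte header just to show the decoding, so to check a real build you would instead extract a class from the jar, e.g. with `unzip -p`:

```shell
# Bytes 0-3 of a .class file are the 0xCAFEBABE magic; bytes 6-7 hold the
# big-endian class-file major version (52 = Java 8, 55 = Java 11).
# Fake header with major version 55 (octal \067), for illustration only.
printf '\312\376\272\276\000\000\000\067' > /tmp/header.class
major=$(od -An -j6 -N2 -t u1 /tmp/header.class | awk '{print $1*256+$2}')
echo "class file major version: $major"   # prints 55, i.e. Java 11
```

Running the same `od` pipeline against a class pulled out of nebula-exchange-2.5.0.jar shows immediately which JDK actually compiled it.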

It turned out to be a problem with my own environment; after logging out and back in, the build worked and now targets 1.8.
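The "environment problem" here is typically a PATH/JAVA_HOME mismatch: Maven invokes whichever javac is first on PATH, which can be a JDK 11 even while JAVA_HOME points at 1.8, and re-logging in resets PATH. A quick sanity check before rebuilding (no project-specific assumptions):

```shell
# Show which JDK a build would actually use versus what JAVA_HOME claims;
# a mismatch explains getting Java 11 bytecode from a "1.8" build.
printf 'JAVA_HOME = %s\n' "${JAVA_HOME:-<unset>}"
command -v javac && javac -version 2>&1 || echo 'javac not on PATH'
```

If the two disagree, prepending `$JAVA_HOME/bin` to PATH for the build shell is usually enough.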
