Sep 7, 2015

Strange Spark problem: "Error:scalac: error while loading ..., Missing dependency bad symbolic reference..."

I am very new to Spark and am trying to develop with it on Windows using IntelliJ. This is not a typical environment for Spark development, since most people use Ubuntu + IntelliJ.

I copied the well-known SparkPi example from the official Spark examples, but when I ran it in IDEA it popped up the error from the title: "Error:scalac: error while loading ..., Missing dependency bad symbolic reference..."
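
For context, the code is essentially the official SparkPi example. Below is a minimal sketch of it (the setMaster("local[*]") call is my own addition so it can run directly inside the IDE):

    import scala.math.random
    import org.apache.spark.{SparkConf, SparkContext}

    // Minimal sketch of the official SparkPi example (Spark 1.x API).
    object SparkPi {
      def main(args: Array[String]): Unit = {
        // setMaster("local[*]") is added here so the job can run inside IntelliJ.
        val conf = new SparkConf().setAppName("Spark Pi").setMaster("local[*]")
        val sc = new SparkContext(conf)
        val slices = if (args.length > 0) args(0).toInt else 2
        val n = 100000 * slices
        // Estimate Pi by sampling random points in the unit square.
        val count = sc.parallelize(1 to n, slices).map { _ =>
          val x = random * 2 - 1
          val y = random * 2 - 1
          if (x * x + y * y < 1) 1 else 0
        }.reduce(_ + _)
        println("Pi is roughly " + 4.0 * count / n)
        sc.stop()
      }
    }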

I did a lot of searching but had no luck. Finally I found out that the problem was caused by my pom.xml settings:

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.4.1</version>
        <scope>${myscope}</scope>
    </dependency>
The reason for doing this is that we don't want to package the spark-core jar into our final package, but when the dependency is specified as provided, the program cannot run locally. The details can be found in another post of mine:
IntelliJ "Provided" Scope Problem

The solution is:
1. Keep the ${myscope} value here.
2. Add a property called myscope to pom.xml:

    <properties>
        <myscope>compile</myscope>
    </properties>

3. On the Maven side (when packaging), keep the setting:

    clean install -Dmyscope=provided
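
Putting the three pieces together, the relevant pom.xml fragments look roughly like this (a sketch assembled from the snippets above; the rest of the project configuration is omitted):

    <properties>
        <!-- Default scope used when running inside IntelliJ -->
        <myscope>compile</myscope>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.4.1</version>
            <!-- Overridden to "provided" at package time via -Dmyscope=provided -->
            <scope>${myscope}</scope>
        </dependency>
    </dependencies>

With this setup IntelliJ resolves spark-core with compile scope by default, while mvn clean install -Dmyscope=provided overrides the property on the command line so the spark-core jar is not bundled into the final package.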

Now the Spark object can run successfully in the IDE and also be packaged as we expected.

If you think this article is useful, please click the ads on this page to help. Thank you very much.