I copied the famous SparkPi example from Spark's official examples, but when I ran it in IntelliJ IDEA it failed with an error.
I did a lot of searching but had no luck. Finally I found out that the problem was caused by my pom.xml settings.
The spark-core dependency is declared in pom.xml with a variable scope:

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.4.1</version>
        <scope>${myscope}</scope>
    </dependency>

The reason for doing this is that we don't want to package the spark-core jar into our final artifact, but when the scope is hard-coded as provided, the example cannot run locally. The details can be checked in another post of mine: IntelliJ "Provided" Scope Problem.
The solution is:
1. Keep ${myscope} as the scope value in the dependency shown above.
2. Add a property named myscope to pom.xml.
3. In the Maven properties section, give myscope the default value compile, so the example can run locally from IDEA (see the sketch below).
Then, when packaging, override the property on the command line:

    mvn clean install -Dmyscope=provided
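A minimal sketch of the property declaration, assuming the dependency above stays exactly as shown:

    <properties>
        <!-- Default scope: lets the example run from IDEA with spark-core on the classpath -->
        <myscope>compile</myscope>
    </properties>

With this default in place, running from IDEA uses the compile scope, while mvn clean install -Dmyscope=provided overrides it to provided so spark-core is left out of the final artifact.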
Now the Spark job runs successfully from IDEA, and the project also packages as we expected.
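As a quick check that the provided scope causes no problem once the jar is deployed, the packaged artifact can be submitted to a cluster, where Spark itself supplies the spark-core classes (the class and jar names below are just placeholders):

    spark-submit --class com.example.SparkPi --master spark://master:7077 target/myapp-1.0.jar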
If you found this article useful, please click the ads on this page to support it. Thank you very much.