Jan 18, 2024

IntelliJ Editor NNBSP problem

 






Recently I started using IntelliJ for coding, but I ran into a strange problem, shown above: the editor displays special characters that look like garbage to me. How do I remove them?

There are lots of posts about this, but let's first try OpenAI and see whether ChatGPT knows the answer.

Below are the interesting dialogs between me and ChatGPT. Have fun!






But its answer is actually not correct; for example, in my editor settings, "Show whitespaces" is already unchecked.


【Solutions】

Go to IntelliJ Settings -> Editor -> Advanced Settings,

Uncheck "Render special characters, such as control codes, using their Unicode name abbreviations.



We can see the content is displayed correctly now.



PS: It has been quite a long time since my last post. How time flies! But better late than never; let's get back to the beautiful world of engineering!

Nov 9, 2016

Adobe Reader Error "adobe failed to connect to a dde server"

I ran into a problem where my Adobe Reader could not open PDF files and popped up the error: "adobe failed to connect to a dde server"

To solve this, simply open Control Panel -> Adobe Reader -> Modify, and use the Repair option to fix the problem.


Sep 7, 2015

Strange spark problem: "Error:scalac: error while loading ..., Missing dependency bad symbolic reference..."

I am very new to Spark and am trying to develop it on Windows using IntelliJ. This is not a typical environment for Spark development, since people normally use Ubuntu + IntelliJ.

I copied the famous SparkPi example from the official Spark examples, but when I ran it in IDEA, it popped up an error:

I did a lot of searching with no luck. Finally I found out that the problem was caused by my pom.xml settings.

 

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.4.1</version>
    <scope>${myscope}</scope>
</dependency>

The reason for doing this is that we don't want to include the spark-core jar file in our final package, but when it is specified as provided, it cannot run locally. The details can be checked in another post of mine:
IntelliJ "Provided" Scope Problem

The solution is:
1. Keep the ${myscope} value here.
2. Add a property called myscope to pom.xml, with compile as its default value:

<properties>
    <myscope>compile</myscope>
</properties>

3. In the Maven run configuration, keep the command line:
clean install -Dmyscope=provided
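Putting the pieces together, the relevant part of pom.xml looks roughly like this (a minimal sketch showing only the myscope property and the spark-core dependency from this post; everything else in the real pom.xml is omitted):

<!-- default scope used when running locally inside the IDE -->
<properties>
    <myscope>compile</myscope>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.4.1</version>
        <!-- overridden at package time via -Dmyscope=provided -->
        <scope>${myscope}</scope>
    </dependency>
</dependencies>

Running inside the IDE picks up the compile default, while the Maven task with clean install -Dmyscope=provided builds the deployable package without bundling spark-core.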

Now the Spark job can run successfully in the IDE and can also be packaged as we expected.

If you think this article is useful, please click the ads on this page to help. Thank you very much.

Aug 29, 2015

GitHub for Windows through Company Proxy

GitHub is a very famous tool, but I am just starting to use it... Shy... We come from the ancient times of source code management: ClearCase, SVN, (TeamForge); now it's time to embrace GitHub.

I downloaded the Windows version, GitHub for Windows... Maybe the best way is to use the command line, which I will investigate later.

This is the tool GitHub for Windows, which can be obtained here: https://git-scm.com/download/win



The problem is that we are behind a company proxy and there is no option in the tool to configure it. Luckily there is a solution.

1. Open the file
C:\Users\YOURNAME\.gitconfig

2. Add the following lines:
[http]
proxy = http://YOUR_COMPANY_PROXY:8080

[https]
proxy = http://YOUR_COMPANY_PROXY:8080

OK, all set!
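Alternatively, the same proxy settings can be written from the command line with git config; the --global flag makes it edit the same .gitconfig file in your home folder (the proxy host and port below are placeholders, just like above):

git config --global http.proxy http://YOUR_COMPANY_PROXY:8080
git config --global https.proxy http://YOUR_COMPANY_PROXY:8080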


But I feel using the command line is more convenient.

1. Clone

git clone REPOSITORY_URL

(for example, see the sketch after this list)

2. Update

1) cd into the cloned folder, for example Spark
2) git pull
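A hypothetical walkthrough for cloning and later updating the Apache Spark repository (the repository URL here is only an illustration, not necessarily the one I actually used):

git clone https://github.com/apache/spark.git   # creates a local folder named spark
cd spark                                        # go into the cloned folder
git pull                                        # fetch and merge the latest changes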


For a tutorial on Git, check out this great website; many thanks to the author!
http://rogerdudler.github.io/git-guide/

If you think this article is useful, please click the ads on this page to help. Thank you very much.

Aug 11, 2015

IntelliJ "Provided" Scope Problem

IntelliJ is such a famous tool that lots of users say once you get used to it, you can never go back. For an Eclipse user, there is some learning curve to IntelliJ. Today I met an interesting problem that exists only in IntelliJ.

The background is that I moved my project from Eclipse to IntelliJ. The project runs very well in Eclipse, but when I ran it the same way in IntelliJ (IDEA for short), it threw an error like the one below:


I searched a lot and finally found that it's because the storm package in pom.xml is specified as provided.


<dependency>
    <groupId>org.apache.storm</groupId>
    <artifactId>storm-core</artifactId>
    <version>0.10.0-beta1</version>
    <scope>provided</scope>
</dependency>


The reason the package must be specified as provided is that the storm cluster already contains the storm jar file; if our package contains the storm jar file again, it will cause conflicts.

But according to the definition of provided from the Maven website: https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html
  • provided
    This is much like compile, but indicates you expect the JDK or a container to provide the dependency at runtime. For example, when building a web application for the Java Enterprise Edition, you would set the dependency on the Servlet API and related Java EE APIs to scope provided because the web container provides those classes. This scope is only available on the compilation and test classpath, and is not transitive.
This is why there is no problem at compile time but there are problems at runtime.

Here we come to a dilemma: when running it in the IDE (IntelliJ) we need the scope to be compile, but when packaging it to deploy on the cluster, we need the scope to be provided. One way to solve this is to change the scope each time, but this is very annoying.

So here is the solution:
1. Set the scope to be a parameter, such as ${myscope}  

<dependency>
    <groupId>org.apache.storm</groupId>
    <artifactId>storm-core</artifactId>
    <version>0.10.0-beta1</version>
    <scope>${myscope}</scope>
</dependency>


2. Create a Maven task:
Expand Lifecycle, right-click an item, for example install, and choose "Create 'Practise...'" (this creates a Maven run configuration).


Then specify the command line: clean install -Dmyscope=provided
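One detail worth spelling out (a minimal sketch; the same myscope default appears in my Spark post above): give myscope a default value of compile in pom.xml, so that plain runs inside the IDE still resolve the scope without passing -Dmyscope:

<!-- default value for the scope parameter; the Maven task above overrides it with -Dmyscope=provided -->
<properties>
    <myscope>compile</myscope>
</properties>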



With this, it runs very well and can also be packaged with the provided scope!

If you think this article is useful, please click the ads on this page to help. Thank you very much.