Here you can find tips to debug and understand Spark behavior.
(Many thanks to Francisco for these wonderful docs!)
Error logs can be opened from Help > View Logs.
Generally, they are located in:
- the user home directory, for example: c:\Documents and Settings\%USER_NAME%\Spark\logs\
- the installation directory, for example: c:\Program Files\Spark\logs\
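You can also read the logs directly from a terminal without opening Spark. A minimal sketch, assuming the default per-user location; the exact log file names vary between Spark versions, so list the directory first:

```shell
# List the per-user Spark log directory (default location on Windows).
dir "%USERPROFILE%\Spark\logs"
# Print one of the log files found there. The file name below is an
# assumption; substitute whichever file the listing above shows.
type "%USERPROFILE%\Spark\logs\errors.log"
```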
Spark is based on the XMPP protocol, and every packet it sends to and receives from the server can be seen in the Smack Debug Window.
To open the debug window, press F12 or go to Help -> Show Traffic Window.
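The Smack debugger can also be switched on at startup through a JVM system property instead of pressing F12 each time. A sketch, assuming a default Windows install and that Spark is launched from its startup.jar (the class org.jivesoftware.launcher.Startup mentioned below lives there); smack.debugEnabled is Smack's standard debug switch:

```shell
# Launch Spark with the Smack debug window enabled from the start.
# Adjust the path to your Spark installation directory.
cd "c:\Program Files\Spark\lib"
java -Dsmack.debugEnabled=true -jar startup.jar
```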
Spark in Debug Mode
To run Spark in Debug Mode you should:
- Download the following file:
- Place it into %Spark Installation Path%\bin, for example: c:\Program Files\Spark\bin\
- Execute Spark from the command line: startup -debug
Now you are running Spark with a transcript window showing the Spark debug information.
In Windows, to give that window sensible dimensions, go to the window menu -> Properties -> Layout and set:
Buffer width: 120
Buffer height: 8000
Screen width: 120
Screen height: 30
Notice that you cannot run more than one Spark instance with this script unless you change address=8000 to, for example, address=8001.
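The effect of the debug script can be sketched as a plain java launch with a JDWP (remote debugging) agent. This is an assumption about what the downloaded script does, based on the address=8000 setting mentioned above:

```shell
# Sketch: start Spark with a JDWP agent listening on port 8000.
# suspend=n lets Spark start normally without waiting for a debugger;
# change address (e.g. to 8001) to run a second instance side by side.
cd "c:\Program Files\Spark\lib"
java -agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=8000 -jar startup.jar
```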
Debug from source code
To debug Spark or Smack by adding breakpoints to the Java source code, you need:
- Spark running in debug mode.
- The Spark client and the Java source code synchronized, i.e. the same version.
- A Remote debug configuration in IntelliJ IDEA: go to Run -> Edit Configurations -> Add New Configuration -> Remote, then fill in Host (the host where the Spark client is running) and Port: 8000.
Add the necessary breakpoints, and when Spark executes that code the IntelliJ debugger will be activated.
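If IntelliJ IDEA is not at hand, the JDK's command-line debugger jdb can attach to the same JDWP port. A minimal sketch, assuming Spark is running locally in debug mode on port 8000:

```shell
# Attach jdb to the JDWP socket opened by the debug script.
jdb -connect com.sun.jdi.SocketAttach:hostname=localhost,port=8000
# Then, at the jdb prompt, for example:
#   stop in org.jivesoftware.launcher.Startup.main
#   resume
```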
Using jconsole you can dynamically connect to a running JVM and see what it is doing.
The jconsole executable is in JDK_HOME/bin, where JDK_HOME is the directory where the JDK is installed.
Run jconsole, then select org.jivesoftware.launcher.Startup and press Connect.
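From a terminal, you can also locate the Spark process with jps and point jconsole at it directly. A sketch, assuming a JDK's bin directory is on the PATH:

```shell
# List local JVMs with their main classes and process ids; Spark
# shows up as org.jivesoftware.launcher.Startup.
jps -l
# Connect jconsole to that process id directly, skipping the
# connection dialog. Replace <pid> with the id printed above.
jconsole <pid>
```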
Now we have everything ready to catch a bug.
Once a bug is detected, the objective is to write down a detailed procedure
to reproduce it. This usually means reproducing the bug several times,
refining the procedure, and adding the relevant information extracted
from the tools installed above.
Examples of Spark features
(some debugging examples coming soon)