
Newb needs help compiling and running spark

Question asked by kaffe_02 on Feb 23, 2008
Latest reply on Oct 8, 2008 by mudassir


Hello, I am a total newb to Spark. I've used Openfire before and have found it quite a wonderful product. I wanted to test out Spark to see if it could be used in my current situation; the environment it would be deployed in would be about half Windows and half Linux. I have tried to use the documentation here to compile and run Spark but haven't had any luck, so any help you could provide would be greatly appreciated.



I am using Ubuntu 7.10



java version "1.6.0_03"

Java(TM) SE Runtime Environment (build 1.6.0_03-b05)

Java HotSpot(TM) Client VM (build 1.6.0_03-b05, mixed mode, sharing)



Apache Ant version 1.7.0 compiled on August 29 2007



I have downloaded the source code from the SVN repository. I wanted to get it running stock before trying to modify anything.



I went to $SPARK_HOME/build/ and ran "ant release"; that worked fine, and I got a BUILD SUCCESSFUL message.



Not seeing a run target in that build.xml, I went to $SPARK_HOME/build/builder/build/, where I ran "ant jar" and got BUILD SUCCESSFUL.



Then I ran "ant run" but got BUILD FAILED:



/usr/local/src/spark/build/builder/build/build.xml:64: /usr/local/src/spark/build/spark/lib not found.
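For anyone skimming, the directory it can't find sits under my checkout root; this little snippet just strips the prefix to show the relative path ant is after (both absolute paths are copied verbatim from the error above, the breakdown is only my reading of it):

```shell
# Both absolute paths are taken verbatim from the BUILD FAILED message
CHECKOUT=/usr/local/src/spark
MISSING=/usr/local/src/spark/build/spark/lib
# Strip the checkout prefix to see the path relative to my checkout root
echo "${MISSING#"$CHECKOUT"/}"
```

So build.xml at build/builder/build/ is reaching over into the sibling build/spark/ tree, if I'm reading it right.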



What do I need to do to get the lib directory created? Am I going about this the right way? I couldn't find any documentation for building under Linux, so if there is any, please point me to it.
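I could presumably just create the directory by hand, something like the sketch below (it uses a temp dir so it's safe to paste; in my case the real path would be /usr/local/src/spark), but I doubt that's the right fix, since lib probably needs to be populated with jars by some build target rather than merely exist:

```shell
# Sketch of the by-hand workaround; whether this is the right fix is an
# open question (lib presumably needs jars in it, not just to exist).
SPARK_SRC="$(mktemp -d)"             # stand-in for /usr/local/src/spark
LIB_DIR="$SPARK_SRC/build/spark/lib"
mkdir -p "$LIB_DIR"                  # silence the "not found" error
[ -d "$LIB_DIR" ] && echo "lib directory present"
```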



Thanks in advance.