Wednesday, September 14, 2016

How to run unit tests in Spark (Java)

Below is the test class I wrote to test the file copy function:

import java.util.Arrays;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hdfs.HdfsConfiguration;
import org.apache.log4j.Level;
import org.apache.log4j.LogManager;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.junit.After;
import org.junit.Assert;
import org.junit.Before;
import org.junit.BeforeClass;
import org.junit.Test;

public class SparkTestClass {

    private static final String TEST_OUTPUT = "unit_test_output";

    private Configuration conf;
    private FileSystem fs;
    private JavaSparkContext jcTest;
    private FunUtil funUtil = new FunUtil();

    @BeforeClass
    public static void setupTest() throws Exception {
        // Raise log verbosity for the driver class while testing.
        LogManager.getLogger(SparkDriver.class).setLevel(Level.DEBUG);
    }

    @Before
    public void setup() throws Exception {
        conf = new HdfsConfiguration();
        fs = FileSystem.get(conf);

        // Named sparkConf so it does not shadow the Hadoop Configuration field above.
        SparkConf sparkConf = new SparkConf()
                .setAppName("File_copy_Test")
                .setMaster("local");

        // Master and app name are already set on the SparkConf, so pass it directly.
        jcTest = new JavaSparkContext(sparkConf);
    }

    @After
    public void tearDown() throws Exception {
        // Delete the test output so the exists() precondition holds on reruns.
        if (fs != null) {
            fs.delete(new Path(TEST_OUTPUT), true);
        }
        if (jcTest != null) {
            jcTest.stop();
            jcTest = null;
        }
    }

    @Test
    public void testCopytoRaw() throws Exception {
        JavaRDD<String> inputTestRDD = jcTest.parallelize(
                Arrays.asList("test_data1", "test_data2", "test_data3", "test_data4"));

        Assert.assertFalse(fs.exists(new Path(TEST_OUTPUT)));
        funUtil.copyTextToRaw(inputTestRDD, TEST_OUTPUT);
        Assert.assertTrue(fs.exists(new Path(TEST_OUTPUT)));
    }
}
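
For reference, copyTextToRaw itself is not shown above. Here is a minimal sketch of what such a copy function might look like, assuming it simply persists the RDD's lines to the target path; the body below is my guess at its shape, not the post's actual implementation:

import org.apache.spark.api.java.JavaRDD;

public class FunUtil {
    // Hypothetical body: the real copyTextToRaw is not included in the post.
    // Assumes the function just writes the RDD's lines as text files at path.
    public void copyTextToRaw(JavaRDD<String> lines, String path) {
        lines.saveAsTextFile(path);
    }
}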

I run the job with spark-submit --class "driver_name" --master local "jar". But how do I run the test class locally or on a cluster? This test case exercises the file copy function.
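
Since the test sets setMaster("local"), Spark runs in-process inside the test JVM, so the test class does not need spark-submit at all: any JUnit runner works, for example mvn test in a Maven project or the IDE's built-in test runner. As a sketch, the class can also be launched programmatically with JUnit 4's JUnitCore (the TestRunner class name here is my own, not from the post):

import org.junit.runner.JUnitCore;
import org.junit.runner.Result;
import org.junit.runner.notification.Failure;

public class TestRunner {
    public static void main(String[] args) {
        // Runs SparkTestClass in this JVM; the "local" master needs no cluster.
        Result result = JUnitCore.runClasses(SparkTestClass.class);
        for (Failure failure : result.getFailures()) {
            System.out.println(failure.toString());
        }
        System.out.println(result.wasSuccessful() ? "All tests passed" : "Some tests failed");
    }
}

Running against a real cluster is a different exercise (an integration test rather than a unit test): for that, the assembled test jar would be submitted with spark-submit using a cluster master URL instead of local.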
