@Beta public interface SparkPluginContext extends BatchContext
| Modifier and Type | Method and Description |
|---|---|
| `void` | `setSparkConf(org.apache.spark.SparkConf sparkConf)` Sets a SparkConf to be used for the Spark execution. |
Methods inherited from parent interfaces:

- createDataset, datasetExists, getArguments, getRuntimeArguments, setRuntimeArgument
- discardDataset, getDataset, getDataset, getDataset, getDataset, releaseDataset
- getInputSchema, getInputSchemas, getLogicalStartTime, getMetrics, getNamespace, getOutputPortSchemas, getOutputSchema, getPipelineName, getPluginProperties, getPluginProperties, getStageName, loadPluginClass, newPluginInstance
- getServiceURL, getServiceURL
- provide

setSparkConf

`void setSparkConf(org.apache.spark.SparkConf sparkConf)`

Sets a SparkConf to be used for the Spark execution.
If your configuration will not change between pipeline runs,
use PipelineConfigurer.setPipelineProperties(java.util.Map<java.lang.String, java.lang.String>)
instead. Use this method only when each run needs different
configuration settings.
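A minimal sketch of setting per-run Spark configuration through this method. The stand-in `SparkConf` and `SparkPluginContext` types below are simplified stubs (assumed here only so the example compiles without Spark or CDAP on the classpath); the sizing heuristic and the `prepareRun` helper are illustrative, not part of the API.

```java
import java.util.HashMap;
import java.util.Map;

// Simplified stand-in for org.apache.spark.SparkConf (assumption: real class
// offers a chainable set(String, String) and a get(String) accessor).
class SparkConf {
    private final Map<String, String> settings = new HashMap<>();

    SparkConf set(String key, String value) {
        settings.put(key, value);
        return this;
    }

    String get(String key) {
        return settings.get(key);
    }
}

// Minimal stand-in exposing only the method documented on this page.
interface SparkPluginContext {
    void setSparkConf(SparkConf sparkConf);
}

public class SetSparkConfExample {
    // Hypothetical per-run setup: scale shuffle parallelism with this run's
    // input size, something that cannot be fixed once via pipeline properties.
    static void prepareRun(SparkPluginContext context, int inputSizeGb) {
        SparkConf conf = new SparkConf()
            // Illustrative heuristic: 4 partitions per GB, at least 8.
            .set("spark.sql.shuffle.partitions",
                 String.valueOf(Math.max(8, inputSizeGb * 4)))
            .set("spark.serializer",
                 "org.apache.spark.serializer.KryoSerializer");
        // In a realtime pipeline this call would throw
        // UnsupportedOperationException; in batch it applies to this run only.
        context.setSparkConf(conf);
    }

    public static void main(String[] args) {
        final SparkConf[] captured = new SparkConf[1];
        prepareRun(conf -> captured[0] = conf, 16);
        System.out.println(captured[0].get("spark.sql.shuffle.partitions")); // 64
    }
}
```

Because the configuration is computed inside run preparation, each pipeline run can receive different Spark settings, which is the scenario this method is intended for.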
Due to limitations in Spark Streaming, this method cannot be used
in realtime data pipelines; calling it in a realtime pipeline throws an
UnsupportedOperationException.

Copyright © 2017 Cask Data, Inc. Licensed under the Apache License, Version 2.0.