public class DBSource extends ReferenceBatchSource<org.apache.hadoop.io.LongWritable,DBRecord,co.cask.cdap.api.data.format.StructuredRecord>
| Modifier and Type | Class and Description |
|---|---|
| static class | DBSource.DBSourceConfig: PluginConfig for DBSource |
| Constructor and Description |
|---|
| DBSource(DBSource.DBSourceConfig sourceConfig) |
| Modifier and Type | Method and Description |
|---|---|
| void | configurePipeline(co.cask.cdap.etl.api.PipelineConfigurer pipelineConfigurer) |
| void | destroy() |
| co.cask.cdap.api.data.schema.Schema | getSchema(co.cask.hydrator.plugin.db.batch.source.DBSource.GetSchemaRequest request, co.cask.cdap.api.plugin.EndpointPluginContext pluginContext): Endpoint method to get the output schema of a query. |
| void | initialize(co.cask.cdap.etl.api.batch.BatchRuntimeContext context) |
| void | prepareRun(co.cask.cdap.etl.api.batch.BatchSourceContext context) |
| void | transform(co.cask.cdap.api.dataset.lib.KeyValue<org.apache.hadoop.io.LongWritable,DBRecord> input, co.cask.cdap.etl.api.Emitter<co.cask.cdap.api.data.format.StructuredRecord> emitter) |
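The transform method summarized above receives each database row as a KeyValue pair and forwards the record through an Emitter. A minimal stand-in sketch of that contract, using simplified assumed types rather than the real co.cask.cdap classes:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-ins for the CDAP Emitter and KeyValue types,
// used only to illustrate the transform contract.
public class TransformSketch {
    interface Emitter<T> { void emit(T record); }

    static final class KeyValue<K, V> {
        final K key;
        final V value;
        KeyValue(K key, V value) { this.key = key; this.value = value; }
    }

    // Unwrap the KeyValue and emit its value, mirroring how a source's
    // transform step emits one output record per input record.
    static <K, V> void transform(KeyValue<K, V> input, Emitter<V> emitter) {
        emitter.emit(input.value);
    }

    public static void main(String[] args) {
        List<String> out = new ArrayList<>();
        transform(new KeyValue<>(1L, "row-1"), out::add);
        transform(new KeyValue<>(2L, "row-2"), out::add);
        System.out.println(out); // prints [row-1, row-2]
    }
}
```

In the real plugin, the value side is a DBRecord and the emitted type is a StructuredRecord; the key (a LongWritable row offset) is not part of the output.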
public DBSource(DBSource.DBSourceConfig sourceConfig)
public void configurePipeline(co.cask.cdap.etl.api.PipelineConfigurer pipelineConfigurer)
Specified by:
configurePipeline in interface co.cask.cdap.etl.api.PipelineConfigurable
Overrides:
configurePipeline in class ReferenceBatchSource<org.apache.hadoop.io.LongWritable,DBRecord,co.cask.cdap.api.data.format.StructuredRecord>

@Path(value="getSchema")
public co.cask.cdap.api.data.schema.Schema getSchema(co.cask.hydrator.plugin.db.batch.source.DBSource.GetSchemaRequest request,
co.cask.cdap.api.plugin.EndpointPluginContext pluginContext)
throws IllegalAccessException,
SQLException,
InstantiationException
Parameters:
request - GetSchemaRequest containing information required for connection and query to execute.
pluginContext - context to create plugins
Throws:
SQLException
InstantiationException
IllegalAccessException

public void prepareRun(co.cask.cdap.etl.api.batch.BatchSourceContext context)
                throws Exception
Overrides:
prepareRun in class co.cask.cdap.etl.api.batch.BatchConfigurable<co.cask.cdap.etl.api.batch.BatchSourceContext>
Throws:
Exception

public void initialize(co.cask.cdap.etl.api.batch.BatchRuntimeContext context)
throws Exception
public void transform(co.cask.cdap.api.dataset.lib.KeyValue<org.apache.hadoop.io.LongWritable,DBRecord> input,
                      co.cask.cdap.etl.api.Emitter<co.cask.cdap.api.data.format.StructuredRecord> emitter)
               throws Exception
Specified by:
transform in interface co.cask.cdap.etl.api.Transformation<co.cask.cdap.api.dataset.lib.KeyValue<org.apache.hadoop.io.LongWritable,DBRecord>,co.cask.cdap.api.data.format.StructuredRecord>
Overrides:
transform in class co.cask.cdap.etl.api.batch.BatchSource<org.apache.hadoop.io.LongWritable,DBRecord,co.cask.cdap.api.data.format.StructuredRecord>
Throws:
Exception

public void destroy()
Specified by:
destroy in interface co.cask.cdap.etl.api.Destroyable
Overrides:
destroy in class co.cask.cdap.etl.api.batch.BatchSource<org.apache.hadoop.io.LongWritable,DBRecord,co.cask.cdap.api.data.format.StructuredRecord>

Copyright © 2017 Cask Data, Inc. Licensed under the Apache License, Version 2.0.
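As a supplement to the getSchema endpoint documented above, which derives the output schema of a query, the sketch below shows one hypothetical way JDBC column types (as reported through java.sql.Types) could be mapped to schema type names. This is an assumption for illustration; the actual DBSource mapping may differ in its details.

```java
import java.sql.Types;

// Hypothetical JDBC-type to schema-type mapping, illustrating the kind of
// translation a getSchema endpoint performs on ResultSetMetaData.
public class SchemaTypeSketch {
    static String toSchemaType(int sqlType) {
        switch (sqlType) {
            case Types.SMALLINT:
            case Types.INTEGER: return "int";
            case Types.BIGINT:  return "long";
            case Types.REAL:    return "float";
            case Types.DOUBLE:  return "double";
            case Types.BOOLEAN: return "boolean";
            case Types.CHAR:
            case Types.VARCHAR: return "string";
            default:            return "string"; // conservative fallback (assumption)
        }
    }

    public static void main(String[] args) {
        System.out.println(toSchemaType(Types.BIGINT));  // prints long
        System.out.println(toSchemaType(Types.VARCHAR)); // prints string
    }
}
```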