public class UploadJob extends TunnelJob
Nested classes/interfaces inherited from class TunnelJob: TunnelJob.FileFormat
Fields inherited from class TunnelJob: bucketName, nullValue, objectPrefix, partCols, tblSchema
| Constructor and Description |
|---|
| UploadJob(Queue queue, String dbName, String tableName) Deprecated. |
| UploadJob(Queue queue, String dbName, String tableName, Boolean isOverwrite) Creates an upload job to upload data to DLI. |
| UploadJob(Queue queue, String dbName, String tableName, PartitionSpec partitionSpec, Boolean isOverwrite) Creates an upload job to upload data to DLI. |
| UploadJob(Queue queue, String dbName, String tableName, String partitionSpec, Boolean isOverwrite) |
| Modifier and Type | Method and Description |
|---|---|
| void | beginCommit() |
| void | beginCommit(String transactionId) |
| Writer | createWriter() |
| JobStatus | getCommitStatus() Deprecated. |
| String | getInternalJobId() |
| long | getUploadResultCount() |
| String | getUploadStatus() |
| int | getWriterDataBlockSize() |
| void | resetCommitted() |
| void | setWriterDataBlockSize(int writerDataBlockSize) |
Methods inherited from class TunnelJob: newRow, prepare, prepare, validPartSpec
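Taken together, these methods imply the following upload lifecycle: construct the job, obtain a Writer with createWriter(), write the rows, call beginCommit(), and then check the outcome with getUploadStatus() and getUploadResultCount(). The sketch below follows that flow; how the Queue is obtained and how rows are written (the Writer API and the inherited newRow() are not described on this page) are assumptions.

```java
// Minimal lifecycle sketch using only methods listed on this page.
// Imports are omitted because the SDK package names are not shown here;
// obtaining the Queue and building rows are left as assumptions.
public static void uploadExample(Queue queue) throws DLIException {
    UploadJob job = new UploadJob(queue, "my_db", "my_table", Boolean.FALSE);

    Writer writer = job.createWriter();         // writer that feeds data into the job
    // ... write rows here; the Writer API (and newRow() inherited from TunnelJob)
    // is not documented on this page, so the row-writing calls are omitted.

    job.beginCommit();                          // start committing the uploaded data

    String status = job.getUploadStatus();      // current upload/commit status
    long rowCount = job.getUploadResultCount(); // number of uploaded rows
    System.out.println("status=" + status + ", rows=" + rowCount);
}
```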
@Deprecated public UploadJob(Queue queue, String dbName, String tableName) throws DLIException
Throws:
DLIException
public UploadJob(Queue queue, String dbName, String tableName, Boolean isOverwrite) throws DLIException
Creates an upload job to upload data to DLI. The overwrite behavior is the same as in Hive; see https://support.huaweicloud.com/sqlref-spark-dli/dli_08_0095.html.
Parameters:
queue - the queue on which the upload Spark job runs
dbName - the name of the database to upload data to
tableName - the name of the table to upload data to
isOverwrite - whether to overwrite the existing data when uploading
Throws:
DLIException - thrown if creating the upload job fails

public UploadJob(Queue queue, String dbName, String tableName, String partitionSpec, Boolean isOverwrite) throws DLIException
Throws:
DLIException
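For illustration, the isOverwrite flag appears to switch between appending to and replacing the table's existing data, mirroring Hive's INSERT INTO versus INSERT OVERWRITE; this reading of the flag is an assumption based on the linked Hive-behavior note, and the queue, database, and table names below are placeholders.

```java
// Presumably appends to the existing data (isOverwrite = false).
UploadJob appendJob = new UploadJob(queue, "sales_db", "orders", Boolean.FALSE);

// Presumably replaces the existing data, like Hive INSERT OVERWRITE (isOverwrite = true);
// see the linked DLI documentation for the exact overwrite semantics.
UploadJob overwriteJob = new UploadJob(queue, "sales_db", "orders", Boolean.TRUE);
```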
public UploadJob(Queue queue, String dbName, String tableName, PartitionSpec partitionSpec, Boolean isOverwrite) throws DLIException
Creates an upload job to upload data to DLI. The overwrite behavior is the same as in Hive; see https://support.huaweicloud.com/sqlref-spark-dli/dli_08_0095.html.
Parameters:
queue - the queue on which the upload Spark job runs
dbName - the name of the database to upload data to
tableName - the name of the table to upload data to
partitionSpec - the partition spec for a partitioned table
isOverwrite - whether to overwrite the existing data when uploading
Throws:
DLIException - thrown if creating the upload job fails
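As a hypothetical illustration of a partitioned upload, the sketch below uses the String partitionSpec overload; the partition spec format shown ("dt='2023-06-01'") is an assumption, since neither that format nor the PartitionSpec class is documented on this page.

```java
// Upload into a single partition of a partitioned table, overwriting its data.
// The "dt='2023-06-01'" partition spec format is an assumption, not confirmed here.
UploadJob partitionedJob =
        new UploadJob(queue, "sales_db", "orders_by_day", "dt='2023-06-01'", Boolean.TRUE);
```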
public int getWriterDataBlockSize()
public void setWriterDataBlockSize(int writerDataBlockSize)
public void resetCommitted()
public Writer createWriter() throws DLIException
Throws:
DLIException

public void beginCommit() throws DLIException
Throws:
DLIException

public void beginCommit(String transactionId) throws DLIException
Throws:
DLIException

public String getInternalJobId() throws DLIException
Throws:
DLIException

@Deprecated public JobStatus getCommitStatus() throws DLIException
Throws:
DLIException

public String getUploadStatus() throws DLIException
Throws:
DLIException

public long getUploadResultCount() throws DLIException
Throws:
DLIException
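beginCommit() reads as starting an asynchronous commit, so a caller would typically poll getUploadStatus() afterwards. The loop below is a sketch under the assumption that the status string eventually reaches a terminal value such as "FINISHED" or "FAILED"; neither those values nor the polling interval are specified on this page.

```java
// Poll the upload status after beginCommit(). The terminal status strings and
// the 2-second interval are assumptions, not documented on this page.
static void commitAndWait(UploadJob job) throws DLIException, InterruptedException {
    job.beginCommit();
    String status = job.getUploadStatus();
    while (!"FINISHED".equals(status) && !"FAILED".equals(status)) {
        Thread.sleep(2000);
        status = job.getUploadStatus();
    }
    System.out.println("final status=" + status
            + ", rows=" + job.getUploadResultCount());
}
```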