package importer
9.0.0-alpha+incompatible
Repository: https://github.com/pingcap/tidb.git
Documentation: pkg.go.dev

# Functions

ASTArgsFromImportPlan creates ASTArgs from an import plan.
ASTArgsFromPlan creates ASTArgs from a plan.
ASTArgsFromStmt creates ASTArgs from a statement.
CancelJob cancels an IMPORT INTO job.
CreateJob creates an IMPORT INTO job by inserting a record into the system table.
FailJob fails an IMPORT INTO job.
FinishJob tries to finish a running job with jobID, changing its status to finished and clearing its step.
FlushTableStats flushes the stats of the table.
GetActiveJobCnt returns the count of active import jobs.
GetAllViewableJobs gets all viewable jobs.
GetImportRootDir returns the root directory for import.
GetJob returns the job with the given id if the user has privilege.
GetMsgFromBRError gets the message from a BR error.
GetRegionSplitSizeKeys gets the region split size and keys from PD.
GetTargetNodeCPUCnt gets the CPU count of the target node where the IMPORT INTO job will be executed.
Job2Step tries to change the step of a running job with jobID.
NewFileChunkProcessor creates a new local sort chunk processor.
NewImportPlan creates a new IMPORT INTO plan.
NewIndexRouteWriter creates a new IndexRouteWriter.
NewLoadDataController creates a new controller.
NewPlanFromLoadDataPlan creates an import plan from LOAD DATA.
NewProgress creates a new Progress.
NewTableImporter creates a new table importer.
NewTableImporterForTest creates a new table importer for test.
NewTableKVEncoder creates a new tableKVEncoder.
PostProcess does the post-processing for the task.
ProcessChunk processes a chunk and writes KV pairs to dataEngine and indexEngine.
ProcessChunkWithWriter processes a chunk and writes KV pairs to dataWriter and indexWriter.
RebaseAllocatorBases rebases the allocator bases.
StartJob tries to start a pending job with jobID, changing its status to running and its step to the input step (see the lifecycle sketch after this list).
VerifyChecksum verifies the checksum of the table.
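
The job functions above manage an IMPORT INTO job's lifecycle in the system table: CreateJob inserts the record, StartJob and Job2Step move it through its steps, and FinishJob or FailJob terminates it. The sketch below only illustrates that call order; the parameter lists (session executor, step and summary arguments) are assumptions and may not match the real signatures.

```go
// Sketch only: the parameter lists below are assumptions used to show the
// intended call order, not the package's real signatures.
package importer

import (
	"context"

	"github.com/pingcap/tidb/pkg/util/sqlexec" // assumed import path
)

func driveJob(ctx context.Context, conn sqlexec.SQLExecutor, jobID int64) error {
	// Move the job from pending to running and set its initial step.
	if err := StartJob(ctx, conn, jobID, JobStepImporting); err != nil {
		return err
	}
	// Advance the running job to its next step, e.g. importing -> validating.
	if err := Job2Step(ctx, conn, jobID, "validating"); err != nil {
		return err
	}
	// On success, mark the job finished and clear its step; otherwise record
	// the failure. The summary argument here is a placeholder.
	if err := FinishJob(ctx, conn, jobID, "ok"); err != nil {
		_ = FailJob(ctx, conn, jobID, err.Error())
		return err
	}
	return nil
}
```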

# Constants

DataFormatCSV indicates that the data source file of IMPORT INTO is CSV.
DataFormatDelimitedData represents delimited data.
DataFormatParquet indicates that the data source file of IMPORT INTO is Parquet.
DataFormatSQL indicates that the data source file of IMPORT INTO is a mydumper-format DML file.
DataSourceTypeFile indicates that the data source of IMPORT INTO is a file.
DataSourceTypeQuery indicates that the data source of IMPORT INTO is a query.
50GiB.
JobStatusRunning is exported since it is used in SHOW IMPORT JOBS.
JobStepGlobalSorting is the first step when using global sort; the step goes from none -> global-sorting -> importing -> validating -> none.
JobStepImporting is the first step when using local sort; the step goes from none -> importing -> validating -> none (see the sketch after this list).
Constants for job status and step.
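
The two JobStep constants describe different step progressions depending on whether global sort is used. A minimal sketch of picking the initial step, assuming the constants are plain strings (their type is not shown on this page):

```go
// Sketch only: assumes the JobStep* constants are string values. Picks the
// first step of a new job following the documented progressions:
//   global sort: none -> global-sorting -> importing -> validating -> none
//   local sort:  none -> importing -> validating -> none
func firstJobStep(useGlobalSort bool) string {
	if useGlobalSort {
		return JobStepGlobalSorting
	}
	return JobStepImporting
}
```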

# Variables

CheckDiskQuotaInterval is the default time interval to check disk quota.
GetEtcdClient returns an etcd client.
LoadDataReadBlockSize is exposed for test.
96 KB (data + index).
MinDeliverRowCnt, see the default for tikv-importer.max-kv-pairs.
NewClientWithContext returns a kv.Client.
NewTiKVModeSwitcher is a variable so it can be mocked in tests (see the sketch after this list).
TestLastImportJobID is the last created job ID, used in unit tests.
TestSyncCh is used in unit tests to synchronize execution.
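
Several of these variables exist only so that tests can override them (CheckDiskQuotaInterval, NewTiKVModeSwitcher, TestSyncCh). The snippet below sketches the usual save/override/restore pattern, assuming CheckDiskQuotaInterval is a time.Duration; its type is not shown on this page.

```go
// Sketch only: assumes CheckDiskQuotaInterval is a time.Duration; the same
// save/override/restore pattern applies to the other test hooks, such as
// NewTiKVModeSwitcher and TestSyncCh.
package importer

import (
	"testing"
	"time"
)

func TestDiskQuotaCheckedFrequently(t *testing.T) {
	orig := CheckDiskQuotaInterval
	CheckDiskQuotaInterval = 10 * time.Millisecond // check much more often during the test
	t.Cleanup(func() { CheckDiskQuotaInterval = orig })

	// ... run an import that relies on the periodic disk-quota check ...
}
```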

# Structs

ASTArgs is the arguments for ast.LoadDataStmt.
FieldMapping indicates the relationship between an input field and a table column or user variable.
ImportParameters contains the parameters of the IMPORT INTO statement.
IndexRouteWriter is a writer for indexes when using global sort.
JobImportResult is the result of the job import.
JobInfo is the information of an IMPORT INTO job.
JobSummary is the summary info of an IMPORT INTO job.
LoadDataController is the load data controller.
LoadDataReaderInfo provides information for a data reader of LOAD DATA.
Plan describes the plan of LOAD DATA and IMPORT INTO.
Progress is the progress of the IMPORT INTO task.
QueryRow is a row from query result.
TableImporter is a table importer.
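
Several of these structs build on one another: a Plan plus ASTArgs is wrapped by a LoadDataController, which is then used to create a TableImporter. The sketch below only illustrates that relationship; the constructor arguments shown (table handle, job ID string) are assumptions and the real parameter lists may differ.

```go
// Sketch only: constructor arguments are assumptions used to show how Plan,
// ASTArgs, LoadDataController and TableImporter relate to each other.
package importer

import (
	"context"

	"github.com/pingcap/tidb/pkg/table" // assumed import path
)

func buildImporter(ctx context.Context, plan *Plan, args *ASTArgs, tbl table.Table) (*TableImporter, error) {
	// The controller pairs the logical Plan with the parsed statement arguments.
	ctl, err := NewLoadDataController(plan, tbl, args)
	if err != nil {
		return nil, err
	}
	// The table importer drives encoding rows and delivering KV pairs for one
	// table; the trailing job-ID argument is a placeholder.
	return NewTableImporter(ctx, ctl, "job-1")
}
```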

# Interfaces

ChunkProcessor is used to process a chunk of data, including encoding data to KV pairs and delivering them to local or global storage (see the sketch after this list).
KVEncoder encodes a row of data into a KV pair.
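
KVEncoder and ChunkProcessor together describe an encode-then-deliver pipeline: rows are encoded into KV pairs, which are then delivered to local or global storage. The self-contained sketch below mirrors that split with hypothetical local types; it is not the package's real interface definitions.

```go
// Sketch only: hypothetical types mirroring the encode-then-deliver split
// described above; they are not the package's real KVEncoder/ChunkProcessor.
type kvPair struct{ Key, Val []byte }

type rowEncoder interface {
	// Encode turns one source row into its data and index KV pairs.
	Encode(row []string, rowID int64) ([]kvPair, error)
}

// processChunk encodes every row of a chunk and hands the resulting pairs to
// a deliver function (standing in for a local or global-sort writer).
func processChunk(rows [][]string, enc rowEncoder, deliver func([]kvPair) error) error {
	for i, row := range rows {
		pairs, err := enc.Encode(row, int64(i))
		if err != nil {
			return err
		}
		if err := deliver(pairs); err != nil {
			return err
		}
	}
	return nil
}
```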

# Type aliases

DataSourceType indicates the data source type of IMPORT INTO.
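
DataSourceType distinguishes the two documented sources, file and query. Assuming its values can be compared against the two constants listed above, a caller might branch on it like this:

```go
// Sketch only: assumes DataSourceType values can be compared with the
// DataSourceTypeFile / DataSourceTypeQuery constants listed above.
func describeSource(t DataSourceType) string {
	switch t {
	case DataSourceTypeFile:
		return "data is read from source files"
	case DataSourceTypeQuery:
		return "data comes from the result of a query"
	default:
		return "unknown data source"
	}
}
```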