prateek

Reputation: 69

import csv file into table using SQL Loader [but large no. of Columns]

I want to import data from CSV files into a table [using Oracle SQL Developer]. I have about a hundred such files, and each has about 50 columns.

From the wiki of SQL*Loader (http://www.orafaq.com/wiki/SQL*Loader_FAQ)

 load data
 infile 'c:\data\mydata.csv'
 into table emp
 fields terminated by "," optionally enclosed by '"'          
 ( empno, empname, sal, deptno )  -- these are the column names

What I don't want to do is list all the column names. I just want all the entries in the CSV file to be assigned to the table's columns in the order in which they appear.

Moreover, I want to automate this for all 100 files.
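The automation part can be scripted: since each CSV already carries its column names in the header row, a control file can be generated per file instead of typed by hand. Below is a minimal sketch; the assumption that each table is named after its file (and the `c:/data` path) is mine, not from the question:

```python
import csv
import glob
import os

def make_ctl(csv_path, table_name):
    """Build a SQL*Loader control file whose column list is taken
    from the CSV's header row, so nothing is listed by hand."""
    with open(csv_path, newline="") as f:
        header = next(csv.reader(f))
    cols = ", ".join(header)
    return (
        "load data\n"
        f"infile '{csv_path}'\n"
        f"into table {table_name}\n"
        "fields terminated by \",\" optionally enclosed by '\"'\n"
        f"( {cols} )\n"
    )

# One control file per CSV; the table name is assumed to be the
# file name without its extension (adjust to your naming scheme).
for path in glob.glob("c:/data/*.csv"):
    table = os.path.splitext(os.path.basename(path))[0]
    ctl_path = path[:-4] + ".ctl"
    with open(ctl_path, "w") as out:
        out.write(make_ctl(path, table))
    # then invoke, e.g.:  sqlldr userid=scott/tiger control=<ctl_path>
```

Note this only works if the header names actually match the table's column names; otherwise the column list still has to be written out, as the first answer below explains.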

Upvotes: 3

Views: 12338

Answers (3)

iwita

Reputation: 91

You should write down the columns (and optionally their types) so that the values from your CSV file are assigned to the right columns. This is necessary because the script has no way of knowing the column order of the table in your Oracle database.
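If typing the column list for a hundred tables is the tedious part, the list can be pulled from the data dictionary instead of written by hand. A sketch, assuming Oracle 11g or later (for LISTAGG) and the EMP table from the question:

```sql
-- column names in positional order, ready to paste into the control file
select listagg(column_name, ', ') within group (order by column_id)
from   user_tab_columns
where  table_name = 'EMP';
```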

After you write the columns in the order they appear in your CSV files, you can point the script at all of your files by typing:

infile *.csv

Upvotes: 1

Kris Rice

Reputation: 3410

An alternative to sqlldr that does what you are looking for is the LOAD command in SQLcl. It simply matches the header row in the CSV to the table's columns and loads the data. However, it is not as performant and does not offer as much control as sqlldr.

LOAD [schema.]table_name[@db_link] file_name

Here's the full help for it.

sql klrice/klrice
...
KLRICE@xe>help load
LOAD
-----

Loads a comma separated value (csv) file into a table.
The first row of the file must be a header row.  The columns in the header row must match the columns defined on the table.

The columns must be delimited by a comma and may optionally be enclosed in double quotes.
Lines can be terminated with standard line terminators for windows, unix or mac.
File must be encoded UTF8.

The load is processed with 50 rows per batch.
If AUTOCOMMIT is set in SQLCL, a commit is done every 10 batches.
The load is terminated if more than 50 errors are found.

LOAD [schema.]table_name[@db_link] file_name
KLRICE@xe>

Example from a git repo I have at https://github.com/krisrice/maxmind-oracledb

SQL> drop table geo_lite_asn;

Table GEO_LITE_ASN dropped.

SQL> create table geo_lite_asn (
  2     "network" varchar2(32),
  3     "autonomous_system_number" number,
  4     "autonomous_system_organization" varchar2(200))
  5  /

Table GEO_LITE_ASN created.

SQL> load geo_lite_asn GeoLite2-ASN-CSV_20180130/GeoLite2-ASN-Blocks-IPv4.csv
--Number of rows processed: 397,040
--Number of rows in error: 0
0 - SUCCESS: Load processed without errors
SQL> 

Upvotes: 0

Ari

Reputation: 157

You can try Oracle CSV Loader. It automatically creates the table and the control file based on the CSV content, then loads the CSV into an Oracle table using SQL*Loader.

Upvotes: 0
