Commit found by message id
Tue, 3 Jun 2008 [ 08:38:16 miwi ]
databases/pgloader: Import CSV data and Large Objects to PostgreSQL
pgloader imports data from a flat file and inserts it into one or
more PostgreSQL database tables. It uses one flat file per database
table, and you can configure as many sections as you want, each one
associating a table name with a data file.
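Such a sectioned configuration might look like the sketch below. The option and section names here are illustrative assumptions in the INI style the description suggests, not verified against a particular pgloader release:

```ini
; hypothetical pgloader-style configuration -- option names are
; illustrative, not taken from an actual pgloader manual
[pgsql]
host = localhost
port = 5432
base = mydb
user = loader

[clients]                     ; one section per target table
table     = clients           ; destination table
filename  = clients.data      ; the flat file holding this table's rows
format    = csv
field_sep = ;
columns   = id, name, email   ; column order as found in the flat file
```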
Data are parsed and rewritten, then given to the PostgreSQL COPY
command. Parsing is necessary to deal with end-of-line and possible
trailing separator characters, and for column reordering: your flat
data file may not have the same column order as the database table.
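The reordering step can be sketched in a few lines of Python. This is a simplified illustration, not pgloader's actual code; the function name and column lists are hypothetical:

```python
import csv
import io

def reorder_rows(flat_file, file_columns, table_columns, sep=";"):
    """Reorder each row of a flat file from the file's column order to
    the table's column order, yielding COPY-ready tab-separated lines."""
    reader = csv.reader(flat_file, delimiter=sep)
    # map each table column to its position in the flat file
    index = [file_columns.index(col) for col in table_columns]
    for row in reader:
        yield "\t".join(row[i] for i in index)

# usage: the flat file stores (name, id) but the table expects (id, name)
data = io.StringIO("alice;1\nbob;2\n")
lines = list(reorder_rows(data, ["name", "id"], ["id", "name"]))
# lines == ["1\talice", "2\tbob"]
```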
pgloader is also able to load large object data into PostgreSQL; as
of now, only Informix UNLOAD data files are supported. This command
gives large object location information in the main data file;
pgloader parses it and adds the text or bytea content, properly
escaped, to the COPY data.
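The escaping a loader must perform can be sketched as follows. This is a simplified illustration of PostgreSQL COPY text-format escaping (backslash, tab, and newline are significant; other non-printable bytes are shown as octal escapes), not pgloader's own implementation:

```python
def copy_escape(value: bytes) -> str:
    """Escape raw bytes for the PostgreSQL COPY text format
    (simplified sketch): backslash, tab and newline must be
    escaped, other non-printable bytes get octal escapes."""
    out = []
    for b in value:
        if b == 0x5C:            # backslash itself
            out.append("\\\\")
        elif b == 0x09:          # tab is the COPY column separator
            out.append("\\t")
        elif b == 0x0A:          # newline is the COPY row separator
            out.append("\\n")
        elif 0x20 <= b < 0x7F:   # printable ASCII passes through
            out.append(chr(b))
        else:                    # everything else as an octal escape
            out.append("\\%03o" % b)
    return "".join(out)

# usage: a value containing a tab and a NUL byte
escaped = copy_escape(b"a\tb\x00")
# escaped == "a\\tb\\000", i.e. the text  a\tb\000
```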
pgloader issues some timing statistics every "commit_every" commits.
At the end of processing each section, a summary of overall
operations is issued: the number of rows copied and commits made,
the time taken in seconds, errors logged, and database errors.
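The commit cadence described above can be modeled as a simple batching loop. The function and counter names below are hypothetical, with the actual database calls stubbed out; it only illustrates the "commit every N rows, then summarize" behaviour:

```python
import time

def load_rows(rows, commit_every=1000):
    """Insert rows in batches, committing (and printing timing
    statistics) every `commit_every` rows -- a toy model of the
    behaviour described above, with database calls stubbed out."""
    start = time.monotonic()
    copied = commits = errors = 0
    for i, row in enumerate(rows, start=1):
        try:
            pass                 # stand-in for the actual COPY/INSERT
            copied += 1
        except Exception:
            errors += 1          # a real loader would log the bad row
        if i % commit_every == 0:
            commits += 1         # stand-in for connection.commit()
            print(f"committed {i} rows in {time.monotonic() - start:.2f}s")
    if copied % commit_every:
        commits += 1             # commit the final partial batch
    return {"rows": copied, "commits": commits, "errors": errors}

summary = load_rows(range(2500), commit_every=1000)
# summary == {"rows": 2500, "commits": 3, "errors": 0}
```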
Submitted by: Pierre-Emmanuel Andre
Number of ports [& non-ports] in this commit: 2
Showing files for just one port: databases/pgloader