Datapump Import Parameters in Oracle 11g



Import parameters in Oracle Data Pump (a sample impdp command is shown after the list):
Parameter : Description
abort_step : Undocumented feature
access_method : Data access method; default is AUTOMATIC
attach : Attach to an existing job; no default
cluster : Start workers across the cluster; default is Y
content : Content to import; default is ALL
data_options : Import data layer options
current_edition : Applications edition to be used on the local database
directory : Default directory specification
dumper_directory : Directory for the stream dumper
dumpfile : Import dump file names; format is (file1, file2)
encryption_password : Encryption key to be used
estimate : Calculate size estimate; default is BLOCKS
exclude : Import exclude option; no default
flashback_scn : System change number to be used for flashback import; no default
flashback_time : Database time to be used for flashback import; no default
full : Indicates a full-mode import
help : Display descriptions of the import parameters; default is N
include : Import include option; no default
ip_address : IP address for the PL/SQL debugger
job_name : Name of the import job; no default
keep_master : Retain the job master table upon completion
logfile : Log import messages to the specified file
master_only : Import only the master table associated with this job
metrics : Enable/disable object metrics reporting
mp_enable : Enable/disable multi-processing for the current session
network_link : Network-mode import
nologfile : Do not create an import log file
package_load : Specify how to load PL/SQL objects
parallel : Degree of parallelism; default is 1
parallel_threshold : Degree of DML parallelism
parfile : Name of a file that contains parameter specifications
partition_options : Determine how partitions should be handled; default is NONE
query : Query used to select a subset of rows for a table
remap_data : Transform data in user tables
remap_datafile : Change the name of the source datafile
remap_schema : Remap source schema objects to a new schema
remap_table : Remap tables to a different name
remap_tablespace : Remap objects to a different tablespace
reuse_datafiles : Re-initialize existing datafiles (replaces DESTROY)
schemas : Schemas to import; format is (schema1, ..., schemaN)
service_name : Service name that the job will charge against
silent : Display information; default is NONE
skip_unusable_indexes : Skip indexes that are in the unusable state
source_edition : Applications edition to be used on the remote database
sqlfile : Write the appropriate SQL DDL to the specified file
status : Interval between status updates
streams_configuration : Import Streams configuration metadata
table_exists_action : Action taken if the table being imported already exists
tables : Tables to import; format is (table1, table2, ..., tableN)
tablespaces : Tablespaces to transport; format is (ts1, ..., tsN)
trace : Trace option; enables sql_trace and timed statistics; default is 0
transform : Metadata transforms
transportable : Use transportable data movement; default is NEVER
transport_datafiles : List of datafiles to be plugged into the target system
transport_tablespaces : Transportable tablespace option; default is N
transport_full_check : Verify that tablespaces to be used do not have dependencies
tts_closure_check : Enable/disable transportable containment check; default is Y
userid : User/password used to connect to Oracle; no default
version : Job version; COMPATIBLE is the default
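
To see how a few of these parameters fit together, here is a minimal schema-level import sketch. The directory object DATA_PUMP_DIR, the dump file hr_exp.dmp, the source schema HR and the target schema HR_TEST are assumptions for this example only, so substitute the values from your own environment:

impdp system/password directory=DATA_PUMP_DIR dumpfile=hr_exp.dmp logfile=hr_imp.log schemas=HR remap_schema=HR:HR_TEST table_exists_action=REPLACE parallel=4

The same options can be kept in a parameter file (hr_imp.par is a hypothetical name here) and passed with the parfile parameter, which keeps long commands readable and reusable:

directory=DATA_PUMP_DIR
dumpfile=hr_exp.dmp
logfile=hr_imp.log
schemas=HR
remap_schema=HR:HR_TEST
table_exists_action=REPLACE
parallel=4

impdp system/password parfile=hr_imp.par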

