The docs state that the old classics “exp” and “imp” for export/import are not a supported means of migrating data into Autonomous.

But I know that a lot of people decide there’s a difference between “supported” and “can I crack on and see if it works” 🙂

So let’s give it a try

First I’ll export the SCOTT schema from my local database:


$ exp file=c:\tmp\scott_atp.dmp userid=scott/tiger@pdb21a

Export: Release 23.0.0.0.0 - Production on Mon May 12 20:00:48 2025
Version 23.7.0.25.01

Copyright (c) 1982, 2025, Oracle and/or its affiliates.  All rights reserved.


Connected to: Oracle Database 21c Enterprise Edition Release 21.0.0.0.0 - Production
Version 21.16.0.0.0
Export done in US7ASCII character set and AL16UTF16 NCHAR character set
server uses AL32UTF8 character set (possible charset conversion)
. exporting pre-schema procedural objects and actions
. exporting foreign function library names for user SCOTT
. exporting PUBLIC type synonyms
. exporting private type synonyms
. exporting object type definitions for user SCOTT
About to export SCOTT's objects ...
. exporting database links
. exporting sequence numbers
. exporting cluster definitions
. about to export SCOTT's tables via Conventional Path ...
. . exporting table                          BONUS          0 rows exported
. . exporting table                           DEPT          4 rows exported
. . exporting table                            EMP         14 rows exported
. . exporting table                           EMP2         11 rows exported
Export terminated successfully

And then try to import it into Autonomous:


$ imp userid=scott/******@atp_low file=c:\tmp\scott_atp.dmp 

Import: Release 23.0.0.0.0 - Production on Mon May 12 20:11:52 2025
Version 23.7.0.25.01

Copyright (c) 1982, 2025, Oracle and/or its affiliates.  All rights reserved.


Connected to: Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production
Version 19.27.0.1.0

Export file created by EXPORT:V23.00.00 via conventional path

import done in US7ASCII character set and AL16UTF16 NCHAR character set
import server uses AL32UTF8 character set (possible charset conversion)
IMP-00003: ORACLE error 1031 encountered
ORA-01031: insufficient privileges
IMP-00000: Import terminated unsuccessfully

That doesn’t look great, so let’s try it as the ADMIN user:


$ imp userid=admin/******@atp_low file=c:\tmp\scott_atp.dmp fromuser=scott touser=scott

Import: Release 23.0.0.0.0 - Production on Mon May 12 20:11:52 2025
Version 23.7.0.25.01

Copyright (c) 1982, 2025, Oracle and/or its affiliates.  All rights reserved.


Connected to: Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production
Version 19.27.0.1.0

Export file created by EXPORT:V23.00.00 via conventional path

Warning: the objects were exported by SCOTT, not by you

import done in US7ASCII character set and AL16UTF16 NCHAR character set
import server uses AL32UTF8 character set (possible charset conversion)
IMP-00003: ORACLE error 1031 encountered
ORA-01031: insufficient privileges
IMP-00000: Import terminated unsuccessfully

And there you have it. This simply isn’t going to work.

But c’mon folks… what are you doing using exp/imp anyway? 🙂

If you really, really, really needed to do this, simply import the dump into a standalone database first and then use Data Pump export/import to get the schema into Autonomous.
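For completeness, that supported route is only a couple of steps more. Here’s a sketch, assuming a standalone “stopover” database, the Data Pump dump file uploaded to OCI Object Storage, and a credential already created with DBMS_CLOUD.CREATE_CREDENTIAL (the credential name, region, namespace and bucket below are all placeholders):

$ imp userid=scott/tiger@stopover file=c:\tmp\scott_atp.dmp fromuser=scott touser=scott

$ expdp userid=scott/tiger@stopover schemas=scott directory=DATA_PUMP_DIR dumpfile=scott.dmp

$ impdp userid=admin/******@atp_low credential=MY_CRED dumpfile=https://objectstorage.&lt;region&gt;.oraclecloud.com/n/&lt;namespace&gt;/b/&lt;bucket&gt;/o/scott.dmp remap_tablespace=users:data

The remap_tablespace clause is there because Autonomous places user segments in its DATA tablespace rather than USERS.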

8 responses to “Old style export/import and Autonomous”

  1. Jan-Peter Meyer

    Hi Connor,

    the reason I miss exp/imp is that it worked from the client and did not require access to the database server’s file system.

    J.- P.

  2. aminaa49ed2e27e

    Because

    SQL> datapump export
    Initiating DATA PUMP
    Database Time Zone: VERSION:43 CON_ID:0
    Log Location: DATA_PUMP_DIR:ESQL_6870.LOG
    Additional Information: ORA-04036: PGA memory used by the instance or PDB exceeds PGA_AGGREGATE_LIMIT.
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3545
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 4168
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 4757
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 6245
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 7674
    ORA-06512: at line 48

    https://docs.oracle.com/error-help/db/ora-04036/

    PGA memory used by the instance or PDB exceeds PGA_AGGREGATE_LIMIT.
    *Cause: Private memory across the instance or pluggable database (PDB)
    exceeded the limit specified in the PGA_AGGREGATE_LIMIT parameter.
    If the limit was reached when allocating from private memory used
    by more than one Oracle process (MGA or DGA), that process was
    interrupted. Otherwise, the largest sessions using Program
    Global Area (PGA) memory were interrupted to get under the limit.
    *Action: Increase the PGA_AGGREGATE_LIMIT initialization parameter or
    reduce memory usage.

    This is what I have in the schema

    OBJECT_NAME OBJECT_TYPE

    STATS VIEW
    RUN_STATS TABLE
    RUNSTATS_PKG PACKAGE
    RUNSTATS_PKG PACKAGE BODY
    SK_S SEQUENCE
    TERMS TABLE
    SYS_IL0000071218C00010$$ INDEX
    SYS_LOB0000071218C00010$$ LOB
    TERM_DESCRIPTION_INDEX INDEX
    TERM_DESCRIPTION LOB
    H102_U INDEX
    H102_I1 INDEX


    NAME TYPE VALUE

    pga_aggregate_limit big integer 2G
    pga_aggregate_target big integer 512M
    NAME TYPE VALUE

    allow_group_access_to_sga boolean FALSE
    lock_sga boolean FALSE
    pre_page_sga boolean TRUE
    sga_max_size big integer 1536M
    sga_min_size big integer 0
    sga_target big integer 0

    And I am using these hidden parameters

    NAME TYPE VALUE

    _exadata_feature_on boolean TRUE
    _instance_recovery_bloom_filter_size integer 1048576
    _partition_large_extents string FALSE

    1. Which database version? I’m guessing from your SGA that it’s Database Free?
      (If so, drop sga_target / raise pga_aggregate_target for the duration of the export)

      1. aminaa49ed2e27e

        With these settings

        allow_group_access_to_sga boolean FALSE
        lock_sga boolean FALSE
        pre_page_sga boolean TRUE
        sga_max_size big integer 1536M
        sga_min_size big integer 0
        sga_target big integer 0 (changed from 1536M)

        pga_aggregate_limit big integer 2G
        pga_aggregate_target big integer 512M

        SQL> datapump export
        Initiating DATA PUMP
        Database Time Zone: VERSION:43 CON_ID:0
        Log Location: DATA_PUMP_DIR:ESQL_162.LOG
        Starting "GBMDBA"."ESQL_162":
        Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
        Processing object type SCHEMA_EXPORT/PACKAGE/PACKAGE_BODY
        Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
        Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
        Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA/LOGREP
        Processing object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE
        Processing object type SCHEMA_EXPORT/TABLE/TABLE
        Processing object type SCHEMA_EXPORT/TABLE/COMMENT
        Processing object type SCHEMA_EXPORT/PACKAGE/PACKAGE_SPEC
        Processing object type SCHEMA_EXPORT/PACKAGE/GRANT/OWNER_GRANT/OBJECT_GRANT
        Processing object type SCHEMA_EXPORT/PACKAGE/COMPILE_PACKAGE/PACKAGE_SPEC/ALTER_PACKAGE_SPEC
        Processing object type SCHEMA_EXPORT/VIEW/VIEW
        Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
        Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
        Processing object type SCHEMA_EXPORT/POST_SCHEMA/PROCACT_SCHEMA/LBAC_EXP
        . . exported "GBMDBA"."TERMS" 10.8 KB 30 rows
        Master table "GBMDBA"."ESQL_162" successfully loaded/unloaded

        Dump file set for GBMDBA.ESQL_162 is:
        G:\ORACLE\PRODUCT\23AI\ADMIN\FREE\DPDUMP\D339FC35970746BAA8ABD700DB8799F4\ESQL_162.DMP
        Job "GBMDBA"."ESQL_162" successfully completed at Mon May 12 13:08:00 2025 elapsed 0 00:00:30
        DataPump Operation Status 'COMPLETED'
        Jobname = ESQL_162

        Thanks

  3. abelabilgmailcom

    Try to export from the lower version (19c) and import into the higher version (21c)

    1. Apologies, you’re exactly right that I should have taken care of version parity.

      But you’ll get the same result (I repeated with 19.26 export and import)

  4. Paolo Paolucci

    Could it also be a matter of the privileges and roles the ADMIN user in ADB currently has? I mean, DATAPUMP_CLOUD_EXP/IMP only allows the use of Data Pump Export and Import (from release 18.3 on, especially), and that’s why the old exp and imp are not allowed or included as roles (differently from a non-ADB database?). That’s apart from the fact that traditional exp/imp are deprecated (or at least you need two hops, with a 12c database in the middle, before exporting to a 19c+ target, again non-ADB).
