Here’s a quirky one for you. It can crop up when you are dealing with a partitioned table where the partition key is defined as RAW. To be honest, I really can’t think of a reason why you would ever want a table with a RAW partition key. (If you have one, please let me know in the comments.) Anyway, here’s the demo. I’ll start with a table using the cool automatic list partitioning facility and load a single row into it.


C:\>sql18

SQL*Plus: Release 18.0.0.0.0 - Production on Mon Apr 29 16:38:19 2019
Version 18.6.0.0.0

Copyright (c) 1982, 2018, Oracle.  All rights reserved.

Last Successful login time: Mon Apr 29 2019 16:37:30 +08:00

Connected to:
Oracle Database 18c Enterprise Edition Release 18.0.0.0.0 - Production
Version 18.6.0.0.0


SQL> create table test
  2  (
  3     id         raw(16),
  4     groupid    raw(16),
  5     chardata   varchar2(20),
  6     numbdata   number,
  7     datedata   date
  8  )
  9    partition by list (groupid) automatic (partition P_ values ('FF'))
 10  ;

Table created.

SQL> alter table test
  2  add constraint test_pk
  3  primary key (id)
  4  using index;

Table altered.

SQL> insert into test
  2  values (sys_guid(), sys_guid(), 'some char data', 123, sysdate - 3);

1 row created.

SQL> commit;

Commit complete.

SQL> exit
Disconnected from Oracle Database 18c Enterprise Edition Release 18.0.0.0.0 - Production
Version 18.6.0.0.0
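
As a quick aside, you can see what the automatic facility did behind the scenes by checking the dictionary. A minimal sketch of such a query (the system-generated partition name, SYS_P7389 on my database, will obviously differ on yours):


select partition_name, high_value
from   user_tab_partitions
where  table_name = 'TEST';


The HIGH_VALUE for the system-generated partition comes back as a HEXTORAW expression, which, as we are about to see, is exactly what DataPump embeds in the CREATE TABLE DDL it builds.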

No dramas there. Now I’ll take a DataPump export of that table using the defaults.


C:\>expdp mcdonac/*****@db18_pdb1 directory=TEMP dumpfile=test_exp.dmp tables=test

Export: Release 18.0.0.0.0 - Production on Mon Apr 29 16:38:19 2019
Version 18.6.0.0.0

Copyright (c) 1982, 2019, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 18c Enterprise Edition Release 18.0.0.0.0 - Production
Starting "MCDONAC"."SYS_EXPORT_TABLE_01":  mcdonac/********@db18_pdb1 directory=TEMP dumpfile=test_exp.dmp tables=test
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type TABLE_EXPORT/TABLE/STATISTICS/MARKER
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
. . exported "MCDONAC"."TEST":"P_"                           0 KB       0 rows
. . exported "MCDONAC"."TEST":"SYS_P7389"                6.804 KB       1 rows
Master table "MCDONAC"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for MCDONAC.SYS_EXPORT_TABLE_01 is:
  C:\TEMP\TEST_EXP.DMP
Job "MCDONAC"."SYS_EXPORT_TABLE_01" successfully completed at Mon Apr 29 16:38:40 2019 elapsed 0 00:00:16

Everything is still going OK. If I took a DataPump export, it’s a reasonable assumption that I’ll want to load it into a new database somewhere. To mimic that, I’ll simply drop the table I just created so that I can import it back into the same schema afresh.


C:\>sql18

SQL*Plus: Release 18.0.0.0.0 - Production on Mon Apr 29 16:38:42 2019
Version 18.6.0.0.0

Copyright (c) 1982, 2018, Oracle.  All rights reserved.

Last Successful login time: Mon Apr 29 2019 16:38:19 +08:00

Connected to:
Oracle Database 18c Enterprise Edition Release 18.0.0.0.0 - Production
Version 18.6.0.0.0



SQL> drop table test purge;

Table dropped.

SQL> exit
Disconnected from Oracle Database 18c Enterprise Edition Release 18.0.0.0.0 - Production
Version 18.6.0.0.0

And now we’ll run a simple DataPump import to put my table back in place.



C:\>impdp mcdonac/******@db18_pdb1 directory=TEMP dumpfile=test_exp.dmp tables=test

Import: Release 18.0.0.0.0 - Production on Mon Apr 29 16:38:43 2019
Version 18.6.0.0.0

Copyright (c) 1982, 2019, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 18c Enterprise Edition Release 18.0.0.0.0 - Production
Master table "MCDONAC"."SYS_IMPORT_TABLE_01" successfully loaded/unloaded
Starting "MCDONAC"."SYS_IMPORT_TABLE_01":  mcdonac/********@db18_pdb1 directory=TEMP dumpfile=test_exp.dmp tables=test
Processing object type TABLE_EXPORT/TABLE/TABLE
ORA-39083: Object type TABLE:"MCDONAC"."TEST" failed to create with error:
ORA-14308: partition bound element must be one of: string, datetime or interval literal, number, or NULL

Failing sql is:
CREATE TABLE "MCDONAC"."TEST"
  ("ID" RAW(16), "GROUPID" RAW(16), "CHARDATA" VARCHAR2(20 BYTE) COLLATE "USING_NLS_COMP",
   "NUMBDATA" NUMBER, "DATEDATA" DATE)
  DEFAULT COLLATION "USING_NLS_COMP" PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255
  STORAGE( BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT) TABLESPACE "USERS"
  PARTITION BY LIST ("GROUPID") AUTOMATIC
  (PARTITION "P_" VALUES ('FF') SEGMENT CREATION DEFERRED
     PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING
     STORAGE( BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT)
     TABLESPACE "USERS" READ WRITE,
   PARTITION "SYS_P7389" VALUES (HEXTORAW('38E50ADF7A7840149B167679433196C5')) SEGMENT CREATION IMMEDIATE
     PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING
     STORAGE(INITIAL 8388608 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0
     FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT)
     TABLESPACE "USERS" READ WRITE )

Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
ORA-39112: Dependent object type CONSTRAINT:"MCDONAC"."TEST_PK" skipped, base object type TABLE:"MCDONAC"."TEST" creation failed

Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type TABLE_EXPORT/TABLE/STATISTICS/MARKER
Job "MCDONAC"."SYS_IMPORT_TABLE_01" completed with 2 error(s) at Mon Apr 29 16:38:46 2019 elapsed 0 00:00:03

Uh oh… You can see from the failing SQL in the import log that DataPump sees the RAW datatype definition and hence tries to map the partition key values to the same datatype using HEXTORAW. It’s an interesting contradiction: you are allowed to partition a table by RAW, but if you try to nominate a RAW expression in the actual partition definition values, the database gets upset with you. Here’s the same DDL in simpler form showing the same error.



SQL> create table test
  2  (
  3     id         raw(16),
  4     groupid    raw(16),
  5     chardata   varchar2(20),
  6     numbdata   number,
  7     datedata   date
  8  )  partition by list (groupid) (partition P_ values (hextoraw('FF')));
)  partition by list (groupid) (partition P_ values (hextoraw('FF')))
                                                                   *
ERROR at line 8:
ORA-14308: partition bound element must be one of: string, datetime or interval literal, number, or NULL
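
As an aside, if you want to see the DDL that DataPump is going to attempt without sitting through a failed import, you can ask it to spool the statements to a script instead of executing them via the SQLFILE parameter. A sketch (the script name is just my choice):


impdp mcdonac/*****@db18_pdb1 directory=TEMP dumpfile=test_exp.dmp tables=test sqlfile=test_ddl.sql


You’ll find the same CREATE TABLE statement, HEXTORAW and all, written to TEST_DDL.SQL in the nominated directory.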

To be honest, I’m not losing any sleep over this, because I can’t think of a reason to partition by RAW. But if you do have a valid case for it, then you’ll need to pre-create the table (without the HEXTORAW expressions) and then use DataPump to load the data back into it.
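
Something along these lines should do the trick, although I’ll stress it’s a sketch rather than a tested recipe: pre-create the table with only the seed partition (the automatic facility will recreate the others as rows arrive), then ask DataPump to append the data to the existing table.


create table test
(
   id         raw(16),
   groupid    raw(16),
   chardata   varchar2(20),
   numbdata   number,
   datedata   date
)
partition by list (groupid) automatic (partition P_ values ('FF'));

alter table test
add constraint test_pk
primary key (id)
using index;


Then run the import, telling DataPump to keep the pre-created table and just load the rows:


impdp mcdonac/*****@db18_pdb1 directory=TEMP dumpfile=test_exp.dmp tables=test table_exists_action=append


Because the table is automatic list partitioned, inserting the rows will spawn whatever partitions are needed. The system-generated partition names may differ from the source, but the data will be intact.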

4 responses to “Raw partitions?”

  1. Possible use case for wanting to partition by raw: suppose you have a parent table with a GUID (raw) primary key, and a child table that foreign keys off the parent table. Both tables are extremely large. You have a query that joins the two tables using hash joins, but it runs extremely slowly because not all the records fit in memory, so you end up with one-pass or multi-pass hash joins. If you wanted to take advantage of partition-wise joins, you could end up partitioning both tables on the join keys with the same partitioning scheme.

    1. Agreed, but I’m not a huge fan of GUIDs for primary keys. Excessive space compared to an integer, and your indexes don’t end up as compact as they might be (if that’s an issue).

  2. […] issue was discovered by Connor McDonald about a month ago. In his blog post Raw Partitions?, he explains a test case with an Oracle 18.6 database. Connor discovered that DataPump import […]

  3. The problem seems to be fixed now. In Oracle 19.3, the DataPump import works:

    Oracle 19c: Auto-List Partitioning HIGH_VALUE Bugs fixed
