user1751356

Reputation: 615

oracle base 64 decode to clob

I have base64-encoded data in an Oracle table, stored in a CLOB column.

create table tableA
(
id number, 
encoded_description clob);

insert into tableA 
(id, encoded_description) 
values
(1, 'Zm9sbG93IHVw');

insert into tableA 
(id, encoded_description) 
values
(2, 'dG8gbWFueSByZWQgZmxhZ3M=');

commit;

Table A output, which contains the base64-encoded data in the encoded_description field:

    ID  encoded_description
    1   Zm9sbG93IHVw
    2   dG8gbWFueSByZWQgZmxhZ3M=



create table tableB
(
id number, 
decoded_description clob);

Table B, expected output after conversion:

    ID  decoded_description
    1   <<Original Text>>
    2   <<Original Text>>

I want to load the data from tableA into tableB, decoding the base64 data in the CLOB field back to its original text form. How can I achieve this? Can I make use of any Oracle functions to perform this conversion? Please help.

Upvotes: 0

Views: 9920

Answers (1)

eaolson

Reputation: 15094

Assuming your data is large enough to actually need to be stored in a CLOB (i.e. it's several thousand characters), you'll need to do the following:

  1. Retrieve your CLOB from the table.
  2. Convert it to a BLOB using dbms_lob.converttoblob.
  3. Break the BLOB into parts using dbms_lob.substr, which returns a RAW. Each part must be less than the SQL RAW size limit of 2000 bytes and a multiple of 4 bytes (why 4? see: Is it possible to base64-encode a file in chunks?)
  4. Use utl_encode.base64_decode to base64-decode each chunk.
  5. Use utl_raw.cast_to_varchar2 to convert the decoded chunk back to a VARCHAR2.
  6. Concatenate all the decoded chunks into a CLOB.

Step #5 may lead to some character set problems if your encoded data is not in the expected character set, but I'm not entirely clear how that conversion works.

Obviously, you want to write a function for this so you never have to do it more than once.
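Here's a sketch of such a function. For simplicity it works chunk-by-chunk on the CLOB directly (dbms_lob.substr on a CLOB returns a VARCHAR2, so the explicit BLOB conversion can be skipped), and it assumes the encoded text is single-byte and contains no embedded line breaks. The function name `decode_base64_clob` is my own; adapt as needed:

```sql
create or replace function decode_base64_clob(p_encoded in clob)
  return clob
is
  c_chunk constant pls_integer := 2000;  -- must be a multiple of 4
  l_len     pls_integer := dbms_lob.getlength(p_encoded);
  l_offset  pls_integer := 1;
  l_text    varchar2(2000);
  l_decoded clob;
begin
  dbms_lob.createtemporary(l_decoded, true);
  while l_offset <= l_len loop
    -- decode one chunk: varchar2 -> raw -> base64-decoded raw -> varchar2
    l_text := utl_raw.cast_to_varchar2(
                utl_encode.base64_decode(
                  utl_raw.cast_to_raw(
                    dbms_lob.substr(p_encoded, c_chunk, l_offset))));
    dbms_lob.writeappend(l_decoded, length(l_text), l_text);
    l_offset := l_offset + c_chunk;
  end loop;
  return l_decoded;
end;
/

-- Then load tableB in one pass:
insert into tableB (id, decoded_description)
select id, decode_base64_clob(encoded_description)
from   tableA;
```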

Upvotes: 0
