Oracle GoldenGate Data Filtering and Transformation
Purpose
This note provides background and troubleshooting information applicable to Oracle GoldenGate data transformation. Its main purpose, as a master note, is to bring together reference material in a form that allows a clear understanding of what is required in the configuration, as well as where to look when there are issues. Its focus is the setup and understanding of the parameter settings required to integrate data between source and target tables, so that data arrives in the format required by the target tables and applications. It covers differences in table and column names, differences in table definitions, data transformation, and data differences. The samples are based on Oracle Database but apply to all databases for which Oracle GoldenGate is certified.
Scope
This document is intended for anyone with an interest in understanding data and column transformation and mapping. Samples are provided for reference for DBAs and application developers; these will likely require modification, since the examples presented are for illustration purposes. To gain better insight into how the examples work, please ensure you have read and understood the information below.
Whilst every care has been taken to ensure that the details are correct and accurate, code should be thoroughly tested to verify it is fit for its intended purpose.
Please note that the approach adopted in relation to this knowledge has been to improve Oracle’s product documentation. Thus, the relevant content on data transformation is now found in the primary reference guides for Oracle GoldenGate; where possible, please refer to the latest documentation library specific to your platform and database.
Details
In Oracle GoldenGate, conversion (or transformation) is the process of manipulating data into the format required by target applications. Mapping and transformation can be done by Extract, Replicat, or both. Column mapping and conversion can be performed on the source system, on the target system, or on an intermediary system.
For example, with complex filtering or data transformation configurations, you can configure a data pump to perform the first transformation on the source system, the target system, or an intermediary system, and then use another data pump or the Replicat group to perform the second transformation.
To filter data, you can use:
● A FILTER or WHERE clause in a TABLE statement (Extract) or in a MAP statement (Replicat).
● A SQL query or procedure
● User exits
To transform data, you can use:
● Native Oracle GoldenGate conversion functions
● A user exit from the Extract or Replicat process that applies rules from an external
transformation solution, then returns the manipulated data to Oracle GoldenGate.
● Replicat to deliver data directly to an ETL solution or other transformation engine.
In pass-through mode, enabled with the PASSTHRU parameter for the data pump, the source and target tables and structures must be identical, and no filtering, manipulation, or user exit processing can be performed on the data.
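For illustration, a minimal pass-through data pump parameter file might look as follows; the group, host, port, trail, and table names are assumptions for this sketch:

```
EXTRACT pmphr
PASSTHRU
RMTHOST tgthost, MGRPORT 7809
RMTTRAIL ./dirdat/rt
TABLE HR.*;
```

Note the absence of any COLMAP, FILTER, or WHERE options: in pass-through mode the pump simply forwards trail data unchanged.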
Mapping and Transforming Columns
Oracle GoldenGate provides for column mapping at the table level and at the global level. All data selection, mapping, and manipulation that Oracle GoldenGate performs is accomplished by using options of the TABLE and MAP parameters. TABLE is used in the Extract parameter file, and MAP is used in the Replicat parameter file.
Mapping between dissimilar databases
For source and target structures to be considered identical, they must contain identical column names (including case, if applicable) and data types, and the columns must be in the same order in each table.
Mapping and conversion between tables that have different data structures requires either a source-definitions file, a target-definitions file, or in some cases both. When used, this file must be specified with the SOURCEDEFS or TARGETDEFS parameters.
Colmap
Use the COLMAP option of the MAP and TABLE parameters to explicitly map source columns to target columns that have different names, or to specify default column mapping when source and target names are identical. COLMAP provides instructions for selecting, mapping, translating, and moving data from a source column into a target column. Within a COLMAP statement, you can employ any of the Oracle GoldenGate column-conversion functions to transform data for the mapped columns.
When using COLMAP, you might need to create a data-definitions file. When the tables have identical structures, and you are using COLMAP for other functions such as conversion, a source definitions file is not needed. You can use the ASSUMETARGETDEFS parameter instead.
When source and target tables are not identical in structure, you must generate data definitions for the source tables, transfer them to the target, and use the SOURCEDEFS parameter to identify the definitions file. The COLMAP option is available with the following parameters:
TABLE <table spec>, COLMAP ([USEDEFAULTS, ] <target column> = <source expression>);
MAP <table spec>, TARGET <table spec>, COLMAP ([USEDEFAULTS, ] <target column> = <source expression>);
SOURCEDEFS ./dirdef/reference.def
MAP "SRC.TCUSTORD", TARGET "TRG.TCUSTORD",
COLMAP (USEDEFAULTS,
CUST_CODE = @IF (@STREQ (@STRRTRIM (CUST_CODE), ""), CUST_CODE, @STRRTRIM (CUST_CODE)),
ORDER_DATE = @DATE ("YYYY-MM-DD", "YYYYMMDD", ORDER_DATE));
Use the Replicat ALLOWDUPTARGETMAP and NOALLOWDUPTARGETMAP parameters to control whether or not duplicate MAP statements for the same source and target objects are accepted in a parameter file. NOALLOWDUPTARGETMAP is the default.
For example, the following parameter file is permissible with ALLOWDUPTARGETMAP in place:
REPLICAT repcust
USERID GoldenUser, PASSWORD ****
SOURCEDEFS /ggs/dirdef/source.def
ALLOWDUPTARGETMAP
GETINSERTS
GETUPDATES
IGNOREDELETES
MAP ggs.tcustmer, TARGET ggs.tcustmer, COLMAP (USEDEFAULTS, deleted_row = "N");
IGNOREINSERTS
IGNOREUPDATES
GETDELETES
UPDATEDELETES
MAP ggs.tcustmer, TARGET ggs.tcustmer, COLMAP (USEDEFAULTS, deleted_row = "Y");
Use the Replicat ALLOWNOOPUPDATES and NOALLOWNOOPUPDATES parameters to control how Replicat responds to a “no-op” operation. A no-op operation is one that has no effect on the target table, for example when the source table has a column that does not exist in the target table, or a column that was excluded from replication (with a COLSEXCEPT clause). In either case, if that source column is updated, there is no target column name to use in the SET clause of the Replicat SQL statement.
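As a hedged sketch (group, user, and table names are assumptions), ALLOWNOOPUPDATES is placed before the MAP statements it should govern in the Replicat parameter file:

```
REPLICAT repemp
USERID ggsadmin, PASSWORD ****
ALLOWNOOPUPDATES
MAP HR.EMPLOYEES, TARGET HR.EMPLOYEES_SUBSET;
```

With this setting in place, Replicat can process such no-op updates instead of abending on them.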
Colmatch
To create global rules for column mapping across all tables in subsequent MAP statements, use the COLMATCH parameter.
With COLMATCH, you can map between similarly structured tables that have different column names for the same sets of data
Column names can be explicitly mapped:
COLMATCH NAMES CUST_CODE = CUST_ID
Additionally, you can configure a global column name prefix or suffix to be ignored.
The following example causes the suffix _SRC and the prefix SRC_ to be ignored during default column mapping, so that, for example, a column named COL1_SRC or SRC_COL1 is matched to a column named COL1.
The syntax, in the GLOBALS file or in the process parameter file, would be:
COLMATCH SUFFIX _SRC
COLMATCH PREFIX SRC_
The RESET keyword after COLMATCH is used to turn off the global mapping rules for subsequent tables defined in an Extract or Replicat parameter file.
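The following sketch (table names are assumptions for illustration) shows a global rule applied to one MAP statement and then turned off for subsequent ones:

```
COLMATCH NAMES CUST_CODE = CUST_ID
MAP SRC.ACCOUNTS, TARGET TRG.ACCOUNTS, COLMAP (USEDEFAULTS);
COLMATCH RESET
MAP SRC.ORDERS, TARGET TRG.ORDERS, COLMAP (USEDEFAULTS);
```

The first MAP maps source CUST_ID to target CUST_CODE by the global rule; after COLMATCH RESET, only default name matching applies.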
Warning: Never use the COLMAP option in a data pump’s configuration when using the PASSTHRU parameter. The concept of PASSTHRU does not allow any mapping or transformation.
Defgen Utility
The DEFGEN utility creates a data definitions file for the tables in your source or target database schema. It defines the column structure, datatypes, field length, and offset. Originally designed for heterogeneous environments, DEFGEN provides an easy method of map configuration between non-identical schemas. When used, this file must be specified with the SOURCEDEFS or TARGETDEFS parameters. Use the SOURCEDEFS parameter to identify the definitions file for Replicat on a target system or use the TARGETDEFS parameter to identify the definitions file for Extract or a data pump on a source system or intermediary system.
Configuring GoldenGate to use a data definitions file includes the following main steps:
• Creating a parameter file for the DEFGEN utility
• Running the DEFGEN utility to generate the file
• Configuring the GoldenGate process to reference the definitions file
An example procedure for creating a source definitions file is described next:
1. Log on to the database server (as the Oracle user).
2. Change directory to the GoldenGate home.
3. Run GGSCI.
4. Execute the following command to create a DEFGEN parameter file:
GGSCI> EDIT PARAMS DEFGEN
5. Enter the parameters:
DEFSFILE ./dirdef/scott.def
USERID ggsadmin, PASSWORD ggsadmin
TABLE GSSUSER.TCUSTMER;
TABLE GSSUSER.TCUSTORD;
6. Exit GGSCI:
GGSCI> EXIT
From the GoldenGate Home, run the DEFGEN utility on the Linux command line to create the scott.def file.
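Assuming the parameter file created with EDIT PARAMS DEFGEN was saved in its default location (./dirprm/defgen.prm), the invocation would be similar to:

```
cd <GoldenGate home>
./defgen paramfile ./dirprm/defgen.prm
```

The resulting ./dirdef/scott.def file is then copied to the target system’s dirdef directory and referenced with the SOURCEDEFS parameter in the Replicat parameter file.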
Filtering Columns
To control which columns of a source table are extracted by Oracle GoldenGate, use the COLS and COLSEXCEPT options of the TABLE parameter in the extract.
Use COLS to select columns for extraction, and use COLSEXCEPT to extract all columns except those listed.
Restricting the columns that are extracted can be useful when a target table does not contain the same columns as the source table, or when the columns contain sensitive information, such as a personal identification number or other proprietary business information.
The following parameter processes only the columns employee_id, email, and department_id:
TABLE hr.employees, COLS (employee_id, email, department_id);
The following parameter processes all columns except salary and commission_pct:
TABLE hr.employees, COLSEXCEPT (salary, commission_pct);
Do not exclude key columns, and do not use COLSEXCEPT to exclude columns that contain unsupported data types.
For Replicat, use COLMAP to specify the desired mapping.
You can use COLMAP to filter columns from operations only if the column is present in the trail. In the following sample, when the operation is an insert, all columns are mapped; when the operation is not an insert, only the specified columns are replicated. Two filters are used, one for non-insert operations and one for insert operations, and the ALLOWDUPTARGETMAP parameter is required:
ALLOWDUPTARGETMAP
MAP HR.EMPLOYEES, TARGET HR.AUDIT_EMP,
COLMAP (employee_id = employee_id, first_name = first_name, last_name = last_name, email = email),
FILTER (@STRCMP (@GETENV ("GGHEADER", "OPTYPE"), "INSERT") <> 0);
MAP HR.EMPLOYEES, TARGET HR.AUDIT_EMP,
FILTER (@STRCMP (@GETENV ("GGHEADER", "OPTYPE"), "INSERT") = 0);
Filtering Partitions in Extract
The ALTID option is available only in an Extract MAP statement, to allow Extract to capture tables from a foreign archived log. In Oracle, ALTID allows capture of specific partitions or sub-partitions of a high-volume load table so that the load can be distributed across more than one Extract.
When ALTID is specified, the map is resolved as IMMEDIATE regardless of the WILDCARDRESOLVE setting. This means users may experience a lag in Extract start time, especially when a table has a large number of partitions.
Wildcard table name is not allowed when ALTID is specified in the map parameter and DDL will not be supported with ALTID.
Example #1:
The following extract parameters substitute 77740 as the object ID of the source table, regardless of the object ID defined by the database:
WILDCARDRESOLVE IMMEDIATE
MAP SH.SALES, ALTID 77740;
Example #2:
The following extract parameters cause Extract to capture only partitions 77741 and 77742:
WILDCARDRESOLVE IMMEDIATE
MAP SH.SALES, ALTID 77741,77742;
Filtering DML Statements
By default, Oracle GoldenGate captures and applies insert, update, and delete operations.
You can use the following parameters in the Extract or Replicat parameter file to control which kind of operations are processed, such as only inserts or only inserts and updates.
IGNOREINSERTS | GETINSERTS
IGNOREUPDATES | GETUPDATES
IGNOREDELETES | GETDELETES
Each parameter remains in effect for all subsequent TABLE or MAP statements until its opposite parameter is encountered.
IGNOREINSERTS, IGNOREUPDATES, and IGNOREDELETES can be placed before a list of TABLE or MAP statements to filter out the corresponding DML records.
GETINSERTS, GETUPDATES, and GETDELETES are the defaults; they only need to be declared explicitly in the Extract or Replicat parameter file to reset a previous IGNORE setting.
Filtering Users
If you want to ignore all operations by a specific user, for Oracle databases you can use:
TRANLOGOPTIONS EXCLUDEUSER <user name>
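For example, in a bidirectional configuration you would typically exclude the user that Replicat uses to apply data, so that its operations are not captured back again. In this sketch, the group, trail, and user names are assumptions:

```
EXTRACT extora
USERID ggsadmin, PASSWORD ****
TRANLOGOPTIONS EXCLUDEUSER ggsrep
EXTTRAIL ./dirdat/lt
TABLE HR.*;
```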
Filtering DDL
When DDL replication is enabled, we can filter in a subset of DDL. In the following sample, we let through CREATE TABLE and ALTER TABLE ADD or MODIFY column statements, but filter out ADD CONSTRAINT and MODIFY CONSTRAINT statements:
DDL INCLUDE OPTYPE ALTER OBJTYPE 'table' INSTRWORDS 'modify' &
INCLUDE OPTYPE CREATE OBJTYPE 'table' &
INCLUDE OPTYPE ALTER OBJTYPE 'table' INSTRWORDS 'add' &
EXCLUDE INSTRWORDS 'constraint'
Filtering Rows
There are two data filtering options:
• Complex
• Non-complex
Non-complex filtering is achieved through the WHERE clause in a TABLE (Extract) or MAP (Replicat) statement, while complex data evaluations use the FILTER clause.
FILTER can select rows and columns for a given operation, whereas WHERE just selects rows. FILTER can also use GoldenGate built-in functions.
Where Clause
Configuring a WHERE clause is much like writing the WHERE clause of a SQL statement; in this case we add it at the end of the TABLE or MAP statement. The following example replicates only those records of the EMPLOYEES table that have a SALARY greater than 1000000:
MAP HR.EMPLOYEES, TARGET HR.EMPLOYEES, WHERE (SALARY > 1000000);
Additionally, you can use the functions @PRESENT, @ABSENT, and @NULL with the WHERE clause. These are the only GoldenGate functions compatible with the WHERE clause.
The @PRESENT and @ABSENT functions test for the existence of columns in a data record. The @NULL function tests for nulls in data only; when used in conjunction with <> (not equals), the test is for not null.
A WHERE clause cannot use arithmetic operators or refer to trail header and user token values, and floating-point datatypes are not supported.
The following example mapping succeeds if the SALARY column exists in the source data record and it is not null:
MAP HR.EMPLOYEES, TARGET HR.EMPLOYEES,WHERE (SALARY = @PRESENT AND SALARY <> @NULL);
Adding the @PRESENT function to the WHERE clause causes the record not to be discarded when SALARY is absent.
MAP HR.EMPLOYEES, TARGET HR.EMPLOYEES, WHERE (SALARY = @PRESENT AND SALARY > 1000);
Filter Clause
The FILTER clause adds extra value because it allows you to evaluate row data and column values for a given DML operation and table. You can use it in both Extract and Replicat. The maximum size of the filter clause is 5,000 bytes.
To reuse the example from the previous section: the following FILTER clause passes UPDATE and DELETE operations only, and only those records having SALARY greater than 1000.
You may prefer to set the filter in the Extract, selecting only updates and deletes for replication. The IGNORE keyword provides the inverse of ON.
TABLE HR.EMPLOYEES, FILTER (ON UPDATE, ON DELETE, SALARY > 1000);
FILTER can use GoldenGate functions to provide complex data comparison and evaluation.
The following example demonstrates the @STRFIND function that provides string comparison within the row data, selecting only records from the EMPLOYEES table having “JOHN” in the FIRST_NAME column:
TABLE HR.EMPLOYEES, FILTER (@STRFIND (FIRST_NAME, "JOHN") > 0);
We can also perform calculations for FILTER to evaluate using the @COMPUTE function.
The following example selects data records from the CREDITCARD_ACCOUNT table having a remaining credit balance of over 10,000:
TABLE SRC.CREDITCARD_ACCOUNTS, FILTER (@COMPUTE(CREDIT_LIMIT-CREDIT_BALANCE) > 10000);
Multiple FILTERs can be specified for a given TABLE or MAP entry; they are executed until one fails or until all have passed.
You can also use @COLTEST to test the status of a column. For example, the filter
TABLE HR.EMPLOYEES,
FILTER (@GETVAL (SALARY) > 1000);
could be written to test that the column is present:
FILTER (@COLTEST (@GETVAL (SALARY), PRESENT));
or to test that the column is missing or null:
FILTER (@COLTEST (@GETVAL (SALARY), MISSING, NULL));
The following example, for a COST table, assigns the value of UNIT_COST to the target column UNIT_PRICE unless the column UNIT_COST is missing or invalid, in which case Replicat assigns the value 0:
UNIT_PRICE = @IF (@COLTEST (UNIT_COST , MISSING, INVALID), 0, UNIT_COST)
The FILTER RAISEERROR option raises a user-defined error number if the filter fails:
MAP HR.EMPLOYEES, TARGET HR.EMPLOYEES, FILTER (ON UPDATE, ON DELETE, SALARY > 1000, RAISEERROR 9999);
Oracle GoldenGate does not support FILTER for columns that have a multi-byte character set or a character set that is incompatible with the character set of the local operating system.
@Range
All GoldenGate functions are expressed with the “@” prefix in parameter files; always type spaces between keywords as shown in the examples.
It is also possible to filter data on a range of values, such as date or number.
Additionally, the @RANGE function can be used to increase the throughput of large, heavily accessed tables, and to divide data into sets for distribution to different destinations, splitting the rows of any table across two or more Oracle GoldenGate processes. Specify each range in a FILTER clause of a TABLE or MAP statement. @RANGE is safe and scalable: it preserves data integrity by guaranteeing that the same row will always be processed by the same process group. This is similar in concept to Oracle’s hash algorithm for table partitioning.
In the following example, the replication workload is split into three ranges (between three Replicat processes) based on the PROD_ID column of the source SALES table:
(REPLICAT GROUP_1 PARAMETER FILE)
MAP SH.SALES, TARGET SH.SALES, FILTER ( @RANGE (1, 3, PROD_ID ));
(REPLICAT GROUP_2 PARAMETER FILE)
MAP SH.SALES, TARGET SH.SALES, FILTER ( @RANGE (2, 3,PROD_ID ));
(REPLICAT GROUP_3 PARAMETER FILE)
MAP SH.SALES, TARGET SH.SALES, FILTER ( @RANGE (3, 3, PROD_ID ));
When multiple filters are specified per TABLE or MAP statement, each one is executed in turn until one fails. The failure of any filter results in a failure for all filters.
@Case, @Eval, @If
Use the @IF function to return one of two values, based on a condition. You can use the @IF function with other Oracle GoldenGate functions to begin a conditional argument that tests for one or more exception conditions. You can direct processing based on the results of the test. You can nest @IF statements, if needed:
Example 1. The following returns an amount only if the AMT column is greater than zero; otherwise zero is returned.
AMOUNT_COL = @IF (AMT > 0, AMT, 0)
Example 2. The following returns WEST if the STATE column is CA, AZ, or NV; otherwise it returns EAST:
MAP GSS.TCUSTMER, TARGET GSS.TCUSTMER_REPORT,
COLMAP (USEDEFAULTS, COUNTRY_REGION = @IF (@VALONEOF (STATE, "CA", "AZ", "NV"), "WEST", "EAST"));
Example 3. If ORDER_DATE begins with “00”, map a forced date; otherwise map ORDER_DATE:
MAP GSS.TCUSTORD, TARGET GGS.TCUSTORD,
COLMAP ( USEDEFAULTS,
ORDER_DATE = @IF (@STRNCMP (ORDER_DATE, "00", 2) = 0, @DATE ("YYYY-MM-DD HH:MI:SS", "YYYY-MM-DD HH:MI:SS", "2000-06-01 12:00:00"), ORDER_DATE));
Example 4. The following returns the result of the PRICE column multiplied by the QUANTITY column if both columns are greater than 0. Otherwise, the @COLSTAT (NULL) function creates a NULL value in the target column.
ORDER_TOTAL = @IF (PRICE > 0 AND QUANTITY > 0, PRICE * QUANTITY, @COLSTAT (NULL))
Use the @EVAL function to select a value based on a series of independent tests. There is no limit to the number of conditions to specify.
Example 5. In the following example, if the AMOUNT column is greater than 10000, “high amount” is returned. If AMOUNT is greater than 5000 (and less than or equal to 10000), “somewhat high” is returned. A default value is also specified: a result of “lower” is returned if AMOUNT is less than or equal to 5000.
@EVAL (AMOUNT > 10000, "high amount", AMOUNT > 5000, "somewhat high", "lower")
Use the @CASE function to select a value depending on a series of value tests. There is no limit to the number of cases you can test with @CASE.
Example 6. The following returns “A car” if PRODUCT_ID is “CAR” and “A truck” if PRODUCT_ID is “TRUCK”. If PRODUCT_ID is neither “CAR” nor “TRUCK”, the function returns “A vehicle”.
@CASE (PRODUCT_ID, "CAR", "A car", "TRUCK", "A truck", "A vehicle")
Example 7. In the following sample we place a value into a given column based on the value of an existing column. This statement reads: if COL_B is 23, set COL_A to 14; else if COL_B is 24, set COL_A to 15; in all other cases, set COL_A to 16.
MAP HR.TABLE1, TARGET HR.TABLE1,
COLMAP (USEDEFAULTS, COL_A = @CASE(COL_B,23,14,24,15,16));
Filtering with macros
You can use macros to automate multiple uses of a parameter statement. For example, create a macro library file <OGG>/dirmac/macros.mac with the following content:
MACRO #filter
BEGIN
FILTER (ON UPDATE, ON DELETE, SALARY > 1000);
END;
In the Extract parameter file:
INCLUDE <OGG>/dirmac/macros.mac
TABLE HR.EMPLOYEES, #filter()
Data Transformation
You can manipulate source values into the appropriate format for target columns. GoldenGate functions enable you to manipulate numbers and characters, perform tests, extract parameter values, return environment information, and more.
Column Conversion Functions
All GoldenGate functions are expressed with the “@” prefix in parameter files.
Retrieve dates and times, perform computations on them, and convert them: @DATE, @DATEDIFF, @DATENOW
The @DATENOW function can be used to populate a MODIFIED_AT column on the target table. The default date format for GoldenGate, as generated by @DATENOW, is ‘YYYY-MM-DD HH:MI:SS’.
The @DATENOW function is the equivalent of the Oracle Database SYSDATE function.
MAP HR.EMPLOYEES, TARGET HR.EMPLOYEES_REPORT,
COLMAP (USEDEFAULTS,
MODIFIED_AT = @DATENOW());
To perform a conversion on a numeric string, use the @DATE function, choosing the relevant format options, which are similar to SQL. The following example uses @DATE to derive HIRE_DATE_FORMAT by converting the source date columns YEAR (format YY), MONTH (format MM), and DAY (format DD) to a target date with the format YYYY-MM-DD:
MAP HR.EMPLOYEES, TARGET HR.EMPLOYEES_REPORT,
COLMAP (USEDEFAULTS,
HIRE_DATE_FORMAT = @DATE ("YYYY-MM-DD", "YY", YEAR, "MM", MONTH, "DD", DAY));
Convert a binary or character string to a number: @NUMBIN, @NUMSTR
Do not use the @NUMBIN function to map tokens; these are integer, not binary, data. Use @NUMSTR to:
– Map a string (character) column to a number.
– Use a string column that contains only numbers in an arithmetic expression.
Convert a number to a string: @STRNUM
GoldenGate supports numeric conversions through these two functions. The @NUMSTR function converts a string to a number for arithmetic calculations. Conversely, @STRNUM converts a number to a string, with the additional option of padding characters. The following example converts the CREDIT_BALANCE value from the source table to a string, padded on the right with zeros to a length of five characters:
MAP HR.EMPLOYEES, TARGET HR.EMPLOYEES_REPORT,
COLMAP (USEDEFAULTS,
CREDIT_BALANCE = @STRNUM(CREDIT_BALANCE,RIGHTZERO, 5));
Compare strings: @STRCMP, @STRNCMP. These functions return:
–1 if the first string is less than the second.
0 if the strings are equal.
1 if the first string is greater than the second.
An example of an Extract parameter that filters on transaction type:
TABLE HR.EMPLOYEES, FILTER (@STRCMP (@GETENV ("GGHEADER", "OPTYPE"), "INSERT") = 0);
To filter only those records where NAME = "JOHN":
TABLE HR.EMPLOYEES, FILTER (ON INSERT, ON UPDATE, ON DELETE, @STRCMP (NAME, "JOHN") = 0);
Concatenate strings: @STRCAT, @STRNCAT
The @STRCAT function provides string concatenation by joining two or more separate strings together. In the following example, we concatenate the FIRST_NAME and SURNAME fields from the source table into the ENAME field on the target table:
MAP HR.EMPLOYEES, TARGET HR.EMPLOYEES_REPORT,
COLMAP (USEDEFAULTS, ENAME = @STRCAT(FIRST_NAME, " ", SURNAME));
Extract from a string: @STREXT, @STRFIND
In addition to string searching with @STRFIND, the @STREXT function provides string truncation and character extraction. Although the function will extract any characters from a string based on begin and end character positions, it also lends itself to truncating a string to a certain length, as shown in the following example.
The example uses @STREXT to extract portions of a string field into 3 different columns. It takes the first through the third characters from the source’s PHONE_NUMBER to populate the target’s AREA_CODE, characters 4 through 6 for the PHONE_PREFIX, and 7 to 10 for the PHONE_NUMBER:
MAP HR.EMPLOYEES, TARGET HR.EMPLOYEES_REPORT,
COLMAP ( USEDEFAULTS,
AREA_CODE = @STREXT (phone_number, 1, 3),
PHONE_PREFIX = @STREXT (phone_number, 4, 6),
PHONE_NUMBER = @STREXT (phone_number, 7, 10) );
Return the length of a string: @STRLEN
Substitute one string for another: @STRSUB
The @STRSUB function provides string substitution, allowing a pattern of characters to be replaced with a new string. The following example converts the TITLE field from the source table into an abbreviated form on the target:
MAP SRC.CREDITCARD_ACCOUNTS, TARGET TGT.CREDITCARD_ACCOUNTS,
COLMAP (USEDEFAULTS, TITLE = @STRSUB(TITLE, "DOCTOR", "DR", "MISTER", "MR"));
Case changing: @STRUP
The following example illustrates the function’s simplicity:
MAP SRC.CREDITCARD_ACCOUNTS, TARGET TGT.CREDITCARD_ACCOUNTS,
COLMAP (USEDEFAULTS, TITLE = @STRUP(TITLE));
Other samples
@COLSTAT – returns an indicator that a column is MISSING, NULL, or INVALID.
The following example returns a NULL into target column JOB_ID:
MAP HR.EMPLOYEES, TARGET HR.EMPLOYEES, COLMAP (USEDEFAULTS, job_id = @COLSTAT(NULL));
The following @IF calculation uses @COLSTAT to return NULL to the target column if PRICE and QUANTITY are less than zero; otherwise it returns the product of the two columns:
ORDER_TOTAL = @IF (PRICE < 0 AND QUANTITY < 0, @COLSTAT(NULL), PRICE * QUANTITY)
The following compares the source BANK column with NULL; if it is null, the target column is set to NULL, otherwise the target column gets the value of the source LOANUM:
LOANUM = @IF (@COLTEST (BANK, NULL), @COLSTAT(NULL), LOANUM)
DML Conversion Parameters
You can convert one type of SQL operation to another by using the following parameters in the Replicat parameter file:
– Use INSERTUPDATES to convert source update operations to inserts into the target table. This is useful for maintaining a transaction history on that table. The transaction log record must contain all of the column values of the table, not just changed values. Some databases do not log full row values to their transaction log, but only values that changed.
– Use INSERTDELETES to convert all source delete operations to inserts into the target table. This is useful for retaining a history of all records that were ever in the source database.
– Use UPDATEDELETES to convert source deletes to updates on the target.
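The parameters above can be combined with a COLMAP clause to build a history table. A hedged sketch follows; the group, table, and column names are assumptions for illustration:

```
REPLICAT rephist
USERID ggsadmin, PASSWORD ****
INSERTUPDATES
INSERTDELETES
MAP HR.EMPLOYEES, TARGET HR.EMPLOYEES_HIST,
COLMAP (USEDEFAULTS,
CHANGE_TYPE = @GETENV ("GGHEADER", "OPTYPE"),
CHANGE_TIME = @GETENV ("GGHEADER", "COMMITTIMESTAMP"));
```

Every update and delete arrives as a new row in EMPLOYEES_HIST, tagged with the operation type and commit timestamp from the trail record header.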
User Tokens
A user token is data that is captured and stored in the user token area of the GoldenGate record header, in the trail record. Tokens are typically used to store environment or system information, but the values could be anything, even values calculated from database column values. Token data can be retrieved and used in many ways to customize the way that GoldenGate delivers data.
To define a user token and associate it with data, use the TOKENS clause of the TABLE parameter in the Extract parameter file. It can be used to define a user token and associate it with GoldenGate environment data using the @GETENV function.
Syntax TABLE <table spec>, TOKENS (<token name> = <token data> [, …]) ;
There is a limit on the total space allocated for token keys and values stored in the trail. The token area in the record header permits up to 2,000 bytes of data; token names, the length of the data, and the data itself must fit into that space.
A best practice for naming tokens could be to call them the same name as the @getenv call, with a user-defined prefix such as “TKN”.
For example, for the commit sequence number (using “getenv” category “TRANSACTION” and key value “CSN”), the token would be called “TKN-TRANSACTION-CSN”.
TABLE SRC.AUDIT,
TOKENS (TKN-OSUSER = @GETENV ("GGENVIRONMENT", "OSUSERNAME"),
TKN-TRANSACTION-CSN = @GETENV ("TRANSACTION", "CSN"),
TKN-DBNAME = @GETENV ("DBENVIRONMENT", "DBNAME"),
TKN-HOSTNAME = @GETENV ("GGENVIRONMENT", "HOSTNAME"),
TKN-COMMITTIME = @GETENV ("GGHEADER", "COMMITTIMESTAMP"),
TKN-REC-FILESEQNO = @GETENV ("RECORD", "FILESEQNO"),
TKN-REC-FILERBA = @GETENV ("RECORD", "FILERBA"));
The @TOKEN function is used to retrieve token data that is stored in the user token area of the GoldenGate record header. To use the defined token data in target tables, use the @TOKEN column-conversion function in the COLMAP clause of a Replicat MAP statement. The @TOKEN function maps the name of a token to a target column. Syntax @TOKEN (“<token name>”)
The following MAP statement maps target columns OSUSER, TRANSACTION_CSN, and so forth to tokens TKN-OSUSER, TKN-TRANSACTION-CSN, and so forth.
MAP SRC.AUDIT, TARGET TRG.AUDIT,
COLMAP (USEDEFAULTS,
OSUSER = @TOKEN ("TKN-OSUSER"),
TRANSACTION_CSN = @TOKEN ("TKN-TRANSACTION-CSN"),
DBNAME = @TOKEN ("TKN-DBNAME"),
HOSTNAME = @TOKEN ("TKN-HOSTNAME"),
COMMITTIME = @TOKEN ("TKN-COMMITTIME"),
RECFILESEQNO = @TOKEN ("TKN-REC-FILESEQNO"),
RECFILERBA = @TOKEN ("TKN-REC-FILERBA"));
As an alternative, you can use @TOKEN within a SQLEXEC statement, a GoldenGate macro, or a user exit.
For more information about using tokens, see the GoldenGate for Windows and UNIX Administrator Guide.
@GETENV
We can use the @GETENV function to return information about the Oracle GoldenGate environment.
Syntax: @GETENV ("<information type>", "<key>"), for example @GETENV ("OSVARIABLE", "HOME")
The available information types include:
– General information types, eg: lag information, last replicated operation, including detailed error information.
– Table-level statistics information types (starting 11.2).
– Oracle GoldenGate information types, eg: GoldenGate environment, record header information.
– Database information types, eg: information about a source transaction.
– Operating system information type, eg: information about operating system environment variable.
– BASE24 information types.
This option is valid for Extract and Replicat. You can use the information as input into the following:
– Stored procedures or queries (with SQLEXEC)
– Column maps (with the COLMAP option of TABLE or MAP).
In the following replicat file we add a column in the target with the COMMITTIMESTAMP value:
MAP GGS.TCUSTORD, TARGET GGS.TCUSTORD,
COLMAP (USEDEFAULTS,
COMMIT_HEADER = @GETENV ("GGHEADER", "COMMITTIMESTAMP"));
– User tokens (defined with the TOKENS option of TABLE and mapped to target columns by means of the @TOKEN function)
– The GET_ENV_VALUE user exit function.
– Macros, for example a macro that returns the operation type from the record header (illustrative sketch):
MACRO #optype
BEGIN
@GETENV ("GGHEADER", "OPTYPE")
END;
See also: Oracle GoldenGate Windows and UNIX Reference Guide
The results are printed to the screen and to the report file. Use one @GETENV statement per variable to be retrieved. The function returns only an exact match; other variables that contain "HOME", such as ORACLE_HOME, are not returned.
Not all @getenv keys return meaningful values in all environments. For example, a key may return a value on Oracle on Unix but not MS SQL Server on Windows. Also, values on one platform may have slightly different semantics for a different OS or database. Some values returned by @getenv are always available from the trail (such as commit timestamps), and therefore do not need to be stored as tokens.
There is also a GETENV parameter (as opposed to the @GETENV function) that allows you to view environment variables that were set with the SETENV parameter. It is valid for Extract and Replicat.
User Exits
Oracle GoldenGate allows you to call user exits, which are user-developed callbacks written as C or C++ functions.
User exits can be called from either an Extract or a Replicat process via the CUSEREXIT parameter. You can employ user exits as an alternative to, or in conjunction with, the column-conversion functions. Within a user exit, data is processed once (when extracted) rather than twice (extracted and then read again to perform the transformation). Applications of user exits include: performing complex arithmetic operations or data transformations, accumulating totals and gathering statistics, and responding to events in custom ways, for example by sending an e-mail message based on a field value.
A user exit requires creating a shared object in C and a routine to be called from Extract or Replicat. The routine must accept the following parameters, which provide the communication between GoldenGate and your C program: EXIT_CALL_TYPE, EXIT_CALL_RESULT, EXIT_PARAMS.
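The calling convention can be pictured with a small simulation. The following Python sketch only illustrates the per-record callback idea; the real user exit is a C routine compiled into a shared object, and all names below are hypothetical:

```python
# Toy model of a user exit: the capture process invokes the routine with a
# call type, the routine inspects/transforms the record, and returns a result.
EXIT_CALL_PROCESS_RECORD = "process_record"   # stands in for EXIT_CALL_TYPE
EXIT_OK = "ok"                                # stands in for EXIT_CALL_RESULT

def my_user_exit(exit_call_type, record):
    """Called once per extracted record; transforms it in place."""
    if exit_call_type == EXIT_CALL_PROCESS_RECORD:
        # Example transformation: derive a total from two source columns.
        record["TOTAL"] = record.get("PRICE", 0) * record.get("QTY", 0)
    return EXIT_OK

rec = {"PRICE": 2, "QTY": 3}
my_user_exit(EXIT_CALL_PROCESS_RECORD, rec)
print(rec["TOTAL"])   # 6
```

Because the exit runs inline with extraction, the derived column is computed once, at capture time, rather than in a second pass.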
SQLEXEC
The SQLEXEC parameter of Oracle GoldenGate enables Extract and Replicat to communicate with the database to do the following:
– Execute a database command, stored procedure, or SQL query to perform a database function, return results (SELECT statements) or perform DML (INSERT, UPDATE, DELETE) operations.
– Retrieve output parameters from a procedure for input to a FILTER or COLMAP clause.
SQLEXEC is a parameter, not a command, and cannot be called directly from the GGSCI command prompt. SQL statements should be enclosed in double quotes with no terminating semicolon, and are executed in the order they appear after the DBLOGIN specification.
A query or procedure must be structured correctly when executing a SQLEXEC statement. If Replicat encounters a problem with the query or procedure, the process abends immediately, regardless of any error-handling rules that are in place. When the SQLEXEC parameter is used within a MAP statement, the procedure or query that is executed can accept input parameters from source or target rows and pass back output parameters.
SQLEXEC has an option to be executed before or after the filters (AFTERFILTER | BEFOREFILTER).
The following example finds a value from a lookup table in the Extract. The value is stored in the trail as a token, and the token value can be applied in the COLMAP clause on the Replicat side if desired.
TABLE scott.emp,
SQLEXEC (id lookup, query "select dname from dept where deptno = :p1_deptno",
PARAMS (p1_deptno = deptno) ),
TOKENS (dname_tk = @GETVAL(lookup.dname)) ;
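The same lookup can be simulated with an in-memory database. The following Python/sqlite3 sketch illustrates the idea under assumed table contents (the schema and helper name are hypothetical):

```python
import sqlite3

# In-memory stand-in for the DEPT lookup table used by the SQLEXEC query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dept (deptno INTEGER PRIMARY KEY, dname TEXT)")
conn.execute("INSERT INTO dept VALUES (10, 'ACCOUNTING')")

def lookup_dname(deptno):
    """Equivalent of the query with bind parameter :p1_deptno."""
    row = conn.execute("SELECT dname FROM dept WHERE deptno = ?",
                       (deptno,)).fetchone()
    return row[0] if row else None

# TOKENS (dname_tk = @GETVAL(lookup.dname)) stores the result as a token:
trail_tokens = {"dname_tk": lookup_dname(10)}
print(trail_tokens)   # {'dname_tk': 'ACCOUNTING'}
```

The key point is that the lookup happens at capture time, so the Replicat side only needs to read the token, not repeat the query.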
To execute a procedure within a TABLE or MAP statement:
Syntax: SQLEXEC (SPNAME <sp name>, [ID <logical name>,] {PARAMS <param spec> | NOPARAMS})
The following example maps data from the EMPLOYEES table to NEW_EMPLOYEES table. The procedure LOOKUP_DEPARTMENT is executed prior to executing the column map. This stored procedure has two parameters: an IN and an OUT. The IN parameter accepts a DEPARTMENT ID and is named CODE_IN_PARAM.
The value returned by the stored procedure’s OUT parameter is obtained by the @GETVAL function and used in the COLMAP statement to populate the new department_name column.
CREATE OR REPLACE PROCEDURE LOOKUP_DEPARTMENT (CODE_IN_PARAM IN NUMBER, CODE_OUT_PARAM OUT VARCHAR2) IS
BEGIN
SELECT department_name
INTO CODE_OUT_PARAM
FROM departments
WHERE department_id = CODE_IN_PARAM;
END;
/
The SPNAME Replicat parameter specifies the name of the PL/SQL stored procedure, while PARAMS specifies its parameters.
To pass values from a stored procedure or query as input to a FILTER or COLMAP clause, we must specify the stored procedure name followed by the OUT parameter name.
MAP HR.EMPLOYEES , TARGET HR.NEW_EMPLOYEES,
SQLEXEC (SPNAME LOOKUP_DEPARTMENT,
PARAMS (CODE_IN_PARAM = department_id)),
COLMAP (USEDEFAULTS,
DEPARTMENT_NAME = @GETVAL(LOOKUP_DEPARTMENT.CODE_OUT_PARAM));
We can perform the same lookup operation using a SQL query.
MAP HR.EMPLOYEES , TARGET HR.NEW_EMPLOYEES,
SQLEXEC (ID LOOKUP_DEPT,
QUERY "SELECT DEPARTMENT_NAME FROM DEPARTMENTS WHERE DEPARTMENT_ID = :CODE_IN_PARAM",
PARAMS (CODE_IN_PARAM = department_id )),
COLMAP (USEDEFAULTS,
DEPARTMENT_NAME = @GETVAL(LOOKUP_DEPT.DEPARTMENT_NAME));
The QUERY must be within quotes and must be contained all on one line. The limit for the PARAMS section in SQLEXEC is 1000 columns or 30000 bytes of data.
Auditing With GoldenGate
Oracle GoldenGate enables you to retain a history of changes made to a target record and to map information about the operation that caused each change. This history can be useful for creating a transaction-based reporting system that contains a separate record for every operation performed on a table, as opposed to containing only the most recent version of each record.
To convert one type of SQL operation to another the transaction log record must contain all of the column values of the table, not just changed values. Some databases do not log full row values to their transaction log, but only values that changed.
The ADD TRANDATA command can be used to enable table-level supplemental logging of key values and non-key columns (COLS option).
– Use INSERTUPDATES to convert source update operations to inserts into the target table. This is useful for maintaining a transaction history on that table.
– Use INSERTDELETES to convert all source delete operations to inserts into the target table. This is useful for retaining a history of all records that were ever in the source database.
– Use UPDATEDELETES to convert source deletes to updates on the target.
– Use INSERTALLRECORDS to insert a new record in the target table for every change operation made to the source table.
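The effect of these conversion parameters is that every source operation becomes an insert of a history row. A minimal Python sketch of the idea (column names are illustrative, not GoldenGate code):

```python
# Turn each change operation into its own history record instead of
# overwriting the previous version of the row (INSERTALLRECORDS-style).

def to_history_record(op_type, before_after, commit_ts, row):
    record = dict(row)                      # copy the source column values
    record.update({"OP_TYPE": op_type,      # insert/update/delete
                   "BEFORE_AFTER_IND": before_after,
                   "TRAN_TIME": commit_ts})
    return record

history = []
history.append(to_history_record("INSERT", "AFTER", "2012-05-04 11:59:26",
                                 {"ID": 1, "AMT": 10}))
history.append(to_history_record("UPDATE", "AFTER", "2012-05-04 12:00:01",
                                 {"ID": 1, "AMT": 20}))
# history now holds two point-in-time snapshots of row ID=1.
```

Both versions of the row survive, each stamped with the operation type and commit time, which is exactly what a transaction-based reporting table needs.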
Some of the data available in the Oracle GoldenGate record header can be used for mapping by using the GGHEADER option of the @GETENV function, or by using transaction elements as the source expression in a COLMAP statement in the TABLE or MAP parameter.
To make each record unique when using INSERTALLRECORDS in Replicat to create historical data, include the following tokens, populated with the @GETENV function, in the Extract TABLE clause:
TOKENS (TKN-REC-FILESEQNO = @GETENV ("RECORD", "FILESEQNO"),
TKN-REC-FILERBA = @GETENV ("RECORD", "FILERBA"));
See also: Oracle GoldenGate Windows and UNIX Reference Guide
Another option would be to use a PL/SQL procedure to insert any delete transactions into an audit table, e.g.: rather than having GoldenGate apply the data changes to the target database, it is possible to have SQLEXEC do this via INSERT, UPDATE, or DELETE commands.
The following Replicat example adds five columns to the target REPORT.SALES_HISTORY with special values, based on @GETENV and tokens, to build a history of sales data:
INSERTALLRECORDS
MAP SH.SALES, TARGET REPORT.SALES_HISTORY,
COLMAP (USEDEFAULTS,
TRAN_TIME = @GETENV ("GGHEADER", "COMMITTIMESTAMP"),
OP_TYPE = @GETENV ("GGHEADER", "OPTYPE"),
BEFORE_AFTER_IND = @GETENV ("GGHEADER", "BEFOREAFTERINDICATOR"),
REC-FILESEQNO = @TOKEN ("TKN-REC-FILESEQNO"),
REC-FILERBA = @TOKEN ("TKN-REC-FILERBA"));
INSERTALLRECORDS causes Replicat to insert every change operation made to a record as a new record in the database. The initial insert and subsequent updates and deletes are maintained as point-in-time snapshots.
COLMAP uses the @GETENV function to get historical data from the GoldenGate trail header: TRAN_TIME picks up the commit timestamp of the transaction; OP_TYPE stores whether it is an insert, update, or delete operation; and BEFORE_AFTER_IND indicates whether a "before" or "after" image is being stored. REC-FILESEQNO and REC-FILERBA are tokens that make each record unique.
The following example calls the AUDIT_PROC procedure, which inserts a record into an audit table, keeping a history of GoldenGate transactions. On successful execution, the DBOP keyword ensures the transaction is committed to the target database. When SQLEXEC is used in standalone mode to execute DML against the target database, the DBOP keyword is required for the changes to be committed; otherwise the transaction will roll back.
MAP SCHEMA.TABLENAME, TARGET SCHEMA.TABLENAME_AUDIT,
SQLEXEC (SPNAME AUDIT_PROC,
PARAMS (@GETENV ("GGENVIRONMENT", "HOSTNAME"),
@GETENV ("GGENVIRONMENT", "OSUSERNAME"),
@GETENV ("GGHEADER", "OPTYPE"),
@GETENV ("GGHEADER", "COMMITTIMESTAMP"),
@GETENV ("GGHEADER", "TABLENAME"),
@GETENV ("GGHEADER", "USERID"),
@GETENV ("GGHEADER", "TIMESTAMP")),
ALLPARAMS REQUIRED, ERROR REPORT, DBOP);
The ID clause of SQLEXEC must be used if this procedure is called for multiple tables; otherwise Replicat will abend with an error about duplicate PL/SQL procedure names.
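What such an audit procedure does can be sketched with sqlite3 as a stand-in target database. This is an illustrative simulation only; the table, column, and helper names are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE audit_log
                (hostname TEXT, osuser TEXT, optype TEXT, committime TEXT)""")

def audit_proc(hostname, osuser, optype, committime):
    """Insert one audit row per replicated operation, then commit."""
    conn.execute("INSERT INTO audit_log VALUES (?, ?, ?, ?)",
                 (hostname, osuser, optype, committime))
    conn.commit()   # the commit is what the DBOP keyword ensures in GoldenGate

audit_proc("srchost", "oracle", "INSERT", "2012-05-04 11:59:26")
print(conn.execute("SELECT COUNT(*) FROM audit_log").fetchone()[0])   # 1
```

Without the commit step, the inserted audit row would be rolled back, which mirrors the behavior described for SQLEXEC without DBOP.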
How to transform a single source record into multiple target records
The following Replicat parameters convert a single source record from SRC.EVENT_MSG into multiple target records in TRC.EVENT_MSG_MULTIPLE:
ALLOWDUPTARGETMAP
MAP SRC.EVENT_MSG, TARGET TRC.EVENT_MSG_MULTIPLE,
FILTER (@COLTEST (DFP_KEY_1, present)),
COLMAP (ID_STEP = @STRCAT (MSGUID, STEP_ID, STATUS_STEP),
KEY = DFP_KEY_1,
VALUE = DFP_VALUE_1);
MAP SRC.EVENT_MSG, TARGET TRC.EVENT_MSG_MULTIPLE,
FILTER (@COLTEST (DFP_KEY_2, present)),
COLMAP (ID_STEP = @STRCAT (MSGUID, STEP_ID, STATUS_STEP),
KEY = DFP_KEY_2,
VALUE = DFP_VALUE_2);
MAP SRC.EVENT_MSG, TARGET TRC.EVENT_MSG_MULTIPLE,
FILTER (@COLTEST (DFP_KEY_3, present)),
COLMAP (ID_STEP = @STRCAT (MSGUID, STEP_ID, STATUS_STEP),
KEY = DFP_KEY_3,
VALUE = DFP_VALUE_3);
MAP SRC.EVENT_MSG, TARGET TRC.EVENT_MSG_MULTIPLE,
FILTER (@COLTEST (DFP_KEY_4, present)),
COLMAP (ID_STEP = @STRCAT (MSGUID, STEP_ID, STATUS_STEP),
KEY = DFP_KEY_4,
VALUE = DFP_VALUE_4);
MAP SRC.EVENT_MSG, TARGET TRC.EVENT_MSG_MULTIPLE,
FILTER (@COLTEST (DFP_KEY_5, present)),
COLMAP (ID_STEP = @STRCAT (MSGUID, STEP_ID, STATUS_STEP),
KEY = DFP_KEY_5,
VALUE = DFP_VALUE_5);
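The repeated MAP statements above implement an unpivot: one wide source row produces up to five target rows, one per populated key/value pair. A Python sketch of the same logic, using the column names from the example (the helper function itself is hypothetical):

```python
# Unpivot one EVENT_MSG row into one target record per populated
# DFP_KEY_n/DFP_VALUE_n pair, mirroring the five MAP statements above.

def unpivot_event_msg(src, pairs=5):
    out = []
    for n in range(1, pairs + 1):
        key = src.get(f"DFP_KEY_{n}")
        if key is None:                    # FILTER (@COLTEST (DFP_KEY_n, present))
            continue
        out.append({
            # @STRCAT (MSGUID, STEP_ID, STATUS_STEP)
            "ID_STEP": src["MSGUID"] + src["STEP_ID"] + src["STATUS_STEP"],
            "KEY": key,
            "VALUE": src.get(f"DFP_VALUE_{n}"),
        })
    return out

row = {"MSGUID": "M1", "STEP_ID": "S1", "STATUS_STEP": "OK",
       "DFP_KEY_1": "a", "DFP_VALUE_1": "1",
       "DFP_KEY_3": "c", "DFP_VALUE_3": "3"}
print(unpivot_event_msg(row))   # two target records, for keys "a" and "c"
```

Each FILTER/@COLTEST pair corresponds to the `if key is None: continue` check: only pairs that are actually present in the source row generate a target record.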
11.2 Enhancements
Column Conversion Functions
A new @DDL function returns the name and owner of an object in a DDL record, the type of object that was affected, the operation type, and the first 200 characters of the DDL statement.
New STATS and DELTASTATS options of @GETENV return DDL and DML statistics for one or more tables.
This example returns statistics only for INSERT and UPDATE operations:
REPLICAT TEST
MAP TEST.ABC, TARGET TEST.ABC, COLMAP (USEDEFAULTS, IU = @COMPUTE (@GETENV ("STATS", "TABLE", "ABC", "DML") - @GETENV ("STATS", "TABLE", "ABC", "DELETE")));
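The @COMPUTE expression is simple arithmetic: total DML operations minus deletes leaves inserts plus updates. With hypothetical counts:

```python
# Hypothetical per-table statistics, as @GETENV ("STATS", ...) might report them.
stats = {"INSERT": 120, "UPDATE": 45, "DELETE": 15}

total_dml = sum(stats.values())        # @GETENV ("STATS", "TABLE", "ABC", "DML")
iu = total_dml - stats["DELETE"]       # the @COMPUTE subtraction above

print(iu)   # 165, i.e. inserts + updates only
```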
Character-Set conversion
GoldenGate is now case-aware and performs case-sensitive comparisons if the database is case-sensitive or if the database uses double quotes to enforce case-sensitivity. If the system or database is case-insensitive, Oracle GoldenGate converts case-insensitive names to the case in which they are stored when required for mapping purposes.
The Oracle GoldenGate Replicat process supports the conversion of data from one character set to another when the data is contained in character column types. Character-set conversion support is limited to column-to-column mapping as performed with the COLMAP or USEDEFAULTS clauses of a TABLE or MAP statement. It is not supported by the column-conversion functions, by SQLEXEC, or by the TOKENS feature.
Oracle GoldenGate supports text input and output in the default character set of the host operating system for the following: Console, command-line input and output, FORMATASCII, FORMATSQL, FORMATXML parameters, text files such as parameter files, data-definitions files, error log, process reports, discard files, and other human-readable files that are used by Oracle GoldenGate users to configure, run, and monitor the Oracle GoldenGate environment.
- Is there an impact on GoldenGate performance from enabling DDL replication?
When DDL replication is enabled, there is extra overhead. GoldenGate reads the DDL statements from the source archive logs, writes them to the source DDL tables under the GGS schema, builds new DDL statements, and applies them at the target database.
The impact is not quantified.
- How to skip orphan transactions in GoldenGate.
When a GoldenGate Extract is restarted, it starts scanning from the recovery checkpoint archive log.
For example, the following checkpoint detail shows that there is an open transaction in archive sequence 1013793:
GGSCI 2> info E_CBDS showch
EXTRACT E_CBDS Last Started 2012-05-03 03:07 Status RUNNING
Checkpoint Lag 00:00:00 (updated 00:00:03 ago)
Log Read Checkpoint Oracle Redo Logs
2012-05-04 11:59:26 Thread 1, Seqno 1013813, RBA 542140976
Current Checkpoint Detail:
Read Checkpoint #1
Oracle RAC Redo Log
Startup Checkpoint (starting position in the data source):
Thread #: 1
Sequence #: 1013526
RBA: 13032976
Timestamp: 2012-05-03 03:02:53.000000
SCN: 48.3433707770 (209592137978)
Redo File: /cbdspredo02/redo_1_02.dbf
Recovery Checkpoint (position of oldest unprocessed transaction in the data source):
Thread #: 1
Sequence #: 1013793
RBA: 529800208
Timestamp: 2012-05-04 09:41:20.000000
SCN: 48.3631197264 (209789627472)
Redo File: Not Available
* Check the open transactions in GoldenGate. The oldest transaction is 688.6.2315775:
GGSCI (kamino) 7> send E_CBDS showtrans
Sending SHOWTRANS request to EXTRACT E_CBDS …
Oldest redo log files necessary to restart Extract are:
Redo Thread 1, Redo Log Sequence Number 1013634, SCN 48.3548506190 (209706936398), RBA 482658832
————————————————————
XID: 688.6.2315775
Items: 62523375
Extract: E_CBDS
Redo Thread: 1
Start Time: 2012-05-03:22:25:21
SCN: 48.3548506190 (209706936398)
Redo Seq: 1013634
Redo RBA: 482658832
Status: Running
* Check all the open transactions in the database. The oldest transaction started at 05/04/12 09:35:06, so transaction XID 688.6.2315775 does not match any open transaction in the database:
SQL>SELECT s.inst_id,
s.sid,s.serial#,t.start_time,
s.status
FROM GV$session s, GV$transaction t, GV$rollstat r
WHERE s.saddr=t.ses_addr
and t.xidusn=r.usn
and s.inst_id=t.inst_id
and t.inst_id=r.inst_id
order by t.start_time;
1 10689 2949 05/04/12 09:35:06 INACTIVE
2 10657 2782 05/04/12 09:44:40 ACTIVE
2 10819 4957 05/04/12 09:49:20 INACTIVE
2 10054 65352 05/04/12 09:50:22 ACTIVE
2 10626 2361 05/04/12 09:55:16 ACTIVE
2 10624 5782 05/04/12 10:00:15 ACTIVE
* Skip this transaction. BUT BE CAREFUL: this introduces inconsistency on the target replication side. Before doing so, analyze transaction XID 688.6.2315775 in archive log sequence 1013634 with LogMiner (refer to the LogMiner documentation for details), and then replicate the transaction manually to the target database.
begin
sys.dbms_logmnr.add_logfile (logfilename => 'arch_1013634_1_461937508.arc', options => sys.dbms_logmnr.new);
sys.DBMS_LOGMNR.START_LOGMNR (options => sys.dbms_logmnr.DICT_FROM_ONLINE_CATALOG);
end;
/
create table serdar.logminer_table as select * from V$LOGMNR_CONTENTS;
-- XID: 688.6.2315775
select * from serdar.logminer_table where xidusn=688 and xidslt=6 and xidsqn=2315775;
* Now skip the transaction
GGSCI 1>send E_CBDS showtrans 688.6.2315775 file tran_688.6.2315775.dmp detail
GGSCI 2> SEND EXTRACT E_CBDS , SKIPTRANS 688.6.2315775 THREAD 1
Sending SKIPTRANS request to EXTRACT E_CBDS …
Are you sure want to skip transaction [XID 688.6.2315775, Redo Thread 1, Start Time 2012-05-03:22:25:21, SCN 48.3548506190 (209706936398)]? (y/n)y
Transaction [XID 688.6.2315775, Redo Thread 1, Start Time 2012-05-03:22:
- Now that GoldenGate is an Oracle product, what are the plans for adding the monitoring of GoldenGate into Oracle Enterprise Manager?
GoldenGate currently provides a graphical user interface named Oracle GoldenGate Director.
Oracle GoldenGate Director is a GUI that allows you to monitor GoldenGate instances deployed across the enterprise and also includes a built-in configuration wizard to assist with new configurations.
Oracle GoldenGate Monitor, which will be released shortly, is similar to Oracle GoldenGate Director but offers many more monitoring points. Oracle GoldenGate Monitor will not provide the configuration or management capabilities that Oracle GoldenGate Director currently supports.
The first release of Oracle GoldenGate Monitor will not be integrated with Oracle Enterprise Manager but there is a product roadmap to integrate Oracle GoldenGate Monitor into Oracle Enterprise Manager.