Wednesday, October 26, 2011

Script Extract Update

I previously blogged about extracting the script from the repository in order to perform post-processing. That code works well, but I've since been driven to update the script for some specific version-over-version comparisons.
Recently I blogged about holding multiple repositories in the shared development database in order to compare components.
I decided to bring these two ideas together: why not run code metric calculations across versions to see how much technical debt we are accumulating in our systems?

I've updated the SQL, which now produces a much larger output (depending on how many repositories you have). To get the code out I couldn't use SSMS, so I used BCP:

bcp "exec SIEBEL_CODE_DUMP" queryout file.xml -w -S <> -T
I created a stored procedure called SIEBEL_CODE_DUMP that can be called from BCP, which uses the queryout parameter to dump the script to a file.
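For reference, here is a minimal sketch of the kind of procedure BCP can call this way; the FOR XML clause is what produces the single stream that queryout writes out. The script table and its column names are assumptions based on our schema (S_SERVICE_SCRPT holds business service scripts; the other script tables can be UNIONed in the same way):

-- Minimal sketch; the script table and its columns are assumptions.
CREATE PROCEDURE SIEBEL_CODE_DUMP
AS
BEGIN
    SET NOCOUNT ON;
    SELECT R.NAME   AS [repository],
           S.NAME   AS [script],
           S.SCRIPT AS [body]
    FROM S_SERVICE_SCRPT S
        INNER JOIN S_REPOSITORY R ON S.REPOSITORY_ID = R.ROW_ID
    ORDER BY R.NAME, S.NAME
    FOR XML PATH('script'), ROOT('scripts');
END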
We have 10 repositories in our database and the output file was over 20MB.
This is the schema:
Obviously, I now have a very rich model from which to build metrics. I've just scratched the surface, but I'm happy with what I'm getting. I'm thinking of doing dependency crawling, but that will involve some scripting and I have to see how much time it will take.
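As a taste of what the model enables, simple version-over-version metrics can be computed straight off the repository tables. A minimal sketch, under the same table/column assumptions as above (a line count over the script body would be a closer technical-debt proxy than a raw script count):

-- Business service script count per repository: a crude size metric
-- that can be tracked release over release.
SELECT R.NAME AS [repository], COUNT(*) AS [scripts]
FROM S_SERVICE_SCRPT S
    INNER JOIN S_REPOSITORY R ON S.REPOSITORY_ID = R.ROW_ID
GROUP BY R.NAME
ORDER BY R.NAME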

Thursday, October 6, 2011

Preparing for deployment

Migration to production is always a heart-wrenching affair. With some Siebel releases, with multiple developers coming and going, it gets worse.

We have attempted to keep a fairly strict deployment method, as we do deployments entirely by hand; our admins have decided not to use ADM, for better or worse. We're a multilingual installation and I think there were issues in getting the LOV migration to go well. In any case, ADM or not, managing deployment artifacts is always tricky, and the more time between releases, the graver the situation can get.

For the latest release we've been doing a significant overhaul of some processes that use workflow processes. Unfortunately, some cruft has crept into the repository, with multiple versions of the same process marked as Completed without being updated to Not In Use. On top of that, how do we tell the deployment team what to deploy?

We have a series of deployment artifacts that document what goes into each release (and is maintained in a version control system):
  • Bill of Materials.docx - a general document that holds a list of the specific files, images, etc. that must be installed, plus instructions
  • Dispatch Rule Sets.xlsx - a spreadsheet with dispatch rules
  • List Of Values.xlsx - a spreadsheet with the LOV changes (adds in green, updates in black, and deletions (deactivations) in red)
  • Siebel System Configuration.docx - the standard reference bible for our implementation. Includes the component configurations, workflow policies, repeating jobs, workflows, web services, etc. Basically, the reference guide to our environment
I was having trouble getting a good list of workflow processes that were added, updated and deactivated between this upcoming release and the last one, so I wrote some SQL that helped me come up with a proper list (and helped me clean up the WF repository).

The prerequisite to using these statements is to import the last release repository into the development database under a different name. We typically just increment the version number so we have these repositories to choose from:
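A quick way to see which repositories are loaded is to query S_REPOSITORY directly (using only columns the statements below already reference):

SELECT ROW_ID, NAME
FROM S_REPOSITORY
ORDER BY NAME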


EXPIRED WORKFLOWS BETWEEN REPOSITORIES
SELECT DISTINCT A.PROC_NAME
FROM S_WFR_PROC A
    INNER JOIN S_REPOSITORY B ON A.REPOSITORY_ID = B.ROW_ID
WHERE
    B.NAME = 'Siebel Repository 1.9'
    AND A.STATUS_CD = 'COMPLETED'
    AND A.PROC_NAME IN
    (
        SELECT DISTINCT PROC_NAME
        FROM S_WFR_PROC A
            INNER JOIN S_REPOSITORY B ON A.REPOSITORY_ID = B.ROW_ID
        WHERE
            B.NAME = 'Siebel Repository'
            AND NOT EXISTS
            (
            SELECT * FROM S_WFR_PROC A1
                INNER JOIN S_REPOSITORY B1 ON A1.REPOSITORY_ID = B1.ROW_ID
            WHERE
                B1.NAME = 'Siebel Repository'
                AND A1.PROC_NAME = A.PROC_NAME
                AND A1.STATUS_CD = 'COMPLETED'
            )   
    )

NEW AND UPDATED WORKFLOWS BETWEEN REPOSITORIES
SELECT A.PROC_NAME, A.VERSION [v.NEXT], C.VERSION [v.1.9]
FROM S_WFR_PROC A
    INNER JOIN S_REPOSITORY B ON A.REPOSITORY_ID = B.ROW_ID
    LEFT OUTER JOIN
    (
        -- GETS MOST CURRENT VERSION
        SELECT A.PROC_NAME, A.VERSION
        FROM S_WFR_PROC A
            INNER JOIN S_REPOSITORY B ON A.REPOSITORY_ID = B.ROW_ID
            INNER JOIN
                (
                    SELECT PROC_NAME, MAX(VERSION) [VERSION]
                    FROM S_WFR_PROC A1
                        INNER JOIN S_REPOSITORY B1 ON A1.REPOSITORY_ID = B1.ROW_ID
                    WHERE
                        B1.NAME = 'Siebel Repository 1.9'
                    GROUP BY A1.PROC_NAME
                ) C ON C.PROC_NAME = A.PROC_NAME AND C.VERSION = A.VERSION
        WHERE
            B.NAME = 'Siebel Repository 1.9'
            AND A.STATUS_CD = 'COMPLETED'
    ) C ON A.PROC_NAME = C.PROC_NAME
WHERE
    B.NAME = 'Siebel Repository'
    AND A.STATUS_CD = 'COMPLETED'
    AND A.PROC_NAME LIKE 'ABC%'
    AND (A.VERSION != C.VERSION OR C.VERSION IS NULL)
ORDER BY A.PROC_NAME

Friday, September 30, 2011

Search Criteria and index field order

I was working on some complex workflows the other day; after several days I completed the logic, ran the simulator a thousand times and was finally satisfied with the outcome.

Then I deployed it into test...performance was terrible.

I checked the indexes on the S_EVT_ACT table and found a custom index that matched the columns in the query. We had added a new column called "X_CHANNEL" to indicate how the activity was completed (email, fax, phone, etc). My query was looking for [Status] = 'Done' and [Channel] = 'Email' (with a sort on the started date).
Everything should have been running fine... I spent quite a while scratching my head, creating and dropping indexes on the table.

The existing (custom) index on the table was:

CREATE NONCLUSTERED INDEX [ABC_S_EVT_ACT_X9_X] ON [dbo].[S_EVT_ACT]
(
    [X_CHANNEL] ASC,
    [EVT_STAT_CD] ASC,
    [TODO_ACTL_START_DT] DESC
)WITH (PAD_INDEX  = OFF, STATISTICS_NORECOMPUTE  = OFF, SORT_IN_TEMPDB = OFF, IGNORE_DUP_KEY = OFF, DROP_EXISTING = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS  = ON, ALLOW_PAGE_LOCKS  = ON) ON [PRIMARY]
GO


I changed the order of the search expression to [Channel] = 'Email' and [Status] = 'Done', and the query took 62ms instead of 5 minutes (300,000ms). I had never put much thought into how the order of fields in a search expression lines up with the index until now. I'll be keeping that in mind from now on...
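For reference, this is roughly the query shape that custom index is built to serve, with the predicates lined up against the index column order (a sketch; the channel and status values are illustrative):

-- Leading index column (X_CHANNEL) first, then EVT_STAT_CD; the sort
-- is satisfied by the index's descending TODO_ACTL_START_DT column.
SELECT ROW_ID, TODO_ACTL_START_DT
FROM S_EVT_ACT
WHERE X_CHANNEL = 'Email'
    AND EVT_STAT_CD = 'Done'
ORDER BY TODO_ACTL_START_DT DESC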

Workflow runtime events

A lot of information is available in Bookshelf and online regarding workflows and runtime events, so I won't cover that again. I would like to share a diagram I created for a co-worker who is getting up to speed on Siebel workflows and run-time events.
I was trying to explain how the same workflow needs to be duplicated in order to capture the same run-time event but under different business objects.


Hope someone finds this helpful.

Monday, September 26, 2011

Drilldowns using Minibuttons on form applets

I was looking for a convenient way for users to navigate to the service request from an activity form applet. From a list applet a drill down is a very effective tool for users to navigate - the breadcrumb provides context and quick backwards navigation, but how can we use drill downs from a form applet?

This is the final product:



Obviously, using scripting we could call GoToView, but I wanted a code-free solution, so I used a "Named Method". On my form (which is based on the Action bc):

Create two user properties on the applet:
Name: CanInvokeMethod: GoToSR
Value: [SR Number] is not null

Name: Named Method: GoToSR
Value: 'INVOKE', 'DrillDown', '"Service Request"'

Add a new control:

Name: xxx
HTML Type: MiniButton
Method Invoked: GoToSR
Runtime: true
Field: Activity SR Id

The Field property deserves special attention because it's non-intuitive: why would I define a field for a button? The field on the button provides the context for the drilldown object:

Name: Service Request
Hyperlink Field: Activity SR Id
View: Service Request Detail View
Source Field: Activity SR Id
Business Component: Service Request
Destination Field: Id

If the field is left blank on the button, an error like the one below will appear:
The drilldown isn't defined or enabled on the button because it doesn't have a field.

Friday, September 23, 2011

Siebel monitoring with bubblegum (awk)

If your organization hasn't made a sizable investment in third-party Siebel monitoring tools (or even leveraged the impressive monitoring available from Oracle), you are putting at risk the business processes that depend on the background components that keep the trains running on time.

Our Siebel installation has several integrations based on asynchronous workflow processes and workflow policies. Both take work off of the main user-focused object managers to maintain responsiveness and insulate our users from outages in our integrated system.

That's great, except when the background processes in Siebel fail. Unless your users are very observant, it's easy for a workflow manager to go offline with no one noticing until business processes and customers are impacted.

Without access to the fantastic (and fantastically expensive) monitoring tools available, one is left waiting for alert emails from Siebel and/or periodic checking.

Rules of Thumb
  1. Don't depend on the system being monitored to tell you when it's down. If it's not available, it probably can't send email.
  2. Periodic checking is a waste of everyone's time. If your organization can spend money having someone check on components manually, you are lucky (unless you are the one doing the checking).
  3. Siebel has some funny definitions of "down", especially concerning workflow monitors (or background components).
I authored this mashup of srvrmgr.exe, DOS batch, blat, and awk for text parsing.
  1. srvrmgr.exe is the command-line window into Siebel. Basically everything that can be done on the Administration - Configuration and Administration - Management screens can be done with the command line (actually more...)
  2. DOS batch (for Windows) ties it together, piping the output from the srvrmgr tool into awk. I probably could have done everything in awk, but this got the job done
  3. blat is a command-line tool for sending email (very old school, but hey...)
  4. awk (gawk, actually) is a command-line scripter's text wrangler. I consider it Perl lite - not as capable, but easy to create text parsing scripts with
So...here are the scripts:

Monitor.bat
@echo off

set srvrmgr=E:\sea78\siebsrvr\BIN\srvrmgr.exe
set gateway=GATEWAY
set enterprise=SIEBEL
set query=list components for server SIEBELSRVR show CC_ALIAS, CC_NAME, CP_DISP_RUN_STATE, CP_END_TIME

REM REM REM REM REM REM REM REM REM REM REM REM REM REM

"%srvrmgr%" /g %gateway% /e %enterprise% /u SADMIN /p Sadm!n!! /c "%query%" /b | gawk.exe -W re-interval -f parse_siebel.awk

parse_siebel.awk
BEGIN {


#
# Comma separated list of component aliases to monitor
#
components = "AsgnSrvr,WorkMonActivity,WorkMonAsset,WorkMonT4Hist,MailMgr"

#
# Comma separated list of email addresses (no spaces please)
#
emailto = "SOMEONE@SOMEWHERE.COM"
#
# (Fictional) Email address that message is from
#
emailfrom = "NO_REPLY@SOMEWHERE.COM"
#
# Subject line
#
subject = "\"Siebel Component Failure\""
#
# SMTP server name
#
smtp = "SMTP.SERVER.COM"

################---Don't edit below this line---###################
FS = "[ ]{2,}"
blat = "blat.exe"
split(components, comps, ",")
};
# Flag components that are still starting up
$3 ~ /Starting Up/ { msg = (msg $2 " is starting up\n") }
$3 ~ /Shutdown|Offline|Shutting Down|Unavailable/ {
    for (c in comps) {
        if ($1 == comps[c]) {
            msg = (msg $2 " (" $3 ") @ " $4 "\n")
        }
    }
}
END {
    if (msg != "") {
        msg = ("Some components are in an invalid state, please investigate:\n\n" msg)
        cmd = (blat " - -to " emailto " -subject " subject " -server " smtp " -f " emailfrom " -q")
        print msg | cmd
        print "Sent mail!"
    }
};
I know that this script could be much enhanced, but it works well for a couple of hours of work.

Wednesday, September 21, 2011

Consuming Siebel Web Services with InfoPath

Continuing my work on InfoPath, I've completed the web service interface in Siebel based on very simple web service contracts that are easy for the InfoPath rules engine to use. I will walk through the process of setting a reference to the web service and configuring an InfoPath form to use the service.

Firstly - InfoPath uses the WSDL file that Siebel produces on the Inbound Web Service page to create the internal proxy. If you don't know what WSDL is then it's best to learn about it and then continue.

1. Save the WSDL from Siebel to a convenient location.
2. Open an InfoPath form and open the Data Connections window, available from Tools, Data Connections.
3. Click on Add...
A wizard starts - it's important that the Receive Data option is chosen; with this method you can specify inbound parameters. In our case, we want to specify the SR number as well as the name of the activity plan to attach.
4. The next dialog asks for the source/destination of the data connection. Select Web Service.
5. Here we have to specify the location of the WSDL file we saved from Siebel earlier. If we were connecting to a .Net web service we could provide the URL, which would auto-generate the WSDL on the fly.
6. The next step involves picking the specific operation we want to use. If the web service has multiple operations we need to pick only one. I've created a web service from a workflow process, so there will only be one operation.
7. The next dialog asks us if we want to specify any default values. In this case there are none, but there are scenarios where we want to call a statically defined service (let's say to get a specific List Of Values type) and we don't want to use rules to set the initial parameters.
8. We can choose to fire the web service call when the form is launched - perhaps to refresh a set of values retrieved from an external source (again the List Of Values idea comes to the fore). In this scenario we don't want to choose this option, as the form will execute the service based on rules.
9. On the next page we can choose to pre-cache the data from the data connection. Again there are valid scenarios for this option (like a list of values query) where we want to pre-cache a set of values in the InfoPath template. It will save on network traffic if constant refreshing is not needed. (Great for combo-boxes on the form.)
10. That's it. Quite a few steps, but the wizard interface breaks all of the options down into an easily digestible set.

I'll go over how to actually use the web service in a future post. (Code free!)

Wednesday, September 14, 2011

InfoPath Integration with Siebel

I have worked out the front half of the integration with InfoPath: the creation, prepopulation and launching of the InfoPath form. I'll write about that in a separate article. However, I've been scratching my head over the back half of the integration: saving the form and updating Siebel.

The basis of the design is to use Siebel for workflow and assignment; data capture will take place in InfoPath. The number of data capture points is very large and subject to complex rules - a perfect fit in my mind. The interesting part is how we want to push the captured data in two directions:
  1. the XML document that is generated will be stored in a SharePoint document repository (but linked to via an SR attachment URL)
  2. data captured in the form should drive elements in Siebel such as Service Request Due Date (commit time) and priority. In addition, if the data capture indicates multiple follow-up steps (they almost all do), then activity plans should be attached to the Service Request.
While InfoPath has a very rich object model and supports C# coding, adding any code immediately requires the form to be digitally signed. Doable, but not fun, and I don't want to start messing around with certificates, especially if non-programmers are going to be designing the forms primarily. Thus my objective with InfoPath is to be "code-free" and base all of the decision-making logic on the built-in rules engine.

The one limitation of InfoPath rules when calling secondary data sources is that we can only set values on existing nodes, not create nodes. For example: I have a Siebel Service Request integration object with a ListOfActivityPlan node that has a one-to-many relationship with the child activity plans. I can't create additional activity plan nodes from within the form rules; I would have to resort to code, thus violating my rule above.

I thought about this one quite a while and considered a number of options:
  1. HTTP-posting the entire InfoPath form into Siebel and then writing a complex business service or workflow to handle the logic of deciding which activity plans need to be attached.
  2. send the form to an asynchronous middleware system which could do the necessary transformations into a Siebel SR Integration object. I didn't like this one because the user might be disoriented that the plan didn't generate synchronously. This would be frustrating for the user if they had to reassign an activity but had to keep re-querying the applet until the plan(s) was/were attached.
  3. write a very simple web service interface for Siebel based on a custom (simple) external schema that allowed for one activity plan to be attached per call.
I decided on the last approach as it was the simplest and still kept the business logic in the form, beside the data that drives the logic. (I'm sure there are arguments for externalizing the logic, but this was the direction we went in.) The one downside is that if there are multiple plans to attach, multiple web service calls have to be made, with the resultant performance hit. I'm willing to take it on the chin for that, given the other trade-offs.

This is what I have so far:
InfoPathServices.xsd - this holds the request/response pairs for the web service. I consider this to be a contract-first approach that is in line with how Siebel is architected.


These were then added to Siebel using the new object dialog:
Then I used the Integration Object Builder. I picked a project and then I used the EAI XSD Wizard to import the integration object from the xsd file.
Pick the element that you want to import, give it a unique name in the repository, click Next a couple of times, and you will have some new entries in your integration object list.

I'm using a workflow for my web service, so I defined the in and out process properties to be the integration objects I just created. The image below shows what I have so far; it isn't complete because I need to create the outbound object, but it's close and it simulates well.

Note: I used the Workflow Utilities.Echo method to get the values out of the integration object. This was tricky because of the 75-character limit for "dot" notation. I used the Alias data type to define the XPath-ish retrieval pattern and was able to get my properties.

I also know that I could have used the transformation engine to create an integration object, but that was more work than this approach. If my inbound object had more than a couple of fields, I would have used either the DTE or XSLT to do a transformation.

Monday, September 12, 2011

Configuring the MSMQ Receiver Component

Understanding and configuring Named Subsystems is critical for successful use of the EAI Transports. This is a powerful yet relatively misunderstood part of the Siebel Architecture.



@lex at Siebel Essentials has a great article on Named Subsystems so I won't repeat what he said best.

However, there are a few things that will still cause you to pull your hair out: advanced parameters. These aren't visible in the UI and can only be managed using the command line tool. Get to know srvrmgr.exe; it is your friend - really.
  • To work effectively with the tool, make sure you spool out the results and then use an external editor to view the query results. I use BareTail as it automatically scrolls. One can also limit the columns that are emitted, but that requires remembering each column specifically and then adding those to the command when executing.
list advanced parameters for named subsystem MSMQReceiver
  • If using a dispatch ruleset (highly recommended), this is where you set the name of the ruleset
change parameter RollbackOnDispatchError=False for named subsystem MSMQDataSubsys
  • Xml Converter service - when working with the dispatch service ensure that the ConverterService parameter is set to "EAI XML Converter"
change parameter ConverterService="EAI XML Converter" for named subsystem MSMQDataSubsys
Once your Subsystems are configured you can create (or modify the existing) background component to use these subsystems:
  • "Receiver Data Handling Subsyst" parameter
  • "Receiver Connection Subsystem parameter
You will then still need to specify the Receiver method name to work with the data handling and connection subsystems.
I used ReceiveDispatch, as I wanted to use a rules-based routing approach (the workflow process/BS is specified in the rule) with a separate return path. One can also specify ReceiveDispatchSend to use the same connection parameters for a response.

Thanks

Tuesday, September 6, 2011

Multiple repositories in Siebel Tools

Normally one develops using a single repository but upon occasion - especially when working with support - there is a need to do some testing and/or comparisons with the vanilla repository.

One approach is to load the repository into the server DB using the "Repository Migration Configuration Utility" with a new name - such as "Siebel Repository - Vanilla". One can then connect to the server DB using Tools and export sif files for comparisons. This works well for comparisons, but if one wants to hunt for that one "breaking change" by making a quick local change to the repository and then compiling, managing multiple local repositories is preferable.

Managing Multiple repositories: there are actually two ways of doing this:
  1. Maintain multiple local databases and multiple tools configurations
    • Con: unable to use the Compare Objects --> Other repository command
  2. Single database with multiple repositories integrated
    • Pro: able to use the Compare Objects --> Other Repository command
    • Pro: easy to switch between repositories in one window
I use the second approach as it is a little simpler to manage.

Steps to extract a second repository into your local database (all steps take place within Siebel Tools):
1. Tools --> Check Out
Check out dialog
Choose repository

Click on Get


After clicking on Get, some warning dialogs appear:
Initial warning

Second warning
These warnings can be dismissed without concern.
Final Warning
Finally, after the Get has completed, a dialog appears indicating that you have multiple repositories, including one that isn't active.

After I've retrieved the repository I'll lock it locally, make a change and cut a new SRF. If I put it on the server I can quickly check whether that one change was the "breaking change", which is invaluable for providing feedback to Oracle Support.

[Edit]

I posted a link to this article on a LinkedIn group and the group's creator (an Oracle Support engineer) commented, pointing to the following article in Oracle Support:

Multiple Repositories Causes Task Based UI (TBUI) Field To Not Show Any Values [ID 1297783.1] 

Please keep that in mind when using multiple repositories - thanks, Valter.

Monday, September 5, 2011

MSMQ Inbound Integration

We have built a custom middleware solution at our organization that leverages MSMQ for a lot of workload balancing internally. We have also started to push the MSMQ concept to our application partners as we find that it provides a resiliency to network outages and the various maintenance windows that are required for patches and upgrades.

We haven't done much MSMQ work with Siebel until recently as there wasn't much documentation on the MSMQ transport. However, given the potential, we recently invested the time and effort to make this happen.

These are the requirements that the middleware imposed:
  1. have one inbound queue on our Siebel application server
  2. this one inbound queue would be the channel for multiple message types that might require different processing in Siebel.
  3. once processing is complete (or an exception takes place) processing status for the inbound message needs to be echoed back to the middleware server to a specific message queue



The diagram above illustrates the components and the flow of control and data.

Important points:
  1. The MSMQ Receiver is configured as Receive and Dispatch and not Receive, Dispatch and Send as I wanted to specify a different response queue depending on the situation.
  2. Using the EAI Dispatcher allows the design to accommodate multiple workflow processes based on incoming data. It also allows for parameter injection into the invoked workflow process. (I've found other methods of making workflows environment agnostic, but this is great!)
  3. Sending acknowledgement using an asynch server request is key so that the original message can be committed off the queue - this took me a long time to figure out.
Sending a response was packaged into a workflow process that accepts a message id plus some sort of error message if required. This is then sent back to the middleware system using a standard xml document contract as an xsd/integration object. Middleware then knows to remove the sent message from its internal processing queues.
Getting the right XML converter configured on the MSMQ Receiver was important so that the dispatcher's (quasi) XPath rules would function properly. In this case the following search expression worked:
/*/SiebelMessage/ListOfShippingHistoryIO
The "/*/" at the beginning accomodates the Siebel-Property-Set root node.

This setup is working well - it's high performance and outage resilient. Just remember to set a reasonable Time-to-Receive on the MSMQ message so that if there is a problem on the Siebel side, the message will bounce back to the sender.

Sunday, September 4, 2011

EAI HTTP Transport

Little known fact: the HTTP transport supports PUT in addition to the documented GET and POST. Just set the HTTPRequestMethod parameter to the literal PUT.

Also, for bonus points, the transport supports NTLM authentication out of the box. There is an Oracle Support ticket ([ID 758617.1]) that indicates it is not supported, but that is wrong. I did a Wireshark capture from our Siebel server and found that not only does it support NTLM, it pre-authenticates, thus reducing the number of round trips required.

---
These undocumented facts were discovered while attempting to integrate InfoPath and SharePoint with Siebel. As part of our process to bind the products together, I needed to create an instance XML document of an InfoPath form and then upload it to SharePoint (thereby storing only the link in the SR attachments table).

Several approaches were tried, with varying complexity, but none satisfied until I discovered that the HTTP Transport supported PUT. (The instance document is created by an XSLT on the server.)
1. Have the workflow process save the instance document to a WebDAV share from SharePoint. It works from Windows Explorer but not from the EAI File Transport Send method.
2. I explored using the SharePoint web service APIs and creating a Java business service - the complexity was too high for something this simple. It had a code smell that I didn't like. Plus, this would be our first Java business service, so there would be a significant server deployment footprint.
3. Use the MSXMLHTTP COM object in a custom business service. The code is dead easy because SharePoint supports a simple PUT to a document library. I was in the process of starting to code the new business service when I decided to just try inserting PUT into the business service method arg.

So, if your target server wants an HTTP PUT, you can send one using the built-in HTTP Transport.

Tuesday, August 30, 2011

InfoPath and Siebel integration

Why would one want to integrate InfoPath and Siebel? Here are some reasons that I thought of:
  • Siebel data is highly structured - it's a relational database after all
  • InfoPath is based on semi-structured data - it's document focused
  • Siebel releases take months
  • InfoPath forms can be updated and released the same day
InfoPath may already be in use as a flexible data capture tool that can be line-of-business and process specific. It has a great declarative designer that can be used by non-programmers and technical BAs. Also, because it's based on XML, the form data can travel wherever XML is accepted.

Siebel, on the other hand, is a great work package management tool for directing and assigning work to teams and users; it manages the life cycle of work. Stitching the two tools together is not so easy and requires a bit of thinking about how things need to work together.

Siebel Workflow Wait steps for input validation

One of the most common requests I see involves comparing old row values to proposed row values. This is used for input validation or conditional execution of business logic.

For instance, we wanted to trigger an update to the employee profile when an activity status changed from one specific status to another.

This is easily achieved through code using the PreSetFieldValue and WriteRecord events with a shared variable in the buscomp code. Looking for a workflow approach, I was able to use a simple workflow with the Workflow Utilities.Echo method invoked right after the start step and then again right after a wait step with the WriteRecord event attached to it. Simple - see the picture below for an example workflow using the Asset Management business object:



The Get Original Value step output argument is configured like this:

Property Name: OriginalStatus
Type: Business Component
Business Component Name: Asset Mgmt - Asset
Business Component Field: Status

The Get New Value step output argument is configured like this:
Property Name: NewStatus
Type: Business Component
Business Component Name: Asset Mgmt - Asset
Business Component Field: Status

This approach works well if you are capturing the events on the primary buscomp of your business object. For example: the Asset Mgmt - Asset bc of the Asset Management bo. Great!

What if you want to capture the events of a child bc? Not so easy - after trolling around Oracle Support I came across one document that looked promising: [ID 496724.1]. The problem with this approach is that I was unable to get the values from before the write event; in other words, by the time the wait step was reached, the new values had already been copied into the BC.

I updated my workflow to include a Workflow Utilities.Echo step before the wait step, but instead of retrieving the values from the BC, I used the expression syntax. It worked beautifully.

The Get Original Value step is configured with output args like this:
Property Name: OriginalStatus
Type: Expression
Business Component Name: Asset Mgmt - Asset
Value: [Status]

The Get New Value step output argument is configured like this:
Property Name: NewStatus
Type: Expression
Business Component Name: Asset Mgmt - Asset
Value: [Status]