Sunday, May 2, 2010

Script Editor Extensions for Extranet 5.6

One of my first posts to this blog announced the availability of an experiment I was working on to enhance the Workflow Script editor so that it supports syntax highlighting and automatic indentation. The experiment worked well enough that several of you gave it a try and are still using it even today. That’s not to say there weren’t rough edges and annoyances. There most certainly were, but I told myself that if I ever revisited this experiment, I’d keep those in mind.
Recently, I’ve been wanting to update the editor so that it would be compatible with Extranet 5.6, and this weekend I took the time to do just that. I’m happy to announce version 2.0 of the Script Editor Extension.
All previous features were retained:
  • Automatic Syntax Highlighting for JScript
  • Automatic indentation according to JScript syntax rules
Version 2.0 includes some new features too:
  • Extensions for both the Workflow Script Editor and the Command Window
  • Completely rewritten so the download package no longer includes base Extranet pages. This does require a minor change to the standard editor pages, but it is as simple as adding a single symlink; nothing already on the base pages is altered in any way. My hope is this will make it easier to adopt new base enhancements as they are released.
  • Deployable as a standard Update via the Administration manager
  • Automatically resizes the editor as the page is resized
  • The color scheme used for syntax highlighting matches the standard color scheme used in Site Designer and Entity Manager.
If you are already on Extranet 5.6 and are interested in giving this a try, just download the package and follow the included instructions.
Though this is provided as-is and is not part of the standard product, I’d love to hear your feedback. I plan to use this myself, and if you encounter any issues, there’s a good chance I’ll post updates.
Cheers!

Friday, April 23, 2010

Retrieving production WOM Logs using only the browser

From the “Did you know” department comes a nice little reminder about a standard feature that is often forgotten.

Have you ever found yourself being asked to troubleshoot a problem on production? The first question I always ask is, "What’s in the WOM Log?" only to find out that, though I can browse to the site, I’m not allowed to remote into the server. I wholeheartedly support the security rule preventing server-level access, but that doesn’t change the fact that I want to take a peek at the WOM Log. Enter the WOM Log Center to save the day.

As a Site Manager on the site, you have the ability to get your hands on the WOM log files any time you want via the WOM Log Center. This feature allows you to package up all Log files last modified within a specific date range into a single zip archive and download them to your desktop for your reading pleasure. You even have the ability to download the zip as an encrypted file to hide data from prying eyes.

You can find this handy utility right on your site in the Site Options tab on the Site Administration page.

Cheers!
- Tom

Monday, April 19, 2010

Hybrid CDTs and Cloning

It’s exciting to see an idea thrive in the wild. It’s through real-world use that an idea is truly tested, and the Hybrid CDT concept has held up well for the most part. Recently, however, a corner case was discovered that I thought you should know about. First, a little background on cloning…
Cloning in one form or another has existed in the starter sites for a long time. It’s used for a variety of purposes: initially to support the Amendment process, and later to support things like Project templates. I’ve even seen it used as a means to generate large volumes of test data. In order for cloning to work, it must respect a certain set of rules. These rules allow the result of the clone (i.e. the copy) to adhere to all the structural and relational rules that define a project. Some examples include:
  • The copy must have a unique ID
  • All Data Entry CDTs must be uniquely owned by the project
  • Persons, Organizations, and Selection CDTs must not be duplicated as they are considered to be shared resources, referenceable by multiple projects (among other things).
To make it easier to adhere to all of these rules, there is an Entity Type named EntityCloner which provides all the methods you need to clone a project or any other entity graph. These methods have the rules of cloning baked into them.
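For illustration, invoking it might look like the line below. This is a hedged sketch: the clone method itself is mentioned later in this post, but its exact signature and return value are assumptions you should verify on the EntityCloner EType.
// Hypothetical invocation; confirm the real signature in Entity Manager.
// originalProject is assumed to be a project entity already in hand.
var clonedProject = EntityCloner.clone(originalProject);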
So what does this have to do with the Hybrid CDT approach? Keep in mind that the Hybrid CDT approach uses the default behavior for Data Entry and Selection CDTs in a unique way to take advantage of the native user interface controls, thus avoiding the need to craft a custom UI. To accomplish this, we use two CDTs: A Data Entry CDT that “wraps” a Selection CDT. From a UI perspective this gives us the flexibility that we need. However, from a data persistence perspective we’re actually deviating from the standard definition of a Selection CDT. Rather than the selection CDT entity being a shared resource that can be referenced by multiple entities, we change the rules a bit and only allow the selection CDT to be referenced by the data entry CDT that wraps it. In turn, being a data entry CDT, it can only be “owned” by one project. The combination of the two really represents data that should be managed as a data entry CDT entity.
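As a minimal sketch of the pairing, assuming the MyChildDE/MyChildSE types and the customAttributes.child reference from the original Hybrid CDT post (customAttributes.name is invented for this example), reading the wrapped data looks something like this:
// wrapper is assumed to be a MyChildDE (Data Entry) entity obtained from your
// project; it holds a reference to the Selection CDT entity carrying the data.
var child = wrapper.getQualifiedAttribute("customAttributes.child");
if (child != null) {
    // By the Hybrid CDT convention, this selection entity is referenced only
    // by this one wrapper rather than shared across projects.
    var value = child.getQualifiedAttribute("customAttributes.name");
}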
The clone method in EntityCloner, however, has no idea that we changed the rules in the case of the Hybrid CDT, so it still clones Data Entry CDT entities but does not clone Selection CDT entities. This means that, following a clone, the same selection entity is referenced by both the original and the copy. This should not be allowed. To overcome this problem, we need to enhance EntityCloner to be aware of this new special case. The good news is that the enhancement is very simple to implement. The bad news is that it requires a base enhancement. Here’s what you need to do…
Via Entity Manager, edit the EType named EntityCloner. On that type, there is a method named isClonable. This method is called by the clone process to determine if an entity should be cloned. It returns a Boolean where true means clone and false means don’t. In that method, you’ll find the logic that makes this determination. The part we’re interested in looks like this:
if
(
    entitytype == Document    ||
    entitytype == WebPage     ||
    (entitytype.inheritsFrom(CustomDataType) && entitytype._usage == "dataEntry") ||
    entitytype.inheritsFrom(CustomAttributesManager) ||
    entitytype == HistoricalDocument ||
    entitytype == ResourceHistory
)
{
    clonable = true;
}
You will want to enhance it by adding the new selection CDT condition as shown here:
if
(
    entitytype == Document    ||
    entitytype == WebPage     ||
    (entitytype.inheritsFrom(CustomDataType) && entitytype._usage == "dataEntry") ||
    ((entitytype.inheritsFrom(CustomDataType) && entitytype._usage == "selection") &&
     (entitytype.hasTypeMethodNamed("isClonable") && entitytype.isClonable())) ||
    entitytype.inheritsFrom(CustomAttributesManager) ||
    entitytype == HistoricalDocument ||
    entitytype == ResourceHistory
)
{
    clonable = true;
}
This will cause the method to determine whether a Selection CDT entity is to be cloned by asking the entity’s type rather than just blindly saying “don’t clone”. The next step is to add the isClonable() method as a per-type method to each of the Selection CDTs that exists as one half of a Hybrid CDT. The method you will add looks like this:
function isClonable()
{
    try {
        // this selection CDT is used as part of a Hybrid CDT pair so entities of this type need to be cloned.
        return true;
    }
    catch (e) {
        wom.log("EXCEPTION _YourSelectionCDT.isClonable: " + e.description);
        throw(e);
    }
}

As EntityCloner is a base Extranet type, you will need to track this change in case you need to reapply it after a base Extranet upgrade, but once implemented, you can safely clone projects that leverage the Hybrid CDT technique.
Cheers!

Monday, March 15, 2010

Upgrades, New Sites, and New Projects – Oh My!

Since we last “spoke,” those of us in Click Professional Services have participated in the successful upgrade of existing customer sites to Extranet 5.6, helped usher in brand new compliance sites, and kicked off several new projects. It’s only fair to mention that none of this would be possible without your significant efforts. Don’t be shy, you know who you are.

Let’s take the process of upgrading to Extranet 5.6, for example. While we can help with all the technical bits and guide your effort, an upgrade will never go smoothly without the diligence of your technical and business staff to regression test your site and verify that there are no upgrade surprises. From a calendar-time perspective, this is the biggest part of any upgrade. This past weekend was the culmination of a well executed upgrade project for an IRB and CTRC site. Though I stayed within reach of my computer all weekend just in case I was needed, I received only a couple of calls and the upgrade completed without any serious complications. This is a testament to the work done by the site owner to properly test and prepare for the actual upgrade, and it proves that a well executed upgrade doesn’t need to be characterized by a plethora of crossed fingers. I’m not saying it can be completely stress free, as there are always those little worries and time pressures, but those who are well prepared can keep those worries from defining the effort. Congratulations on a well executed upgrade!

Also this past weekend, I supported the rollout of a brand new IACUC and IBC site. New site deployments are often tricky: the communication plan, which includes all the pre-launch training, the “new site is on its way” announcements, and all the promises up the management chain, really increases the stress level. First impressions of a site are so critical to its near-term success that not meeting the commitments made about the whens and whats negatively contributes to those first impressions. On the flip side, rolling out the site as promised, on time and with the advertised features, is the first step in establishing the long-term success of the site. No pressure, right? The number of variables involved in rolling out a site also contributes to the stress level. Did the legacy data load work? Were any of the steps missed? Can the users properly log in? I’m so tired, why didn’t I eat a good breakfast this morning? Did I forget to turn off the stove? OK, those last two may seem unrelated, but once we start down the path of worrying about all that can go wrong, the lines seem to blur a bit. Or is that just me?

Just like an upgrade, most sources of angst can be minimized through proper preparation, practice, and testing. This means allocating time in the schedule for a “let the dust settle” period between development/test and deployment. This period is used to do all the little things that are often forgotten, such as triple-checking the things that caused you to lose sleep and making sure you turned off the stove. Hopefully this will help you gain a small amount of calm confidence.

Even if you aren’t able to achieve the desired level of calm, don’t worry. Realize that your users are new to this site as well, and you know more than they do about how to use it. Expect new issues to be discovered after going live, and expect a series of configuration updates to be applied to incrementally correct and tune the site. The feedback you will get over the first few weeks (and beyond) is a great source of information and informs your release planning process. Constructive feedback is not only a sign that users are using the site but also that they are invested in its success. If you haven’t already put your SDLC into action, now is the time, because you have actual users, live data, and a steady stream of future enhancements to push out the door. Accept the fact that a site is never done, but by all means celebrate your achievement of going live! Now where’s the party?

Cheers!

Ideas that worked and those that haven’t (yet)

Over the past year, I’ve released a few development techniques into the wild, not knowing if they would return to bite me or flourish. I’m happy to report that I’ve yet to be bitten. Here are the techniques that are starting to flourish:

  • Hybrid CDT – I’m aware of this technique being deployed in at least 5 customer sites so far with great success.
  • ProjectValidationRule – To my knowledge this is being used in 3 customer sites so far, and I’ve already been getting enhancement requests, which is a good sign.
  • Using jQuery and Ajax in your site – To be honest, this technique shows a pretty basic pattern, so it’s difficult to say whether my post spurred you to action or you came up with it on your own. Still, I’m personally aware of 6 sites which use this or a variation, and I’ve seen a couple of very impressive uses of Ajax to implement things like inline form expansion to facilitate the entry of CDT data instead of using the standard popup. I have no doubt we’ll be seeing more examples of this approach over the coming months, including in the base Extranet product. I’m looking forward to the Ajax-driven choosers in Extranet 5.7, for example.

I also blogged about a couple of experiments I was trying which, unfortunately, have yet to work out. They are:

  • Modules and Solution support in Process Studio – For those of you unfamiliar with this feature, it was released as a “technology preview” in Extranet 5.6 (which basically means it hasn’t been verified to work in all cases). The goal of this feature is to allow you to categorize your workflow elements into different modules and to define Solutions composed of those modules. When you build a configuration update, you can select which solutions to include. It’s an exciting idea because, when it works, it will allow you to manage separate release schedules for each of your solutions that coexist in a single site. I gave it a try on a recent project and was able to identify a few issues that make it unusable in its current version. This effort was useful, though, because we were able to identify the issues and, through practice, suggest to the Engineering team the short list of fixes that would be required before it can be tried again. I’m still very excited about this feature and will definitely give it another go when the few critical issues have been addressed.
  • Object Model documentation approaches – I’ve blogged about various data modeling techniques as well as the importance of establishing your data model as early in the project as possible. One of the challenges in getting a model implemented is how to review the object model design before making the investment to actually implement it. I explored various tools using a few key evaluation objectives and eventually settled on two open source tools: StarUML and Doxygen. I was very pleased with the results when they were used as an initial design tool but less pleased with the fact that they weren’t integrated into the overall implementation process. What was missing was round-trip engineering support. Once a model is designed, it would be great to auto-create or update the Project Types, Custom Data Types, and Activity types in the development store. It would be equally nice to publish the implemented model back into StarUML. I can see how both could be accomplished but have failed to find the time to make it happen. I still use this technique for initial model design and review, but it becomes too costly to maintain once development is underway. One day, I hope to find the time to add round-trip support. In the meantime, the Applications team is doing some promising work to generate eType reference pages in real time. I like this approach because those reference pages are always accurate. The downside is that the model needs to actually be implemented before those pages will work, and reviewing a design after implementing it seems a bit counterintuitive. The value of that technique dramatically increases after the model has been reviewed and implemented for real.

There are a couple of other techniques which I hope will be useful but that I haven’t heard about since releasing them into the wild:

  • Extranet to Extranet Single Sign-On. I’d love to hear if you have put this technique into practice.
  • Script Editor Update – One of my earliest posts. It’s been fun to stumble upon customer sites where this has been put into practice. It wasn’t perfect by any means, and I still have the list of improvements some of you suggested and hope to implement them some day. I’ll verify that it works on Extranet 5.6 and, if it doesn’t, publish a new version. If you already know whether it does, please let me know.

So, no real failures but a couple of experiments that need more time to mature and a couple that haven’t flourished…yet. The efforts that flourished are enough for me to want to keep throwing ideas your way. If you find them useful (or even if you don’t), let me know. I’d love the feedback.

Cheers!

Tuesday, January 26, 2010

Kicking off the new year in high gear

We're nearly one month into 2010, and I'm excited about all that's happening around the office. Here are a few highlights:

  • CCC membership is growing fast and that means that all of us in Professional Services are keeping very busy and we continue to grow our staff to meet the ever increasing demand. This is good stuff and I welcome all the new CCC members.
  • We have our first Grants customer live on Click Commerce Extranet 5.6 and within days of going live they electronically submitted their first grants application to Grants.gov using Click Commerce SF424 1.7. Milestones like this are always rewarding.
  • We just completed delivery of our Introduction to Process Automation course last week.
  • C3DF preparations are in high gear and I hope to see many of you here next month.
  • Our upcoming Advanced Workflow Configuration course, scheduled for the Monday immediately following C3DF is already full and more have signed up for the next offering. This is personally exciting to me because one of the things I enjoy most is sharing what can be accomplished on the Extranet platform.
  • We've kicked off a project with the nation's largest IRB (I'll let you wonder who this may be since the official press release isn't out yet)
  • We're kicking off several other projects as well including the NASA IRB project, though we're disappointed that the meeting won't include a trip to the International Space Station. I can only imagine the battle to be a member of that project team if it did!
  • We've kicked off our first Grants project that will leverage a completely new approach to budget grid implementation. This new Grid approach is very exciting and will rapidly spread out beyond Grants and surface in other solutions such as Clinical Trials, Participant Tracking and Animal Operations.
  • We've begun to work through best practices for defining and implementing a new generation of reports. I'll definitely be talking more about this as we roll out our Enterprise Reporting Services.

It's been a frenzied start to the new year and I've been unable to return to this blog since the relative quiet of the holidays. Realizing that things aren't going to slow down any time soon, I can either accept the fact that I'll go dark on this blog or somehow find the time to keep things relatively current. It's not for a lack of topics; I rarely find myself without something to say. Whether or not it's something you actually want to hear is up to you, and I certainly hope to hear from you either way. Not being one to accept defeat so easily, I sit here at my laptop late at night, writing down whatever comes to mind. Hopefully it will make sense to at least some of us. ;-)

In several of my previous posts, I promised I would revisit topics to let you know how things are going and I plan to do just that in my next post. Beyond that, if you have any thoughts on what you want to see in future posts please let me know. I'd love to hear about them. Keep those cards and letters coming!

Cheers!

Thursday, December 24, 2009

A time to reflect

As I wrap up the first year of my second stint with Click Commerce, I find myself looking back over the year with a real appreciation for the company and the wonderful customers I get to work with. Rejoining Click last December as Services Manager after nearly 2 years away took me full circle as I started in the services group way back in 1999, when Click Commerce was still Webridge, before moving into the Engineering team to manage our product's transition into the Research and Healthcare market.

Since returning, I've had the pleasure to get to know the Professional Services development team that has grown significantly in my absence and continues to grow to keep pace with the burgeoning Click Compliance Consortium membership. I've also had the pleasure to work with many of you on your deployment efforts and am impressed by the work you have done.

I started this blog with my first post on February 27th as my own personal experiment without a real idea of what I would write about and an apparently unrealistic goal of writing something each week. As I'm sure you've seen, I fell short of that goal quickly, but that didn't mean the desire wasn't there. It's just that real work quickly consumed my time. I guess that was to be expected. I look back now and realize that this is my 28th post and even more amazing to me is that it's actually being read. It's this last revelation that keeps me going and I thank you for your indulgence. You guys are amazing!

A while back I decided to try to address my angst about whether this effort was worth continuing by collecting some usage stats. As I look at them now, I'm gratified (and frankly shocked) at how many of you have taken a peek. I've had visitors from 11 countries. Though I consider visits from countries other than the US and Canada random internet noise, one exception is the inexplicable 5.32% of visits coming from Barueri, Brazil. Whoever you are, do you need an onsite visit? ;-) Within the US and Canada, readers have come from 27 different states and 2 provinces. Though this medium is mostly anonymous, I have heard from a few of you and I'm grateful for the feedback. If there's something you want to hear more about, please let me know and I'll do my best to oblige.

2010 looks to be another exciting year. New customers, new solutions, and continued growth will keep us all very busy. In the midst of all that, we've begun planning the details for C3DF to be held here in our offices on February 24-25. I hope to see many of you there as the agenda looks like it will be packed with a lot of great information, including some interesting customer presentations. In addition, I'll be teaching our advanced development course the following week. For those of you planning to enroll, you might consider staying in Portland between the two events. Can you say "Ski weekend?"

I hope you all have a happy and safe holiday season.

Cheers!

Friday, December 4, 2009

Centralizing your implementation of validation rules

It’s your business rules that allow you to turn all the building blocks provided by Click Commerce Extranet into your own solution. Business rules manifest themselves in the names of the workflow states you use, in how a project moves through its lifecycle and the actions available to the user at each step along the way, in how you define your users and security policies, and in the information you collect and the criteria against which that information is verified. All of these configuration choices, supported by the Click Commerce Extranet platform, are what make your site uniquely yours. Verifying information according to your institutional requirements involves the implementation of Validation Rules. This post will present one approach to their implementation.

Consider this validation rule:

In your IACUC solution, if animals are to be sourced through donation, you require that there be a description of the quarantine procedures that will be used.

Sounds like a reasonable requirement, right? So, where would you enforce that rule? A key advantage of the Click Commerce Extranet platform is its flexibility, but sometimes determining the best approach requires that you weigh the pros and cons. This rule is an example of a conditionally required field. Here are a few of the most common approaches to implementing this type of rule:

  1. SmartForm branching
    You can enforce the rule by separating the selection of animal source and the follow-up questions onto different SmartForm steps and using a combination of SmartForm branching and required fields. This is by far the easiest implementation because it allows you to take advantage of the built-in required fields check and can be accomplished without any additional scripting. It does, however, require that the questions be separated into multiple SmartForm steps. This isn’t a big deal if there is already an additional step where the follow-up question could be placed, but this may not always be the case.
  2. Conditional Validation Logic
    With a bit more work you can keep the questions on a single view and implement conditional validation logic in a script (see the sketch following this list). This allows you to keep the fields together, but the follow-up question will be visible to the user in all cases. You will need to include instructional text in the form to let the user know that the second question is required if the initial question is answered in a particular way.
  3. Conditional Validation Logic with Dynamic Hide/Show
    With yet even more work, you could dynamically show the relevant dependent questions only when the user is required to answer them. They would otherwise be hidden. This technique is the subject of an upcoming post, but it’s important to understand that the enforcement of the validation rule is still accomplished through custom script.
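
Here is a minimal sketch of option 2 applied to the donation rule above. It is hedged throughout: getQualifiedAttribute and wom.log appear elsewhere on this blog, but the attribute names, the function’s signature, and the way errors are reported back to the form are assumptions you will need to adapt to your own site.

function validateAnimalSource(project, errors)
{
    try {
        // Assumed attribute names; substitute the real ones from your CDTs.
        var source = project.getQualifiedAttribute("customAttributes.animalSource");
        var quarantine = project.getQualifiedAttribute("customAttributes.quarantineProcedures");

        // The follow-up question is required only when animals are donated.
        if (source == "Donation" && (quarantine == null || quarantine == "")) {
            errors.push("Please describe the quarantine procedures to be used for donated animals.");
        }
    }
    catch (e) {
        wom.log("EXCEPTION validateAnimalSource: " + e.description);
        throw(e);
    }
}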

By far, the easiest implementation is option 1 because the Extranet application will perform all the validation checks for you. But what if your rule doesn’t fit within a simple required field check, or your users won’t let you separate the questions onto different SmartForm steps? In these cases, you will have to implement some logic. Knowing that, you are still faced with the decision of where to put it. Again, there are options:

  1. Add custom logic in a View Validation Script
    This is the preferred approach as the configuration interfaces are exposed via the standard Web-based configuration tools.
  2. Override the Project.setViewData() method
    This technique has been replaced by the View Validation Script; before that script hook was available, this was the best place to add custom logic. It is no longer the recommended approach.
  3. Override the Project.validate() method
    This method is called when validating the entire project, as happens when the user clicks on Hide/Show errors in the SmartForm or executes an activity where the “Validate Before Execution” option is set. It is not invoked, however, when a single SmartForm step is saved, so it is really only a good place for Project-level validation.

Option 1 is the most common approach and is much preferred over option 2. Both approaches allow you to enforce validation rules whenever a view (or SmartForm step) is saved. This is what I call “view validation.” All view validation rules must be met before the information in the view can be considered valid. This means that a user cannot save changes or continue through the SmartForm until all rules for the current step are met. This is applicable to most needs, but not all. Let’s consider another rule that also must be enforced:

In order for a protocol to be submitted for review, the PI and Staff must have all met their training requirements.

Enforcing this rule when posting a view or SmartForm step would be overly restrictive. The PI and Staff should be able to complete the forms even if their training is incomplete. The rule is that the PI cannot submit the protocol for review until all have met the training requirements, so a View validation won’t work. What is needed is Project-level validation, which can be accomplished by overriding the Project.validate() method.
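
As a sketch (again with assumptions), an override enforcing the training rule might look like the following. The customAttributes names and the trainingComplete flag are invented for illustration, and you should confirm how your base Project.validate() reports failures and whether the supertype’s logic runs automatically before relying on this pattern.

function validate()
{
    try {
        var valid = true;

        // Assumed attribute names for this sketch; adjust to your data model.
        var pi = this.getQualifiedAttribute("customAttributes.principalInvestigator");
        if (pi != null &&
            pi.getQualifiedAttribute("customAttributes.trainingComplete") != true) {
            // Surface the failure however your site reports project-level
            // validation errors; here we simply mark the project invalid.
            valid = false;
        }

        return valid;
    }
    catch (e) {
        wom.log("EXCEPTION Project.validate: " + e.description);
        throw(e);
    }
}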

So this means that some rules are implemented in a View Validation script using the web-based configuration tools, while other rules are implemented using Entity Manager. This approach definitely works and is seen in a large number of sites. The downside is that code is maintained in different places and script is required for all rules.

In addition, none of these scripting approaches takes into consideration that many rules follow common patterns. For example:

  • If field A has value X, then a response for field B is required, or
  • Field A must match a specific pattern such as minimum or maximum length

What if these patterns could be formalized into a standard way of defining a validation rule? What if all rules were defined in the same way and in the same place? Would that make implementation, testing and maintenance easier? I certainly think so.

Introducing the ProjectValidationRule implementation

ProjectValidationRule is a Selection Custom Data Type whose sole purpose is to serve as a central place to define and execute your site’s validation rules. Some of you may have previous experience with a CDT named “SYS_Validation Rules,” and any similarities you see are not a coincidence. That type provided the seeds from which this new implementation was grown. It allows rules that follow common patterns to be defined without authoring any additional script and provides the flexibility to define whether the rule is to be enforced at a View or Project level. As an added bonus, you can also define activity-specific rules.
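
Purely as an illustration of the idea (not the package’s actual schema), a conditionally-required-field rule could be captured as data along these lines; every field name below is invented for this sketch:

// Hypothetical shape of one pattern-based rule, expressed as data instead of
// script. See the downloadable package for the real definition.
var rule = {
    scope:         "view",                                  // enforce on view save vs. project level
    ifField:       "customAttributes.animalSource",         // trigger field
    ifValue:       "Donation",                              // trigger value
    requiredField: "customAttributes.quarantineProcedures", // field that becomes required
    message:       "Describe the quarantine procedures for donated animals."
};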

Download the ProjectValidationRule package for specific implementation details. This approach is still evolving; it has proven very effective so far, but there is always room for improvement, so feedback is always appreciated.

Cheers!

Sunday, November 8, 2009

Extranet-Centric Single-Sign-On

You’re ready to expand your use of Click Commerce into multiple modules (or perhaps you already have) and have elected to separate them onto more than one physical server. That’s great! There are a lot of good reasons to do so. Perhaps you have different development teams for the different solutions who work on different release schedules, or you want to align the different servers across organizational lines (Human Research, Animal Research, and Grants, for example), or you’re simply taking a pragmatic approach to managing your continued expansion. Whatever the reason, the approach is becoming increasingly common, especially as the number of deployed Click Commerce modules increases.

Now that you have made that choice, you need to address all of the little integration issues. One such issue is how to streamline authentication so that a user doesn’t have to log in to each server. For those of you who have implemented a Single-Sign-On (SSO) solution such as Shibboleth or CA SiteMinder, this issue is already handled. But what if you don’t have an Institution-wide SSO implementation? Whether you take advantage of Delegated Authentication to an external credential source such as Active Directory or LDAP or are using Click Commerce Extranet’s built-in authentication engine, your users will typically have to log in to each site.

I recently completed some work for a customer to eliminate this hassle by allowing one Extranet-based site to be used as the SSO authentication source for any other Extranet-based site. The implementation is simple enough to apply to other sites, including your own, that I thought I’d share it with you. As this implementation deals with security-related subject matter, I’m going to ask you to continue reading about this new capability on ClickCommerce.com. Sorry for the inconvenience, but better to keep secret stuff just between us. As an added bonus, I’ve packaged the entire implementation into a download that you can use for your own sites. In the download you will find an explanation of the implementation, requirements, and installation instructions.

As always, I’d love to hear how this works out for you.

Cheers!

Implementing a Hybrid CDT - UPDATE

This is a follow-up to my earlier post on implementing Hybrid CDTs.  If you haven’t yet, I encourage you to read that post first or very little of this will make sense.

I’ve been pleased to hear from those of you who have taken this approach and identified areas within your own configuration where it provides value. Some of you have asked about what happens when an entity is deleted and I thought I’d follow up with some additional detail as I didn’t directly address that case in the original post.

The deletion case is relatively simple to address but does require another trip to Entity Manager. Every eType has a built-in method called @unregister, and Custom Data Types are no exception. This method is called whenever an entity is being deleted from the site. For the deletion to take effect across all references to the associated Selection CDT entity in real time, you will also need to delete that Selection CDT entity. This is done by implementing logic in the @unregister method of the Data Entry CDT type that also unregisters the associated Selection CDT entity.

In the example described in the original post, you would implement an @unregister method on the MyChildDE type that looks like the following:

function unregister()
{
    try {
        // the supertype @unregister is called by default by the framework

        // If the data entry entity (this) is unregistered, then also unregister
        // the referenced Selection CDT
        var child = this.getQualifiedAttribute("customAttributes.child");
        if (child != null) {
            // This is the complete solution, but actually unregistering the
            // selection CDT entity at this time might have an adverse
            // performance impact. If that is the case, it might be better to
            // simply remove the entity from all sets.
            child.unregisterEntity();

            // If, in your data model, MyChildSE entities are only referenced as
            // members of sets, you can improve performance by limiting the work
            // done at this time to the removal of the child entity from all
            // sets and deferring the remaining deletion until the next time
            // Garbage Collect is run.
            // removeFromAllSets(child);

            // Even more performant would be to explicitly remove all known uses
            // of the child entity, but this approach requires ongoing
            // maintenance as it needs to stay in sync with wherever the
            // Selection CDT can be used.
        }
    }
    catch (e) {
        wom.log("EXCEPTION _MyChildDE.unregister: " + e.description);
        throw(e);
    }
}



With this extra bit of code, the effect of a user deleting the data entry entity is immediately visible: any reference to the associated selection entity is removed as well.


Cheers!