Table of Contents
1. Introduction
1.1 What is KnowledgeWorker®
1.2 Document History
2. Free Functionality
2.1 Customization functionality in the standard interface
2.1.1 Categories & Attributes
2.1.2 Saved Searches
2.1.3 Workflows / Business Processes
2.1.4 MIME Types
2.1.5 Manage Page Text
2.2 Additional Components
2.2.1 HTMLPage.html
2.2.2 KWXMLUploader
2.2.3 Import Vault
2.2.4 External Search
2.2.5 Workflow Trigger
2.3 Workflow
2.3.1 Generate Workflow Title
2.3.2 Generate Letter
2.3.3 Select Performer
2.3.4 Email Notification
2.3.5 Escalation
3. Architectures
3.1 User Interfaces
3.1.1 The Browser
3.1.2 Office Applications
3.1.3 Search and Retrieval
3.2 Server Components
3.2.1 Interface Virtual Directory
3.2.2 COM Components
3.2.3 Using COM Methods
3.2.4 Database
3.2.5 Vaults & Vault Manager
3.2.6 Notifications
4. Rules for Customizing
5. Areas of Customization
5.1 Multilingual Support
5.2 Branding
5.3 Custom Forms
5.4 Creating a Custom Workflow Form
5.5 Custom Handlers
5.6 Creating a Custom Workflow Handler
5.7 Event Handlers
5.7.1 Example Use of Workflow Events
5.8 Custom Menus
5.9 Home Page Entries
5.10 COM Components
5.11 Debug Hints
1. Introduction
This document describes the activities that a developer working with KnowledgeWorker® might undertake in order to customize functionality. This guide assumes that the developer is familiar with the content of the other documentation provided, including:
- User Guide.
- Workflow Guide.
- System Administration Guide.
This document does not fully document the API (interface methods) for the KnowledgeWorker® COM components or the database schema, but it supplies information to help developers understand the key elements used in customizations.
1.1 What is KnowledgeWorker®
KnowledgeWorker® is a collaborative Knowledge and Content Management solution. Designed for rapid deployment over the Internet or a corporate Intranet/Extranet, KnowledgeWorker® gives organisations the ability to capture, analyse, apply, and reuse information to make faster, smarter decisions from anywhere in the world – at any time – day or night.
KnowledgeWorker® provides a Central Library for the creation and storage of information in any file format, a Search Engine to find information, Project Collaboration to share information and a Workflow Engine to automate business processes and move information intelligently around an organisation. Access to information is via a web browser, thereby removing geographic and technology constraints. KnowledgeWorker® will allow you to:
- Improve customer service and quality.
- Avoid duplication of effort.
- Improve staff performance.
- Handle help desk and enquiry calls.
- Maintain ISO9000 compliance.
- Automate business processes.
- Achieve substantial cost saving.
- Manage all documentation, including e-mails.
KnowledgeWorker® has been built using Microsoft web-based technologies for e-business needs. KnowledgeWorker® utilises Internet technologies to provide a robust, high performance solution that can be rapidly deployed in any vertical market.
1.2 Document History
This section outlines the change history of this document in relation to the builds that have been created for KnowledgeWorker® Version 3.1
| Version | Date | Sections | Notes |
|---------|------|----------|-------|
| 3.1.0 | 7 Mar 05 | All | New for Version 3.1 |
| 3.1.1 | 23 May 05 | All | Updated with Peer Review comments |
2. Free Functionality
KnowledgeWorker® has been designed to provide a flexible solution to fit any vertical market. This flexibility is provided within the interface through features such as the Workflow Designer, and through external components that are made available to developers.
The standard KnowledgeWorker® interface provides the functionality to define:
- Categories and attributes to suit user requirements.
- Saved Searches for an item, data record or workflow.
- Workflows / Business Processes using the Workflow Designer, including features to generate documents.
- Mime types that can be extended to support any application.
- Screen text in the Manage Page Text.
The external components provide the functionality to define:
- The addition of HTMLPage.html to a folder.
- A workflow trigger, to launch the appropriate workflow based upon external XML data.
- A VB COM component for workflow event handling.
- A Web Service interface for workflow event handling, together with a KnowledgeWorker® Web Service to receive responses and update the corresponding workflow tasks.
- External Search functionality to plug into another web site, such as an internet or intranet site.
- KWXMLUploader to take document content into the repository, assign category data to the document based upon the content of an XML file and set the relevant permissions.
2.1 Customization functionality in the standard interface
2.1.1 Categories & Attributes
KnowledgeWorker® provides a standard set of categories in the repository and the ability for any appropriately privileged user to create more categories. Categories have a number of different uses within the repository, including:
- Extending the information associated with a user, such as line manager and sign-off limit.
- Definition of data to capture with each item added to the document management system, such as client name, document usage and document expiry / update date.
- Providing private data tables, such as for look-up tables used for populating dropdowns or configuration tables.
- Fields for the data gathered throughout a business process.
The use of a category is defined by setting the scope, which is described in full in the System Administration Guide. In basic terms, each category can be viewed as a user definable database table or in simpler terms a spreadsheet and the attributes are simply a column within the table / sheet. As with any column, an attribute must have its data type defined with care, as once data is loaded it may be more difficult to convert the column to a different data type.
As with good database design, good category and attribute design can go a long way towards making customization and maintenance easier. Consider too holding global variables that are specific to the repository in a data category as this makes it easier to migrate the customization from one repository to another, such as the user acceptance test (UAT) environment to the production environment, without requiring code changes.
A category may have one or more attributes depending upon the data requirements for its scope of use. The internal name of the attribute must respect the relevant database rules for reserved words and punctuation. At this time, the browser interface does not validate that you have avoided reserved words or invalid punctuation. It is recommended practice to use an abbreviation of the category name as part of the attribute name, to help reduce the likelihood of using a reserved word.
Important: When the repository database is Oracle then the attribute name, being a schema object, is limited to 30 characters.
The attribute display name is the opportunity to present a user-friendly name to the user and perhaps suggest the content expected in the field.
There is a wide variety of input types in order that the category designer gains the appropriate input from the user, including entry fields, checkboxes, radio / option buttons, dropdowns and multiple select lists. Some input types allow the data to be sourced from another category in the repository or even an external database and there is the opportunity to create parent / child relationships for dropdowns (see the System Administration Guide for more details). Consider using data categories for the population of dropdowns or selection options as this will make it easier to maintain or re-order these fields to suit user requirements and reduce the risk that in future the number of options will exceed those that can be held in the format values field. The format values field is limited to 255 characters.
The data type for the attribute / column should be selected to suit the input type and the input that might be expected from the user. No automated check is made that these two types (input and data) align, so it is important that the developer gives due consideration to the appropriate selections. In some cases, such as dropdowns, the data type may depend upon the values assigned. For example, a dropdown containing the display values Yes and No might be held in a character field if Y and N were used as the values, or a number / integer field if 1 and 2 were used as the values.
The following table illustrates the possible data types, in both Oracle (OR) and Microsoft SQL Server (MS), for each of the input types that are available.
| Input Type | Date (OR) | Number (OR) | Varchar2 (OR) | DateTime (MS) | Decimal (MS) | Float (MS) | Int (MS) | Numeric (MS) | Real (MS) | Varchar (MS) |
|---|---|---|---|---|---|---|---|---|---|---|
| Entry Field | | X | X | | X | X | X | X | X | X |
| Multi-line Text Box | | X | X | | X | X | X | X | X | X |
| Checkboxes | | X | X | | X | | X | X | X | X |
| Option Buttons | | X | X | | X | | X | X | X | X |
| Dropdown | | X | X | | X | | X | X | X | X |
| Multiple Select List | | | X | | | | | | | X |
| Date List | X | | | X | | | | | | |
| Date Time Lists | X | | | X | | | | | | |
| Time List | X | | | X | | | | | | |
| Category Field Lookup (Multiple Checkboxes) | | | X | | | | | | | X |
| Category Field Lookup (Dropdown) | | X | X | | X | | X | X | X | X |
| Category Field Lookup (Multiple Select list) | | | X | | | | | | | X |
| Category Multi Select list with other entries | | | X | | | | | | | X |
| External Table Lookup (Multiple Checkboxes) | | | X | | | | | | | X |
| External Table Lookup (Dropdown) | | X | X | | X | | X | X | X | X |
| External Table Lookup (Multiple Select list) | | | X | | | | | | | X |
| User List | | X | X | | | | X | X | | X |
| User Multi Select List | | | X | | | | | | | X |
| Users & Groups List | | X | X | | | | X | X | | X |
| Users & Groups Multi Select List | | | X | | | | | | | X |
| Project Groups Multi Select List | | | X | | | | | | | X |
| Subcategory List | | X | X | | | | X | X | | X |
| Subcategory Reference | | X | X | | | | X | X | | X |
| Library Item Reference | | X | X | | | | X | X | | X |
| Node Table Folder List | | X | X | | | | X | X | | X |
| Thumbnail Reference | | X | X | | | | X | X | | X |
Normally, attributes used in a workflow are not made mandatory, because this is controlled by the selection made by the workflow designer. Making an attribute mandatory requires a value to be provided, and this can cause contention when no workflow task asks the user for input for that mandatory attribute early in the workflow instance.
2.1.2 Saved Searches
Saved Searches enable complex searches to be built to locate items, data records or process data and for these searches to be shared with less skilled members of the repository user community. All saved searches are stored as text files in the repository vaults.
The use of Saved Searches is described in the User Guide, which illustrates the three typical types of saved search that are available:
- Category.
- Data.
- Workflow Data.
It should be noted that for each of the above searches a number of attributes and filters might be used.
The Saved Searches are selected from the dropdown list and can be filtered using the buttons provided. The list of available saved searches for a user is restricted based upon the permission model that controls access to the saved search files. Therefore, a user who creates a saved search that is to be utilised by other users should adjust the permissions on the saved search to grant others the ‘see’ and ‘see content’ permissions.
Note - When the Advanced Search Engine (ASE) option is configured, an additional button called Advanced Search will appear here. The Advanced Search is based upon text and settings with which to conduct the search against the ASE, not attributes.
2.1.2.1 Category Search
The saved category search is used to locate objects, such as documents, in the repository. The saved category search will involve using the category search to construct an appropriate search across the data associated with items that have been added to the repository.
The construction of a category search is described in full in the User Guide. The category search can be used for multiple attributes in a single category, or for attributes from multiple categories, where the attributes for each category are defined.
The category search effectively constructs a SQL query on the category, and when multiple categories are used this builds a join between the categories.
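To illustrate the idea, the following sketch models two categories as tables keyed by a shared item identifier and shows how a two-category search becomes a join. The table and column names (`cat_client`, `cat_review`, `item_id`) are purely hypothetical illustrations; the real KnowledgeWorker® schema is not documented here.

```python
import sqlite3

# Hypothetical model only: each "category" is a table keyed by an item id.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE cat_client (item_id INTEGER, client_name TEXT);
    CREATE TABLE cat_review (item_id INTEGER, review_date TEXT);
    INSERT INTO cat_client VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO cat_review VALUES (1, '2005-06-01'), (2, '2005-07-01');
""")

# A saved search over attributes from both categories is conceptually
# a join between the two category tables.
rows = conn.execute("""
    SELECT c.item_id
    FROM cat_client c
    JOIN cat_review r ON r.item_id = c.item_id
    WHERE c.client_name = 'Acme' AND r.review_date < '2005-06-30'
""").fetchall()
print(rows)  # [(1,)]
```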
The saved search results for an item will reflect those items with the category directly assigned and those where the category has been assigned to a shortcut / link. The normal permissions model for the item(s) in the repository restricts the saved search results, and thus the user running the search will only have access to items for which they have been granted permissions.
2.1.2.2 Workflow and Data Searches
The Workflow and Data search functionality is related to data held in the repository rather than items and so the results do not have the same permission constraints. Access to the saved search however is still restricted by the permissions model.
A workflow search or data search is built by using the data link on the categories screen; see Manage Category Information in the System Administration Guide.
This Data link gives access to a simple data search page that provides the user with the ability to conduct a search on all data using only one value as the criteria.
The Advanced Search button provides the ability to build a search using complex criteria and to control the fields and the order of the results. The Advanced Search also provides additional features, including Hit Count and Save Search. Save Search allows the user creating the search to save it for subsequent use.
Once the search is saved it may be shared with other users of the repository by adjusting the permissions on the saved search to grant others ‘see’ and ‘see content’. Typically, permissions are assigned using groups rather than individual users to reduce future maintenance.
For workflows, the Advanced Search screen allows the user to define the preferred state of the workflows to be included in the results:
- All - Relates to any workflow instances, irrespective of their current state; this includes completed, suspended, stopped and active workflows.
- Active - This relates to any workflow instances that are still being progressed and open for the user community to work on.
- Completed - Relates to any workflow instances that have been taken to their conclusion by the user community.
From the Advanced Search results the user may also extract the data into a Microsoft Excel Spreadsheet and therefore have the ability to generate simple reporting functionality without the need to purchase large reporting tools, such as Crystal Reports.
2.1.3 Workflows / Business Processes
A workflow is a process-map, which intelligently routes a work package between groups of users, individuals, or application programs. A work package consists of electronic business forms with associated documentation and attachments that are required to complete a business process.
To turn a business process into a workflow, you must create a workflow map using the Workflow Designer. The Workflow Designer is a Java applet launched in the browser, which allows a user to map each of the tasks or stages of a process into a workflow map by ‘dragging and dropping’ the processes onto a workflow diagram.
The Workflow Designer provides a set of tools that can be dragged onto a palette to represent each stage of a business process. For each stage / point in the process, you can define the specific details to be displayed on an electronic form, and control the attribute fields a user is required to complete or amend. In addition, you may also define the performer of each stage, any template or referenced documents that are made available at each stage, any email or escalation criteria, and whether the workflow performer is granted the ability to generate a Word document based upon a template already in the repository. The full description of all this functionality is provided in the Workflow Designer Guide.
For the programmer, some of the information contained in the Workflow Designer Guide is repeated in the discussions about customizing workflows in a later section of this document in order that the programmer may understand in greater detail the functionality already provided and to ensure that any customizations do not adversely impact the standard functionality.
2.1.4 Mime Types
Each repository is delivered with a standard set of Mime types / icons for the document objects that might be placed in a repository. Initially, these standard Mime icons are held, for each Mime type, in the virtual directory of the application.
When the system administrator elects to make changes to the Mime types, normally by listing the current Mime Types, the Mime icons are copied into a custom vault so that they become part of the repository data set, just like the repository documents. The Mime icons will then have their own vault and will not be changed during subsequent upgrades of the application.
IMPORTANT: The Mime icon vault path is based upon the hostname for the vaults, so it is important to ensure that this is defined correctly in the system parameters page; otherwise the Mime icons will not be displayed correctly.
The Mime type for a document added to the repository is normally established either from the Mime type that is passed as part of the stream during upload or from the extension of the uploaded file. Typically, the Mime type associated with the uploaded file is tested first but, should this not indicate the Mime type beyond application/octet-stream, the file extension is tested against those listed in the repository.
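The two-step detection described above can be sketched as follows. This is a hedged illustration only, not the product's actual implementation; the `repository_mime_types` mapping stands in for the Mime types listed in the repository.

```python
GENERIC = "application/octet-stream"

# Illustrative stand-in for the Mime types defined in the repository.
repository_mime_types = {
    ".pdf": "application/pdf",
    ".doc": "application/msword",
}

def resolve_mime(stream_mime, filename):
    # Step 1: prefer the Mime type passed with the upload stream,
    # unless it is missing or only the generic octet-stream type.
    if stream_mime and stream_mime != GENERIC:
        return stream_mime
    # Step 2: fall back to the file extension, tested against the
    # Mime types listed in the repository.
    ext = "." + filename.rsplit(".", 1)[-1].lower() if "." in filename else ""
    return repository_mime_types.get(ext, GENERIC)

print(resolve_mime("application/pdf", "a.bin"))  # application/pdf
print(resolve_mime(GENERIC, "invoice.pdf"))      # application/pdf
print(resolve_mime(None, "unknown.xyz"))         # application/octet-stream
```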
The system administrator can add, edit, or delete Mime types and their associated icons; for further details refer to the Mime Types section of the System Administration Guide. It is advisable to establish all the required Mime Types immediately after creating the repository because typically the introduction of a new Mime type will only impact documents that are subsequently added to the repository. Those created before the Mime type is defined will remain listed but with the reference to the Mime type assigned when they were added.
Previously assigned Mime types can be adjusted manually by versioning the document, as the Mime type is detected for each version of the document. The Mime type is held as part of the document / version information held in the Document Table of the repository.
2.1.5 Manage Page Text
Each installation of the KnowledgeWorker® application is delivered with a standard set of text, for each page, held in an XML file. There is an XML file for each language that can be selected at the login screen and there is also a properties page built into the workflow designer.
The system administrator can examine the content of the XML file from the system administration pages by selecting the Manage Page Text link from the ‘System Administration’ menu. Each page in the application is listed and the text for a given page is displayed by selecting the Action Edit.
The Edit action will display the current labels and the text associated with the label.
Change the relevant text and then select Save to write the changes to the XML file.
IMPORTANT: Although the system administrator appears to be able to change the text associated with a label, this will not be saved unless the file properties of the relevant XML file have been changed to editable. This is a precaution to prevent accidental changes being saved. It is advisable for the system administrator to retain control of changes made to the XML file and remove write permissions when not conducting changes.
Please note that the XML file(s) may be updated when the application is upgraded. Therefore, a careful note must be made of all changes made to the XML files so that these can subsequently be reapplied.
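One way to make reapplying such changes repeatable is to record them as a small table of overrides and apply them with a script. The sketch below is purely hypothetical: the real page-text XML schema is internal to KnowledgeWorker®, so the `<pages>/<page>/<label>` layout and the label names used here are invented stand-ins.

```python
import xml.etree.ElementTree as ET

# Recorded label changes to reapply after an upgrade replaces the file.
# Keys are (page name, label id); both are hypothetical examples.
overrides = {("Login", "welcome"): "Welcome to the Acme repository"}

# Hypothetical stand-in for the delivered page-text XML file.
sample = """<pages>
  <page name="Login">
    <label id="welcome">Welcome</label>
    <label id="user">User name</label>
  </page>
</pages>"""

tree = ET.ElementTree(ET.fromstring(sample))
for page in tree.getroot().iter("page"):
    for label in page.iter("label"):
        key = (page.get("name"), label.get("id"))
        if key in overrides:
            label.text = overrides[key]  # reapply the recorded change

welcome = tree.getroot().find("./page/label[@id='welcome']").text
print(welcome)  # Welcome to the Acme repository
```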
2.2 Additional Components
This section describes several components that can be utilised with the application and / or repository with minimum programming / configuration. These components are not typically discussed in other documents but may provide a toolkit for other functionality that a programmer / developer is asked to implement.
2.2.1 HTMLPage.html
This can hardly be described as an additional component, as it is a built-in feature that is available to any folder. It has been included as an additional feature because typically it requires someone with knowledge of HTML to configure it.
Each folder in the repository can have a document added with the node name and document name, ‘HTMLPage.html’. The effect of adding this page is that the HTML defined in the page is used to create a title bar for the folder. This can be used to add mission statements, logos, and even more complex HTML to aid repository navigation (see the illustration below).
By adding a separate HTMLPage.html to each folder then it is possible to create a separate look for each folder in the repository.
The HTML inside HTMLPage.html must conform to some basic rules, including:
- All URLs must have an absolute address, because the page is processed from the vault.
- All HTML must be well structured and tested, as the application support team cannot be held responsible for any badly constructed HTML.
- All images must have absolute addresses.
- Any scripting (JavaScript / VBScript) must be contained within the page and not make external references, such as to include files, except via absolute addresses.
Note - All users of the folder must be granted See and See Content permissions on the HTMLPage.html in order that the folder is displayed correctly.
To create the above illustration:
- Create a text file and add the text illustrated in the HTML source below:
<html>
<head>
<title>Example Header Page</title>
</head>
<body>
<img src="http://www.reddot.com/images/logo_us_reddot.gif">
<a href="http://www.reddot.com/company_awards.htm">Latest News</a>
</body>
</html>
- Save the text file as HTMLPage.html on the local machine.
- Add the file HTMLPage.html to a folder in the repository with the title HTMLPage.html.
- Change the permissions on the file to ensure that all users of the folder have ‘see’ and ‘see content’ permissions.
2.2.2 KWXMLUploader
The component KWXMLUploader provides the means to import multiple documents into the repository, including assigning permissions and category data. KWXMLUploader assumes that each document to be imported has an XML file that describes:
- Title.
- Security Model.
- Location of the file to be uploaded.
- Location in the repository.
- Category / attribute data to be assigned to the uploaded document.
The KWXMLUploader is an executable program that can be run as a scheduled task to upload a defined maximum number of files. It is frequently used to support data migrations and integrations with other systems that can generate the XML and source document. For example, KWXMLUploader has been utilised with a legacy ERP system that uses FormScape to generate the XML and PDF document file, before KWXMLUploader automatically adds the information to the repository. It is highly likely that KWXMLUploader will be an integral part of functionality that developers use to integrate with scanning software, such as Kofax, because of its flexibility. The illustration below shows the primary components of the KWXMLUploader:
The KWXMLUploader executable is launched with a parameter that represents the source that it will be handling. Each source is defined in the configuration file that exists in the same directory as the KWXMLUploader executable. For each source, the configuration file identifies a number of important elements including directories, the user credentials and the repository, the email to / from address for any error message(s), the security category that will define the permissions and the maximum number of items to process in each run.
The security category is the identifier for the data category that holds the permissions that are to be applied to the documents in the repository. The Security Data category has the following fields:
| Attribute Name | Display Name | Input Type | Description |
|---|---|---|---|
| userorgroupid | User or Group Identifier | Users and Groups list | Data type is the identifier for a single user or group. |
| documenttype | Document Type | Dropdown | Can be a category lookup dropdown but defines the document types that might be found in the XML files. |
| access | Permissions | Checkboxes | The permission is created by combining the values: |
There can be multiple entries in the data security category for each document type, each row representing the permissions for a user or a group that will be applied to each document of the defined document type that is added by KWXMLUploader.
Problematic XML files are moved to the holding directory. The error log holds the error message, which is also included in the email sent to notify the administrator that a problem has been identified.
The XML file for each document supports the following elements of information:
- Source Application - This allows the KWXMLUploader to be used for multiple sources, such as FormScape, CAD etc.
- Document Type - This is used with the security category to establish permissions automatically for each document type, such as Invoice, Purchase Order.
- Category Name - This will hold the display name of the category to be added.
- Attribute Name - Each attribute in a category will be represented by the internal attribute name, not its attribute display name.
- Attachment - The full path to the file that is to be imported to the repository, typically a local or mapped drive on the application server.
- Library Access - Will the file be present in the library (Yes or No).
- Library Parent - The node to be used as the location / parent for the item.
- Title - Name for the item in the Library.
- Version - Check for existing item of this name (Yes/No).
- Version max - Maximum number of versions to be supported; exceeding the version maximum will cause the automatic purge of the oldest version.
These elements are represented in three groups:
- The Header information required for every document.
- The Library information required for every document for which navigation is required.
- The Category information.
The header information contains the details necessary to create the file regardless of its location in the repository, and this information would be included in any error report should the upload fail. There can be only one header element per file:
<HEADER>
<SOURCE>FormScape</SOURCE>
<DOCTYPE>Invoice</DOCTYPE>
<TITLE><![CDATA[Invoice 602239-51E]]></TITLE>
<ATTACHMENT>
<![CDATA[D:\KW-outputs\Invoice 602239-51E.pdf]]>
</ATTACHMENT>
</HEADER>
The library information is an optional element and, if present, is used to define the library location for the attachment. There can be only one Library element per file. The version-related information must be provided for future consistency:
<LIBRARY>
<TITLE><![CDATA[The document title]]></TITLE>
<PARENT>5039</PARENT>
<VERSION>Yes</VERSION>
<VERSIONMAX>5</VERSIONMAX>
</LIBRARY>
The category information element can appear any number of times depending upon the category and attribute data to be associated with the item in the repository. Typically, a category with a file scope will be used as the first category, followed by media categories, as these will be defined as the category on shortcuts / links for the document.
Each category element will include the category name and an attributes element. The attributes element will contain the attribute tags for the category, and these will correspond to the attribute names given in the repository. The file category will not be added automatically and therefore must be defined, if required.
<CATEGORY>
<CATEGORYNAME><![CDATA[PDF Archive File]]></CATEGORYNAME>
<ATTRIBUTES>
<doctype><![CDATA[invoice]]></doctype>
<refno><![CDATA[254125/25P]]></refno>
<owner><![CDATA[PSR]]></owner>
<prime_no><![CDATA[254125/25P]]></prime_no>
<stroke_list><![CDATA[25P 26W 27T]]></stroke_list>
<account><![CDATA[123542J]]></account>
<name><![CDATA[Matthew Hall Limited]]></name>
<cust_ref><![CDATA[XP45/DRN-Epoch 23]]></cust_ref>
<cust_ord_no><![CDATA[212548/251-2001]]></cust_ord_no>
<mnth><![CDATA[September 09]]></mnth>
<yr>2002</yr>
<cal_date><![CDATA[27/9/2002]]></cal_date>
</ATTRIBUTES>
</CATEGORY>
The following assumptions are made as part of the implementation of KWXMLUploader:
- All XML exported for consumption by KWXMLUploader will conform to the XML format defined in this document.
- Any category information provided can only be applied if the category name exactly matches that defined in the repository.
- Any problems with the category information provided will cause the attribute information associated with the category not to be added but will not halt the import of the document. The file will still be created but the category and attribute values will be missing, and a warning report will be issued in the log file.
- Any attribute information provided for a category can only be entered if the attribute name exists in the repository and exactly matches that defined in the category.
- Any problems with attribute information failing to match the data type defined in the repository will cause the category not to be added to the imported file. Such failures may be the result of invalid data types, or of fields indicated as mandatory not being present in the import data.
- The Document Type in the HEADER is used as the key to the security permissions defined in the Document Type Security category. The entries for the Document Type will be used to control the security permissions applied to the uploaded files, along with the inheritance from the parent folder, if in the library.
- The version details are checked on submission of the document, and the new version will be added before purging the oldest versions, but only if the <version> tag is set to yes. If the <version> tag is set to yes but there is no <VERSIONMAX> tag, then it is assumed that there is no version limit. The <version> and <versionmax> tags need only be completed if version management is required; in most cases it is not. Version management slows the process because a text match on the document name is required to locate the document.
- The category name is validated via a text match. This carries the risk that if someone changes the name of the category in the repository, KWXMLUploader will cease to function.
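The version-handling rules above can be sketched as a small decision function. This is a hedged illustration of the stated rules only, not KWXMLUploader's actual code; the function name and parameters are invented for the example.

```python
def versions_to_purge(existing_count, version, versionmax):
    """Return how many old versions to purge after the new version is added.

    Follows the rules stated in the assumptions: no purging unless
    <VERSION> is yes, and no limit when <VERSIONMAX> is absent.
    """
    if str(version).lower() != "yes":
        return 0            # version management not requested
    if versionmax is None:
        return 0            # no <VERSIONMAX> tag: no version limit assumed
    total_after_add = existing_count + 1
    return max(0, total_after_add - int(versionmax))

print(versions_to_purge(5, "Yes", 5))    # 1 (oldest version purged)
print(versions_to_purge(2, "Yes", None)) # 0 (no limit)
print(versions_to_purge(9, "No", 3))     # 0 (versioning off)
```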
The following sample shows what the complete XML file for the import of a file called “Invoice 602239-51E.pdf” looks like. Although some data is replicated, this ensures that regardless of the category elements or library element there is sufficient information to add the file to the repository and apply the relevant security permissions:
<XML>
<HEADER>
<SOURCE>FormScape</SOURCE>
<DOCTYPE>Invoice</DOCTYPE>
<TITLE><![CDATA[The document title]]></TITLE>
<ATTACHMENT>
<![CDATA[D:\KW-outputs\Invoice 602239-51E.pdf]]>
</ATTACHMENT>
</HEADER>
<LIBRARY>
<TITLE><![CDATA[The document title]]></TITLE>
<PARENT>5039</PARENT>
<VERSION>Yes</VERSION>
<VERSIONMAX>5</VERSIONMAX>
</LIBRARY>
<CATEGORY>
<CATEGORYNAME>FILE</CATEGORYNAME>
<ATTRIBUTES>
<NODE_NAME><![CDATA[The document title]]></NODE_NAME>
</ATTRIBUTES>
</CATEGORY>
<CATEGORY>
<CATEGORYNAME><![CDATA[PDF Archive File]]></CATEGORYNAME>
<ATTRIBUTES>
<doctype><![CDATA[invoice]]></doctype>
<refno><![CDATA[254125/25P]]></refno>
<owner><![CDATA[PSR]]></owner>
<prime_no><![CDATA[254125/25P]]></prime_no>
<stroke_list><![CDATA[25P 26W 27T]]></stroke_list>
<account><![CDATA[123542J]]></account>
<custname><![CDATA[Matthew Hall Limited]]></custname>
<cust_ref><![CDATA[XP45/DRN-Epoch 23]]></cust_ref>
<cust_ord_no><![CDATA[212548/251-200]]></cust_ord_no>
<mnth><![CDATA[September 09]]></mnth>
<yr>2002</yr>
<cal_date><![CDATA[27/9/2002]]></cal_date>
</ATTRIBUTES>
</CATEGORY>
</XML>
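As an illustration of the rules above, the following Python sketch (not part of the product; the parsing approach and function name are the author's assumptions) reads the HEADER and LIBRARY sections of such a file and applies the versioning defaults, including the assumption of no version limit when the <VERSIONMAX> tag is absent:

```python
import xml.etree.ElementTree as ET

SAMPLE = """<XML>
  <HEADER>
    <SOURCE>FormScape</SOURCE>
    <DOCTYPE>Invoice</DOCTYPE>
    <TITLE><![CDATA[The document title]]></TITLE>
    <ATTACHMENT><![CDATA[D:\\KW-outputs\\Invoice 602239-51E.pdf]]></ATTACHMENT>
  </HEADER>
  <LIBRARY>
    <TITLE><![CDATA[The document title]]></TITLE>
    <PARENT>5039</PARENT>
    <VERSION>Yes</VERSION>
  </LIBRARY>
</XML>"""

def read_upload(xml_text):
    root = ET.fromstring(xml_text)
    header, library = root.find("HEADER"), root.find("LIBRARY")
    versioned = library.findtext("VERSION", "No").strip().lower() == "yes"
    version_max = library.findtext("VERSIONMAX")
    return {
        "doctype": header.findtext("DOCTYPE"),     # key into Document Type Security
        "attachment": header.findtext("ATTACHMENT"),
        "parent": int(library.findtext("PARENT")),
        "versioned": versioned,
        # <VERSION> yes with no <VERSIONMAX>: assume no version limit
        "version_max": int(version_max) if (versioned and version_max) else None,
    }

info = read_upload(SAMPLE)
```

Note how the absent <VERSIONMAX> tag results in an unlimited version count, as described in the rules above.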
2.2.3 Import Vault
The import tool is an executable tool that can be used to adopt whole file system areas as vaults in the repository. This approach assumes that the client has large areas of clean, well-structured data that they wish to introduce to the repository.
The import tool will reproduce the folder structure it encounters in a selected location within the repository and introduce the documents into the folders. It can only be run once on any given file system area and will cause all files in the area to be renamed using the KnowledgeWorker® file system naming convention. Permissions to the area must be restricted to the KnowledgeWorker® user following import, as changes made to the file system outside the repository can damage the ability to access documents, and any additions will go undetected.
The import tool requires that the database be updated with the scripts provided before the tool is run, as this enables the file path to be stored. The import process can then be started, and it will prompt the user for:
- Login details for the repository, including username, password, and server.
- The location in the repository in which to place the imported structure.
- The name of the vault that will also name the virtual directory.
- The location in the file system of the file structure to be imported, such as C:\exstaff or a UNC path.
- The username and password needed to access the file structure; this is mandatory if a UNC path or remote location is provided.
The Import process walks the folder structure and introduces the same structure into the selected location in the repository (the library, a project area or personal workspace). The files are renamed with the appropriate node identifier and the folder structure is flattened as per a normal vault area. The folder structure is also published as a virtual directory available to the Microsoft Index Server so that it can be searched in the same manner as any other application vaults.
Once the import process has completed, any files that have not been added are reported to the user. The user is then responsible for adding the omitted files manually and applying the appropriate permissions and category / attribute data.
Once the import process completes, the area has effectively been adopted into the repository and documents should be controlled via the repository. The permissions on the imported items and the category / attribute data must still be applied manually by the system administrator of the repository.
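The walk-and-flatten behaviour described above can be sketched as follows. This is an illustration only: the real import tool also renames the physical files, updates the database, and publishes the virtual directory, and the naming convention shown (node identifier plus original extension) is an assumption:

```python
import os, tempfile

def import_area(source_root, next_node_id):
    """Walk a file system area, assign each file a node identifier and
    record the flattened vault name it would receive.
    Returns a mapping of original path -> (node id, vault filename)."""
    mapping = {}
    node_id = next_node_id
    for dirpath, _dirnames, filenames in os.walk(source_root):
        for name in sorted(filenames):
            ext = os.path.splitext(name)[1]
            # flattened vault name: node identifier plus original extension
            mapping[os.path.join(dirpath, name)] = (node_id, "%d%s" % (node_id, ext))
            node_id += 1
    return mapping

# demo on a small temporary structure: a file at the root and one in a subfolder
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "sub"))
for rel in ("a.txt", os.path.join("sub", "b.pdf")):
    open(os.path.join(root, rel), "w").close()

mapping = import_area(root, next_node_id=5000)
```

The flattening step is what makes it unsafe to modify the area outside the repository afterwards: once renamed, the files can only be located via the repository's own records.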
2.2.4 External Search
This section outlines the installation and use of the External Service provided as part of the application to allow items to be accessed from an external web site. There is often a need to publish or make available documents and process maps from within the repository either on an Intranet, Extranet, or Internet site. In most instances, it is not a requirement for the users to login to these sites or be exposed to the KnowledgeWorker® interface.
The External Service is aimed at removing the tedious administration task of copying the relevant documents to the site or Content Management System and maintaining these as separate document structures.
Having encountered this issue with a number of clients, the developers have introduced the External Service as a means by which data can be retrieved by the web site from the repository in a controlled manner.
The following diagram illustrates the functional components involved in the External Service.
The external user communicates with the External Pages for the External Service (ES-External Pages) and typically these pages are located on a separate server to the core application, such as a web server in the DMZ.
These ES-External Pages provide:
- Content Search Functionality (WebServices/EXT/search_ext.asp).
- Category Search Functionality (WebServices/EXT/KW_WS_Category.asp).
- Document Retrieval (WebServices/EXT/getdoc_ext.asp).
The ES-External Pages present the external users’ request in structured XML format to the Internal pages for the External service (ES-Internal Pages).
The structured XML does not include any user credentials as these are held via the ES-Internal Pages. The request from the ES-External Pages to the ES-Internal Pages is carried over HTTP or HTTPS, similar to a web service. As with any web service, the XML content and the page define the actions that are carried out; for the ES-Internal Pages this is limited to delivering output to the ES-External Pages.
The ES-Internal Pages unpack the structured XML request, validate that the XML request is recognised and then login to the repository as a defined user known only to the ES-Internal Pages to get the request fulfilled by the application.
The application fulfilling the request acts as an anonymous user (IUSR_<Computer Name>) to the web server, and the user credentials are used only to create a session token. This session token is then used to limit the information supplied to the external user: the external user cannot access any documents that would not be available to those user credentials in a normal application session with the repository.
The results of the successfully processed request are packaged into structured XML format by the ES-Internal Pages, the logged in user session token is terminated and the structured XML package is sent to the ES-External Page. The ES-External Page receives the response, unpacks the structured XML, and presents this to the external user.
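The key design point in the exchange above is that the request sent by the ES-External Pages carries no user credentials. The following Python sketch illustrates building such a request; the element names are hypothetical, as the actual request format is defined by the ES pages themselves:

```python
import xml.etree.ElementTree as ET

def build_external_request(action, terms):
    """Build the credential-free XML request an ES-External page might send
    to the ES-Internal pages over HTTP/HTTPS. The defined user, known only
    to the ES-Internal pages, is deliberately absent from this envelope."""
    root = ET.Element("KW_ES_REQUEST")       # hypothetical element names
    ET.SubElement(root, "ACTION").text = action
    ET.SubElement(root, "TERMS").text = terms
    return ET.tostring(root, encoding="unicode")

request = build_external_request("search", "invoice 2002")
```

Because the envelope never contains a username or password, a compromised external web server cannot leak repository credentials; only the ES-Internal Pages hold the defined user.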
2.2.4.1 Installation & Use
As a pre-requisite for the installation of the External Service where the user community is open to the Internet, there are a number of security considerations:
- the external web server should be hardened / locked down to prevent unauthorised access, such as removing or disabling unnecessary services, user accounts and websites.
- the external web site should be placed in a DMZ with adequate firewall rules to limit possible attack on the web server from the Internet, such as using NAT.
- the ES-External Pages should be placed in a website with a fixed IP address; it may be worth creating a website that is not the default web site, as the default site is the normal target for attacks by hackers.
- a port must be opened to allow the ES-External Pages to communicate with the ES-Internal Pages via HTTP, although the port number does not need to be port 80 as an alternative port designation may be given in the address to access ES-Internal Pages.
- the website for the ES-Internal Pages should be in a separate website with a fixed IP address.
- the ES-Internal Pages should be secured to limit communication to requests from the external web site running the ES-External Pages.
There is always a risk associated with an inbound connection to the internal network from the DMZ but with careful configuration this can be reduced to a minimum.
Installation of the External Service functionality involves several steps:
- The definition of a public access user and the granting of access to documents in the repository.
- The customization of the ES-External Pages for the clients branding.
- The creation of the external web site and internal web site.
- The configuration of the ES-External Pages with the address of the ES-Internal Pages, including any port mapping to get through the firewall.
- The configuration of the ES-Internal Pages for the defined user and repository.
- The configuration of the firewall rules to allow communication between external and internal website.
It should be noted:
- Multiple external and internal sites can be made so that various levels of access can be accommodated.
- The ES-Internal Pages use the concurrent licence in a transitory manner; therefore, if the ES-External Pages are likely to be very heavily used, it is important to monitor licences carefully.
2.2.4.2 Configuration of ES-External Pages
In the ES-External Pages edit the file common.asp so that the line below is the URL to connect to the ES-Internal Pages:
const URL_PATH="http://localhost/int/"
2.2.4.3 Configuration of ES-Internal Pages
In the ES-Internal Pages edit the file common.asp so that the constants for the defined user and repository are configured:
const USER_LOGIN="publicusr"
const SERVER_NAME="kworker"
2.2.5 Workflow Trigger
The Workflow Trigger has the potential to be widely used and has caused some excitement with clients who have forms on their web sites or third-party applications that need to launch business processes.
Often when a form is completed on a web site the data collected is sent via email to an address for manual processing. The user receiving the email then must copy the data, by retyping it or using cut-and-paste, into an internal system and track the request through a number of stages in a process.
Workflow Trigger offers the opportunity for the data, in XML format, to be directed into a web service associated with the repository. The web service, with the support of an executable application, can then identify the appropriate workflow process and trigger it in the repository. The form data, captured in the XML file, is copied into the workflow process, reducing the manual intervention required and speeding the request's journey into the business process.
This mechanism can also be considered for situations where events from third-party applications need to trigger a workflow process.
The Workflow Trigger functionality is designed to enable an XML file, supplied by a third party in the appropriate format, to be used to trigger a workflow inside the repository based upon the criteria defined in a configuration file.
This illustration shows the functional components involved in the workflow trigger.
The Workflow Trigger consists of the Workflow Trigger Web Service and the Workflow Trigger Execution Engine which both utilise information held in the Workflow Trigger Configuration file. The Workflow Trigger Configuration file contains a separate definition for each source that may provide XML content to the Workflow Trigger Web Service.
The Workflow Trigger Web Service receives the XML from the third-party application, which may be a web page or service, and this includes the category, attribute data and an indication of the data source, such as ‘catalogue request’.
The Workflow Trigger Web Service identifies the source name from the XML content and then uses the details in the Workflow Trigger Configuration file for that source name to determine the directory into which to copy the XML content. Each source name in the configuration file has separate locations defined for the XML content, holding area, error log and success log.
The Workflow Trigger Web Service is only responsible for placing the content into a file in the XML Content directory for processing and it does not process the XML further. Restricting the Workflow Trigger Web Service in this way reduces the risk of an external application overloading the application and denying service to the repository users.
The Workflow Trigger Execution engine is started using a batch file that triggers an instance of the engine for each of the active sources. Every 5 seconds, the Workflow Trigger Execution engine checks whether there are XML files in the XML Content directory to be processed for the defined source name.
When an XML file is present in the XML Content directory, the Workflow Trigger Execution engine, using the execution rules held in the configuration file, will identify whether the trigger clause has been defined. When the trigger clause is defined and true, the workflow whose node identifier matches that defined in the <WFID> tag for the source name will be invoked.
The Workflow Trigger Execution engine will populate the initial step of the workflow with the data provided in the XML file from the third-party application. Only attributes that are present and editable in the initial step of the workflow will be populated.
The Workflow Trigger Execution engine will then launch / trigger the workflow instance in the repository and record the outcome, including any errors, in the locations defined in the Workflow Trigger Configuration file.
The Workflow Trigger Execution engine will then check the XML Content directory for further XML files and process each in turn.
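The processing cycle described above can be sketched as follows. This is an illustration only: the real engine is an executable driven by the Workflow Trigger Configuration file, and the function names here are hypothetical:

```python
import os, shutil, tempfile, time

def process_once(content_dir, holding_dir, trigger_workflow):
    """Process every XML file currently in the XML Content directory.
    Successes are removed; failures are moved to the holding area so the
    system administrator can resolve them and return them for processing."""
    done, held = [], []
    for name in sorted(os.listdir(content_dir)):
        if not name.lower().endswith(".xml"):
            continue
        path = os.path.join(content_dir, name)
        try:
            trigger_workflow(path)          # launch the workflow instance
            os.remove(path)
            done.append(name)
        except Exception:
            shutil.move(path, os.path.join(holding_dir, name))
            held.append(name)
    return done, held

def run(content_dir, holding_dir, trigger_workflow, interval=5):
    """The engine polls the XML Content directory every 5 seconds."""
    while True:
        process_once(content_dir, holding_dir, trigger_workflow)
        time.sleep(interval)

# demo: one file that triggers cleanly, one that fails
content, holding = tempfile.mkdtemp(), tempfile.mkdtemp()
for name in ("good.xml", "bad.xml"):
    open(os.path.join(content, name), "w").close()

def demo_trigger(path):
    if "bad" in os.path.basename(path):
        raise ValueError("trigger clause could not be evaluated")

done, held = process_once(content, holding, demo_trigger)
```

Separating the per-file processing from the polling loop mirrors the design intent: the web service only drops files, and the engine alone decides when and how each is consumed.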
2.2.5.1 Installation & Use
A pre-requisite for the use of Workflow Trigger is that a normal workflow has been created against a category in the repository. The workflow should include the ability for the initial step to be populated by someone other than the process performer and have all the attributes that might be supplied externally as editable in the initial step.
The workflow should have been tested, and permissions should be set so that the user defined in the Workflow Trigger Configuration file has the 'See' and 'See Content' permissions on the workflow map.
Important: The workflow map must have been launched at least once to ensure that the content of the map file has been read into the database and this process must be repeated if the workflow map is subsequently versioned.
Installation of the Workflow Trigger functionality involves a number of steps:
- Edit WEBSERVICES\WF_XML_TRIGGER\WebServiceConstants.asp to identify the location of the Workflow Trigger Execution engine (WF_XML_Trigger.exe), typically found in the directory Addins\WF_XML_TRIGGER, and the location of the Workflow Trigger Configuration file (KWXML.XML), which must co-exist with the Workflow Trigger Execution engine.
- The creation of the Workflow Trigger Web Service (WEBSERVICES\WF_XML_TRIGGER).
- Edit the Workflow Trigger Configuration file (KWXML.XML) to set up the necessary information for each source name, such as user, repository, folder locations and trigger clause.
- Make a copy of the sample batch file WF_XML_TRIGGER.bat in Addins\WF_XML_TRIGGER and edit it so that it will launch the Workflow Trigger Execution engine for the relevant source name.
- Execute the batch file.
The Workflow Trigger Configuration file (KWXML.XML) has a key role to play in allowing the Workflow Trigger Web Service and the Workflow Trigger Execution engine to function. An example of the configuration file is:
<?xml version='1.0' encoding='iso-8859-1'?>
<KW_WF_TRIGGER_INI>
<SOURCES>
<SOURCE NAME = "ONE">
<USERNAME>system</USERNAME>
<PASSWORD>system</PASSWORD>
<SERVER>reddot040315</SERVER>
<HOLDINGAREA>C:\temp\WFT1\XMLHold</HOLDINGAREA>
<XMLCONTENTPATH>C:\temp\WFT1\XMLFiles</XMLCONTENTPATH>
<ERROR_LOGPATH>C:\temp\WFT1\Error\Error.log</ERROR_LOGPATH>
<SUCCESS_LOGPATH>C:\temp\WFT1\Error\Success.log</SUCCESS_LOGPATH>
<WFID>5056</WFID>
<WFCATEGORY>CategoryLookup_Cat</WFCATEGORY>
<TRIGGER_CLAUSE>
<ATTR_NAME>CatLookUp_Attr</ATTR_NAME>
<ATTR_VALUE>1</ATTR_VALUE>
</TRIGGER_CLAUSE>
<SET_EMAIL_FLAG>ON</SET_EMAIL_FLAG>
<FROMEMAIL>wfte@adimn.com</FROMEMAIL>
<TOEMAIL>sysadmin@admin.com</TOEMAIL>
</SOURCE>
</SOURCES>
</KW_WF_TRIGGER_INI>
</XML>
Each source name represents a processing definition, and the following defines the purpose of each element:
- The source name <SOURCE NAME = "ONE"> provides the link between the XML Content files and the Workflow Trigger Execution engine. The source name is case sensitive. Multiple source names can be supported in a single configuration file, but a separate Workflow Trigger Execution engine must be launched to handle each.
- The user credentials <USERNAME> and <PASSWORD> are used to connect to the repository and initiate the workflow.
- The repository name <SERVER> is used to define the relevant repository for the source name.
- The holding area <HOLDINGAREA> is a pending area for problematic XML Content files. When the Workflow Trigger Execution engine identifies an error, rather than deleting the XML Content file it moves it to the holding area so that the system administrator may resolve the error and return the file to the XML Content directory.
- The XML Content directory <XMLCONTENTPATH> is where the Workflow Trigger Web Service places the XML Content files for processing when their data indicates that they are for processing by this source name.
- The error directory <ERROR_LOGPATH> defines the location in which an error log will be created.
- The success directory <SUCCESS_LOGPATH> defines the location in which a success log will be created and maintained.
- The identifier for the workflow map <WFID> defines the workflow to be invoked if the trigger clause is met. The identifier can be found in the status line of the browser when the mouse is moved over the workflow map name. The identifier remains consistent across versions of the workflow map, but it is important that each version of the workflow map is launched at least once into the initiation screen before being used by Workflow Trigger.
- The workflow category <WFCATEGORY> is the category that is defined on the workflow, and it must match the category name defined in the XML Content file. The Category is identified by name rather than its identifier to permit portability between repositories with the same category definition. Note: Export and Import category can be used to ensure that the same category and attribute details are present in more than one repository.
- The trigger <TRIGGER_CLAUSE> is used to define the attribute name <ATTR_NAME>, not the attribute display name, and the attribute value <ATTR_VALUE> that will cause the workflow to be triggered. If the attribute does not exist in the XML Content file or the value does not match the trigger clause then no workflow instance will be triggered by the Workflow Trigger Execution engine.
- A selected user can be notified when the <SET_EMAIL_FLAG> has the value of ON.
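As an illustration of how the engine might evaluate the trigger clause, the following Python sketch matches a configuration fragment against an XML Content file using the element names from the samples in this section (the wrapper element and function name are the author's own):

```python
import xml.etree.ElementTree as ET

# simplified fragment of the configuration for one source name
CONFIG = """<TRIGGER>
  <WFCATEGORY>CategoryLookup_Cat</WFCATEGORY>
  <TRIGGER_CLAUSE>
    <ATTR_NAME>CatLookUp_Attr</ATTR_NAME>
    <ATTR_VALUE>1</ATTR_VALUE>
  </TRIGGER_CLAUSE>
</TRIGGER>"""

# XML Content file as supplied by the third-party application
CONTENT = """<KW_WF_TRIGGER>
  <CATEGORIES>
    <CATEGORY NAME="CategoryLookup_Cat">
      <CatLookUp_Attr>1</CatLookUp_Attr>
      <CatEntry_Attr>This is the text</CatEntry_Attr>
    </CATEGORY>
  </CATEGORIES>
  <SOURCE>ONE</SOURCE>
</KW_WF_TRIGGER>"""

def clause_met(config_xml, content_xml):
    cfg = ET.fromstring(config_xml)
    category = cfg.findtext("WFCATEGORY")
    attr = cfg.findtext("TRIGGER_CLAUSE/ATTR_NAME")
    value = cfg.findtext("TRIGGER_CLAUSE/ATTR_VALUE")
    for cat in ET.fromstring(content_xml).iter("CATEGORY"):
        # the category is matched by name, not by identifier
        if cat.get("NAME") == category and cat.findtext(attr) == value:
            return True
    # missing attribute or non-matching value: no workflow is triggered
    return False
```

If the attribute is absent from the content file, or its value differs, the function returns False and no workflow instance is started, exactly as described for <TRIGGER_CLAUSE> above.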
The XML Content file provided by the third-party application will include the workflow category and the attribute names that will be represented as tags in the XML. Below is a sample XML Content file, from the third-party application, which would match with Workflow Trigger Configuration file (KWXML.XML) defined earlier:
<KW_WF_TRIGGER>
<CATEGORIES>
<CATEGORY NAME="CategoryLookup_Cat">
<CatLookUp_Attr>1</CatLookUp_Attr>
<CatEntry_Attr>This is the text</CatEntry_Attr>
</CATEGORY>
</CATEGORIES>
<SOURCE>ONE</SOURCE>
</KW_WF_TRIGGER>
Note:
- The source name, category name and attribute are consistent between the two samples.
- Only attributes that have been defined as editable in the initial step of the workflow will be updated. Any attributes that are omitted from the XML Content file will be assumed to be null.
2.2.5.2 Testing Workflow Trigger
To ensure that Workflow Trigger has been configured correctly, a test harness is provided that allows the injection of XML into the Workflow Trigger Web Service. The XML Content file will then be created, and the Workflow Trigger Execution engine will run and trigger the relevant workflow.
The test harness consists of a form (TestXML.asp) that can be launched in a browser and a sample XML file. Both are located in the Workflow Trigger Web Service directory (WEBSERVICES\WF_XML_TRIGGER).
To prepare to use the test harness there are several steps:
- Edit TestXML.asp to define the location of the Workflow Trigger Web Service by editing the WEB_SERVICE_URL.
Const WEB_SERVICE_URL = "http://localhost/WebServices/ProcessWFTrigger.asp"
- The ProcessWFTrigger.asp is the page that will receive the XML input from the third-party application and process it into the relevant XML Content directory. The Workflow Trigger Web Service fulfilling the request acts as an anonymous user (IUSR_<Computer Name>) to the web server and must have permissions to write to the XML Content path.
- Ensure that a workflow has been created based upon a category and that the appropriate changes have been made to the Workflow Trigger Configuration file to include the workflow identifier, category name, attribute name and attribute value.
- Create a sample XML Content file for the external application based on the sample file SampleKWXMLFormat.xml (see below).
<KW_WF_TRIGGER>
<CATEGORIES>
<CATEGORY NAME="CategoryLookup_Cat">
<CatLookUp_Attr>1</CatLookUp_Attr>
</CATEGORY>
</CATEGORIES>
<SOURCE>ONE</SOURCE>
</KW_WF_TRIGGER>
- Replace the category name with that used by the workflow and defined in the Workflow Trigger Configuration file, ensuring that at least the attribute for the trigger clause is included in the sample XML file.
Note - workflow attributes will not be updated unless they have been defined as editable.
To run the test, launch TestXML.asp in a browser, paste the sample XML into the input dialogue, and observe the output dialogue. Examine the error and success log files defined in the Workflow Trigger Configuration file.
In the repository, examine the workflow actions and instances from Admin to see that the workflow has been triggered successfully.
2.3 Workflow
The workflow designer and engine include several built-in options that cover some of the common customizations. Whilst this built-in functionality is free, it is important to understand something about its implementation.
2.3.1 Generate Workflow Title
A frequent requirement is to name a workflow instance based upon some attribute populated in the workflow initiation screen. Previously this requirement involved the creation of a custom form and handler for the initial step of the workflow, but now a customization is only required if a unique sequence number from an internal or external database must be included in the workflow instance name.
The workflow designer is responsible for defining the rules for the workflow instance name, but the name is only generated when the workflow instance is created.
The first stage, in the workflow designer's workflow properties page, is to indicate that workflow instance name generation is required; this is achieved by setting the property “Rule based Instance Name” to true.
The workflow designer must then specify the rule for building the workflow instance name by selecting the button for the Instance Name field. This causes the Instance Name Generator dialogue to be displayed.
The workflow instance name can consist of:
- Workflow Title.
- System Date.
- System Date and Time.
- Initiator First Name.
- Initiator Last Name.
- Initiator Login Name.
- Workflow Instance Identifier.
- Separator characters added by the designer.
- Text added by the designer.
- One or more attributes from the workflow category.
The Data Field is used to select from the available fields, including attributes, and each is added to the Rule For Instance Name using the + button. Note that in the Rule For Instance Name field the selected data fields are represented by an internal name or the attribute name, not the attribute display name.
Text can be added directly into the Rule For Instance Name field, but it should not be added inside the Data Field entries, which are surrounded by square brackets ([]), and it must not include square brackets ([]) or double hyphens.
It should be noted that the instance name is only generated at the initiation of the workflow, when the initiate button is pressed, so including attribute fields that are not populated at the initiation stage will add little value to the workflow instance name.
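As an illustration of how such a rule might be expanded at initiation, the following Python sketch substitutes bracketed data fields with their values; the exact internal rule format and the field names shown are assumptions:

```python
import re

def expand_rule(rule, fields):
    """Replace each [name] data field with its value at initiation time;
    attributes not populated at initiation expand to an empty string."""
    return re.sub(r"\[([^\]]+)\]",
                  lambda m: str(fields.get(m.group(1), "")),
                  rule)

# hypothetical rule combining the title, an attribute and the initiator login
name = expand_rule("[WF_TITLE]-[refno]-[INITIATOR_LOGIN]",
                   {"WF_TITLE": "Invoice Approval",
                    "refno": "254125/25P",
                    "INITIATOR_LOGIN": "psmith"})
```

An attribute missing at initiation simply contributes nothing to the name, which is why unpopulated fields add little value to the rule.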
It is the workflow handler page, and not the FlowServer COM component, that performs the calculations to define the workflow instance name. If any program-only attributes are included in the workflow instance name, then the programmer should create a custom handler that sets these data values before the call to SetWFNameByRule.
Changes after the call to SetWFNameByRule, or at subsequent tasks in the workflow, require the programmer to use ReadWFInstanceProps to get the current workflow instance name based upon the workflow instance identifier, perform any modification to the string, and then use SetWFInstanceProps with the workflow instance identifier. The programmer must ensure that, if the workflow instance name is changed after initiation, the task performer has sufficient permissions to change the workflow instance properties.
2.3.2 Generate Letter
It is often necessary to generate a letter or some other correspondence at a workflow task and functionality has been incorporated into the application that allows the automatic creation of a letter using a Microsoft Word™ document with bookmarks. The Microsoft Word™ document can be one of a number of documents held in the repository for which the task performer has been granted access.
Each Microsoft Word™ document, used as a template, will have within it bookmarks that are to be populated with data from the workflow attributes. When the document is added to the repository it will be assigned a category that includes at least two attributes:
- An attribute to define the usage or purpose.
- An attribute called bookmark_mappings that will define the relationship between a bookmark in the document and a workflow attribute, based upon the attribute name.
The usage or purpose attribute in the category assigned to the document provides the means to identify the relevant documents that should be associated with a workflow task. This attribute will allow the workflow designer to provide some filtering on the documents that are listed. For example, there might be documents that relate to sales enquiries and separate documents that relate to invoice management in the repository, but the sales enquiry process would not need visibility of the invoice documents.
The bookmark_mappings attribute is a multi-line text entry field with a data type of varchar (1024). Each line in the multi-line entry field would have an entry like:
<MyBookmark>:<myattribute>
Where <MyBookmark> is the name of a bookmark that has been placed in the Microsoft Word™ document in the appropriate location with the appropriate formatting and <myattribute> is an attribute name in the workflow category that will be used to populate the bookmark with data. It is important to note that <myattribute> is the internal attribute name, not the attribute display name, for the data field in the workflow category. The same attribute name may be used against more than one bookmark, if this is deemed appropriate, but bookmark names in Microsoft Word must be unique to the document. For example, you might want the company name to appear twice in the Microsoft Word document and would need to create two bookmarks (COMPANY1 and COMPANY2), but you could associate the same attribute name from the workflow category against each.
An important note for programmers / workflow designers is that the bookmark_mappings attribute is limited to 1024 characters therefore it is recommended that you keep your Microsoft Word™ bookmark names and internal attribute names short. It would be a shame to be unable to populate all the fields in a form or letter because the bookmark names or attributes names were too long.
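As an illustration, the following Python sketch parses a bookmark_mappings value of the form described above into a bookmark-to-attribute map and resolves it against workflow attribute data (the bookmark and attribute names are examples only; the angle brackets in the format above denote placeholders and are not literal):

```python
def parse_mappings(bookmark_mappings):
    """Parse the multi-line bookmark_mappings attribute value; each line
    takes the form bookmark:attribute (the internal attribute name)."""
    pairs = {}
    for line in bookmark_mappings.splitlines():
        line = line.strip()
        if not line or ":" not in line:
            continue
        bookmark, attr = line.split(":", 1)
        pairs[bookmark.strip()] = attr.strip()
    return pairs

# two bookmarks backed by the same workflow attribute, as in the example above
mappings = parse_mappings("COMPANY1:custname\nCOMPANY2:custname\nREF:refno")
workflow = {"custname": "Matthew Hall Limited", "refno": "254125/25P"}
values = {bm: workflow[attr] for bm, attr in mappings.items()}
```

Because bookmark names must be unique within the Word document while attribute names may repeat, the mapping is keyed by bookmark, not by attribute.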
In the Workflow Designer the Generate Letter functionality is part of the Custom Items tab that is available on any task except the initiation task.
Selection of the Generate Letter button presents the dialogue to define the criteria for selecting documents that the user has access to in the repository. The criteria are based upon the category that was assigned to the Microsoft Word™ document when it was added to the repository and the attribute value that defines the document(s) to be used by this workflow task.
In the following illustration you will see that the task will be looking for documents that have the category Template Docs and an attribute called ‘lookupattr’.
The Operator field lists the valid operators dependent upon the data type that was assigned to the attribute name.
To define the Value field for the ‘lookupattr’ the workflow designer must either select the button to be provided with a list of possible values or type an entry into the field. It is advisable to type data into the entry field only when there are no values available.
The workflow designer must select the criteria for each task that uses Generate Letter; no assumptions are made about the criteria being the same between tasks.
The task performer in a workflow instance will be unaware of the preparation work that has been carried out by the document creator and workflow designer. The Generate Letter function presents a button on the task screen during workflow execution which, when pressed, allows the performer to select a Microsoft Word™ template document from a list of documents that meet the workflow designer's criteria. The programmer should note that the action button Generate Letter causes the task to be updated so that data and comments are written to the database and file store. Action buttons normally cause the function to be set to task done (wf.taskdone), but the action Generate Letter sets the function to wf.generatetaskletter; this function shares the same case statement in the workflow handler as update (wf.taskupdate).
When selecting the Microsoft Word™ template document the performer is presented with the ability to select multiple documents from the available documents list. If the wrong document is selected, then the performer may remove the document using the arrow buttons to move selected documents between two lists: available documents and selected documents.
When the template document is selected, using the Submit button, the workflow engine launches the document inside Microsoft Word™ on the performer's machine and automatically populates any bookmarks in the template document with the corresponding information contained in the workflow attribute fields. This assumes that the performer has Microsoft Word™ installed.
The task performer will see that the bookmarks in the Microsoft Word™ documents have been pre-populated with data from the workflow instance. The task performer is then free to add text to the document but should avoid changing the content of bookmarks. The task performer can carry out the normal actions associated with any other Microsoft Word™ document.
With Office Integration installed on the performer's machine, the performer may use Save Document to cause the generated letter to be saved and automatically attached to the workflow task from which it was generated. Using Version Document will allow the attached document to be versioned, should the task performer wish to make further changes.
Both Save Document and Version Document utilise the KWGendoc virtual directory and the FlowServer COM component to transfer the document back to the relevant workflow task.
The task performer, upon returning to the workflow task, should refresh the task information by using the right mouse button in the task and selecting refresh. The attachment that has just been saved will then be visible.
The developer may wish to create a custom handler to change the generate letter functionality to provide a calculated attribute value or to merge multiple documents into a single document.
2.3.3 Select Performer
The workflow designer typically defines the performer of a task when the workflow map is created but there are some instances where the performer can be assigned based upon the selection made by the initiator or a relationship to the initiator.
In a typical workflow, the task is assigned to a group and the first member of the group that opens the task is expected to complete it. Should the performer need to be reassigned then the workflow manager would open the task list and make the appropriate adjustments to the workflow instance.
The most frequent customizations to the performer requested in the previous version were to make the task:
- Return to the group.
- Have the same performer as a previous task in the workflow instance.
- Set the performer in relation to the initiator of the workflow instance.
All these requirements have been implemented in the current version, and developers will therefore receive fewer requests to set the performer of a task.
The most significant change is that of setting the performer based upon the relationship to the initiator (see description in Workflow Designer Guide). The relationship to the initiator assumes that there is some information associated with the initiator that can be utilised.
It is assumed that the user has some category information associated with their user details that holds the relationship information. This involves creating a category with the scope of user, defining its attributes, and then assigning it to the relevant users of the repository. This is not a difficult task when the user community is small, but it becomes more onerous as the user community grows.
A developer may wish to write a simple script, and this is possible, but it is important to understand the database relationships that are being established:
- Each user has an identity (User_id).
- Each user has a personal space that causes an entry to be made in the node table.
- The user category identifier is allocated against the user’s personal space in the node table.
- The user category is a separate user defined table located through the category table.
- A row must be defined for each user who has the user category assigned.
- The user category will typically have an attribute that relates to a user list, such as for their manager, and this must have an appropriate user identifier defined.
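Under the assumption that the schema follows the relationships listed above, a script might resolve the initiator's manager along these lines. All table and column names below are illustrative assumptions, not the actual repository schema:

```python
import sqlite3

# Illustrative sketch only: table and column names are assumptions based on
# the relationships described above, not the real KnowledgeWorker schema.
db = sqlite3.connect(":memory:")
db.executescript("""
-- each user's personal space is a row in the node table
CREATE TABLE node (node_id INTEGER PRIMARY KEY, user_id INTEGER, category_id INTEGER);
-- the user category is a separate user-defined table; one row per assigned user
CREATE TABLE cat_user_details (node_id INTEGER, manager_user_id INTEGER);
""")
# user 42's personal space (node 100) carries the user category (id 7)
db.execute("INSERT INTO node VALUES (100, 42, 7)")
# the category row names user 99 as the manager (a user-list attribute)
db.execute("INSERT INTO cat_user_details VALUES (100, 99)")

def manager_of(initiator_user_id):
    """Resolve the 'relationship to initiator' value for a workflow performer."""
    row = db.execute(
        "SELECT c.manager_user_id FROM node n "
        "JOIN cat_user_details c ON c.node_id = n.node_id "
        "WHERE n.user_id = ?", (initiator_user_id,)).fetchone()
    return row[0] if row else None  # None -> fall back to the default performer
```

A bulk-assignment script would follow the same joins, inserting one `cat_user_details` row per user and stamping the category identifier onto each personal-space node.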
The relationship to initiator functionality can be built on an attribute that is not a user list; for example, it could be a look-up on another category, such as a data category, but the value returned by the field must relate to a user identifier. In addition, the default performer name must be defined and related to a username in the format <first name> <last name>. The default performer is used when the initiator does not have the category or attribute set, and must always be defined as a valid user.
When testing the relationship to initiator it is worth noting that the performer is not calculated until the task becomes ready.
2.3.4 Email Notification
Email notifications are often required to prompt users who are not normally active in the repository to undertake actions, or to make them aware of events in the repository. There is provision to define an email notification, if required, based upon an event such as Task Complete or Overdue.
The recipient of the email notification can be based upon:
- Active Performer / User.
- A defined User.
- A defined Group.
The sender of the email notification can be based upon:
- Active Performer / User.
- A defined User.
For both the recipient and the sender of the email notification, the user or group is selected via the “…” button, which opens the user / group selection dialogue.
For the email notification, the workflow designer defines the subject and message text through a dialogue launched from the button adjacent to each field. This dialogue, called the Email Text Generator, is used to define the text and allows the user to build a complex expression that can consist of any one or more of:
- Workflow Instance Title.
- Task Title.
- System date.
- System date & time.
- Active Performer First Name.
- Active Performer Last Name.
- Active Performer Login Name.
- Workflow Instance Identifier.
- Task Identifier.
- Task Tag.
- Separator characters and / or text added by the designer.
- One or more attributes from the workflow category.
The Data Field in the Email Text Generator is used to select from the available fields, including attributes, and each is added using the + button. It is worth observing that the selected data fields are represented by an internal name or the attribute name, not the attribute display name.
Text can be added directly into the field, but it must not be placed inside the Data Field entries (which are surrounded by square brackets) and must not itself include square brackets ([]) or double hyphens.
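Conceptually, the generated expression is a template in which each bracketed data field is replaced at send time. A sketch of that substitution (the field names shown are illustrative, not the real internal names):

```python
import re

def expand(template, fields):
    """Replace each [Field Name] token with its value; plain text is kept as-is."""
    return re.sub(r"\[([^\]]+)\]",
                  lambda m: str(fields.get(m.group(1), "")), template)

# hypothetical values drawn from a workflow instance at send time
fields = {
    "Workflow Instance Title": "Invoice Approval 1234",
    "Task Title": "Approve Invoice",
}
subject = expand("Overdue: [Task Title] ([Workflow Instance Title])", fields)
```

This is why literal square brackets and double hyphens are disallowed in designer-added text: they would be indistinguishable from field tokens and separators.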
All email notifications are handled by the task escalation executable, which looks for the event being triggered.
2.3.5 Escalation
Escalation is another specific customization that has been requested by many clients who want to have handling that can be related to a task becoming overdue. The types of escalation that are available include:
- Email Escalation - sends an email notification when the task becomes overdue.
- Task Redirection - to another user / group, reassigns the task when it becomes overdue.
- Task Completion - automatically processes the task as complete when it becomes overdue.
All types of escalation are handled by the task escalation executable, which looks for the overdue event.
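The three escalation behaviours can be summarised as a simple dispatch on the escalation type once a task is overdue. The sketch below is illustrative only; the field and function names are assumptions, not the task escalation executable's actual interface:

```python
def escalate(task, escalation_type, escalation_data=None):
    """Dispatch one of the three overdue-escalation behaviours (sketch)."""
    if not task.get("overdue"):
        return "no action"            # escalation only fires on overdue tasks
    if escalation_type == "email":
        return f"email sent to {escalation_data['recipient']}"
    if escalation_type == "redirect":
        task["performer"] = escalation_data["performer"]  # user or group
        return "task reassigned"
    if escalation_type == "complete":
        task["status"] = "complete"   # no Escalation Data needed for this type
        return "task completed"
    raise ValueError(f"unknown escalation type: {escalation_type}")
```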
For Task Completion on escalation, the workflow will automatically move on to the next task. It is the workflow designer's responsibility to ensure that any conditions that could result in error are resolved. For Task Completion on escalation there is no Escalation Data to be defined.
For Task Redirection on escalation the workflow designer must choose a Performer Type (user or group) and make a selection from the User and Group dialogue.
For Email Escalation the dialogue is similar to that for the email notification, with the email sender, recipient, subject, and body all being defined by the workflow designer. The email notification Subject and Body are defined from any one or more of:
- Workflow Instance Title.
- Task Title.
- System date.
- System date & time.
- Active Performer First Name.
- Active Performer Last Name.
- Active Performer Login Name.
- Workflow Instance Identifier.
- Task Identifier.
- Task Tag.
- Separator characters and / or text added by the designer.
- One or more attributes from the workflow category.
3. Architecture
The application architecture is based upon Microsoft Windows technology, although the vaults and the SQL database may run on alternative platforms. For example, the SQL database can be Oracle running in a Unix environment, or the vaults can be on any file system accessible from the Windows environment.
The components of the application architecture are described in the sections that follow.
Users have three distinct interfaces into the application:
- Their browser for full functionality.
- Microsoft Office Applications to search, open, save, version and check-out / in documents.
- Their browser for search and retrieval functionality from an external site.
The application, consisting of the web server(s), COM components, SQL database, and file storage (vaults), can be deployed on a single server. Alternatively, to produce a more scalable application, components can be distributed across multiple servers; for example, the web server(s) and COM components on one server, the SQL database on a second server, and file storage on a third. Typically, the first component to be placed on a separate server is the SQL database, as this allows for appropriate tuning to be carried out.
The application relies upon the session token from the IIS server; any duplication of the application server therefore requires users to be targeted at a single server, or the implementation of load balancing technology that supports sticky sessions.
3.1 User Interfaces
3.1.1 The Browser
The browser is the primary interface through which functionality is delivered to the user community. The user interacts with the interface pages on the web site that are located in the web/scripts directory. The interface pages interact with the handler pages of the web site, located in the web/handlers directory, and these interact with the COM component services that communicate with the repository database and vaults. The handler page typically then redirects the user to the relevant interface page.
The vast majority of the functionality is delivered to the browser using HTML, JavaScript and Cascading Style Sheets. The Workflow Designer and Workflow Status map are delivered as Java Applets and there are a small number of search results that have the option to be delivered as XML.
3.1.2 Office Applications
The Microsoft Office™ applications (Word, Excel, PowerPoint, and Outlook) can also be used to interact with the repository. These are implemented as add-in components to the Microsoft Office™ applications:
- Office Integration for Word, Excel, and PowerPoint.
- Outlook Integration for Outlook.
The add-ins can be deployed onto machines running Microsoft Office 2000 SP3 and Microsoft Office 2002 SP3. The components have been separated because, although the Microsoft Office suite is frequently used, not all clients utilise Microsoft Outlook as their email client.
The Office Integration add-in provides the user with the ability to upload, version and check-in documents directly from their Microsoft Office applications. In addition, the user may search and view documents that are in the repository for which they have access.
The Office Integration add-ins communicate with the KWOffice virtual directory, using XML to transfer the data to the pages. The pages in this virtual directory construct a transient session for the user and then interact with the COM components that communicate with the repository database and vaults. The pages then provide the response in XML format before tearing down the user's transient session.
This Office Integration interface is useful for users whose primary function is to publish documents into locations in the repository, as it improves productivity. The user can still use the manual approach of saving the document to file and then launching a browser to upload the file.
The Outlook Integration add-in provides the user with the ability to move or copy active messages into the repository and navigate the repository without leaving their Microsoft Outlook application.
The Outlook Integration add-in communicates with the OutlookWSDL virtual directory and uses the COM components that communicate with the repository database and vaults.
Both interfaces provide access to the same areas that the user would have access to through their browser interface including Personal Workspace, Library folders and Projects.
3.1.3 Search and Retrieval
This is the External Service, described earlier in this document. It provides the ability for search pages that are part of an external web site, such as an Intranet, Internet, or Extranet site, to perform searches that gain access to documents held inside the repository.
The External Search provides:
- Content Search Functionality.
- Category Search Functionality.
- Document Retrieval.
3.2 Server Components
3.2.1 Interface Virtual Directory
The main virtual directory contains the pages that provide the user interface and handling of user requests. These are Active Server Pages written in VBScript with a generous helping of HTML and JavaScript.
The pages are separated into several folders within the virtual directory.
The following table describes the typical pages that can be found in each folder.
Folder | Typical Functionality of Pages
Cab | Used for downloadable packages, such as those that might be made available to Microsoft Word™ to permit the user to save back a document.
ClientScripts | Commonly used JavaScript functions, including the context menus and calendar controls.
Css | The common style sheets.
CustomForms | For developers to place their form customizations.
CustomHandlers | For developers to place their handler customizations.
CustomHome | For developers to place their home page customizations.
CustomIncludes | For developers to place their include customizations.
CustomWFForms | For developers to place their workflow form customizations.
DBAdmin | Administrative pages and database scripts, including those to create a repository.
Functions | ASP support functions that may be included in other interface pages.
Handlers | Pages to handle the results of form posts / gets and communicate with the COM components.
Images | Images used to brand the interface and the context menus.
Includes | Common include files and constants that may be included in other pages.
Java | Java and ASP code used to present the workflow designer and status map.
Reports | The administration of custom home page entries.
Scripts | User interface pages that provide the functionality to users.
Search | The Item Content Search functionality.
Timeimages | Contains the subdirectory used for the images for the default MIME types.
Xml | The XML held in this directory determines the languages that are supported and the page text for each language. The content of this directory must have modify permissions if ‘Manage Page Text’ from the System Administration menu is to be used to make changes to the supplied text.
Note - Several directories have been supplied explicitly for customizations. Developers are strongly advised to utilise these directories and keep changes in other directories to a minimum, as this will reduce the work required when new versions of the application are released.
3.2.2 COM Components
The application makes use of the Microsoft COM+ Component Services functionality and when the application is installed a COM package is created for the application and the following component files are added.
The COM Package that is created has an identity defined. This identity is a user, and it is this user that controls access to the database and the vaults. When there is more than one repository the same identity / COM components will be used for all repositories.
It is important to understand that if the user account defined as the identity for the COM Package is disabled or the password is changed then this will cause the application to malfunction.
The following table defines the functionality contained within each of these files.
Component File | Functionality
DataSavedSearch.dll | Used by the Saved Search functionality to produce the XML results.
FlowServer.dll | The workflow engine.
KCFile.dll | Handles the HTTP stream for passing file and form data together.
KWEXTENSION.dll | Used to support login functionality when integrating with other systems, such as CMS.
KWSession.dll | Used for session management and login.
OutProcessWSEvent.dll | A workflow external event handler to trigger communication to a web service.
RegDatum.dll | Handles licence management.
Webwerx.dll | Handles the main functionality for document management, project collaboration, and system administration.
WF_TRIGGER.dll | Supports the Workflow Trigger functionality.
WWFSO.dll | Handles file system communications; used when the file system object is likely to be on another server.
WWVAULT.dll | Used to support the vault management software.
These component files provide component objects that are listed in the COM package. Selecting a component object and expanding it will reveal the many interface methods that are in the component object. A full description of all these interface methods is not included in this document but examples of use can typically be located in the Active Server Pages of the virtual directory.
The following table outlines the functionality in each of the component objects and the types of interface methods each contains.
Component Object | Description
DataSavedSearch.CDSavedSearch | Used to produce the XML results for data, media, and workflow saved searches.
OLE File Property Reader Class | Used to read the properties of a file; provides the support for fast checkout and generate letter.
FlowServer.Utils | Holds the generic utilities for FlowServer; only the Login method is made public.
FlowServer.WorkflowInstance | Used to read and process instance data associated with workflows.
FlowServer.WorkflowMap | Used to get the workflow map file data.
KCFile.FileUpload | Used to handle stream data through the HTTP server, including uploading file and form data together.
KWEXTENSION.Emails | Email notification message construction with multilingual support; mainly used by the notify, notify interests, and vault manager executables.
KWSession.sess | Session management functionality, including user validation, checks for available licences, and the session integration for CMS.
OutProcessWSEvent.OutProcess | Used for the sample workflow event handler that communicates with a Web Service. The methods include those for each of the event stubs.
RegApp.RegDatum | Used to handle presentation of the serial number and the application of the unlock code.
WebWerx.Admin | Used to read the audit information from the audit table.
WebWerx.Attribute | Used to add, delete, list, read, and set attribute data associated with the Attribute table.
WebWerx.Category | Used to handle the category (via the category table) and retrieve the attribute data, mainly through the AttributeDetail view.
WebWerx.DirServices | Used as part of the implementation of the LDAP module.
WebWerx.Discussions | Used to list and delete forum / discussion items.
WebWerx.Document | Used to manage the document information, usually held in the Document table. This includes adding versions, check-in / out, and version references.
WebWerx.Form | Handles forms, category, and attribute information; mainly unused.
WebWerx.FSProvider | Handles the methods required to support the File Storage provider; only Windows NTFS is supported at this time.
WebWerx.Group | Group management, including groups and their membership. Groups are held within the node table and can be identified by a node_type of 12. There is no concept of groups within groups in the application, as carried to an extreme this can adversely affect system performance.
WebWerx.Node | Used for all activities that involve the node, based on the node table: adding, deleting, and listing / browsing nodes; managing associated values (properties, permissions, category, and extra node information) and related items (tagged nodes and associations). For browsing and a number of the node functions it is important to understand the node_type and node_subtype.
WebWerx.NVProvider | Used to manage document versions where replacement is required.
WebWerx.Project | Handles projects (add, delete, and list). Projects (node_type=7) are held in the node table and consist of folders (node_type=5) and groups (node_type=12). The components of a project (items, forums, news, tasks, and groups) are all associated to the project by information held in the ExtraNodeInfo table.
WebWerx.Query | Category query methods.
WebWerx.Schedules | Methods to support the use of calendars for task due date calculations.
WebWerx.SessionManager | Session management functionality, including licence token management.
WebWerx.Thread | Used to manage the discussion threads.
WebWerx.User | Handling for user details and the user home page information.
WebWerx.Utils | A number of public utility methods to handle strings, connection strings, recordsets, and server elements such as the virtual directories and web server.
WebWerx.Vault | Handles all the activities that relate to the vaults associated with the repository.
WF_TRIGGER.Trigger | Handles the functionality for the workflow trigger executable.
WWFSO.Explore | Used in relation to file system objects so that there are no problems when the vaults are on a remote server.
WWVAULT.AutoVault | Used by the Vault Manager executable to manage the vaults.
Note - All methods assume that input validation has been completed; any customizations that make use of these methods should therefore ensure that only validated input is supplied.
3.2.3 Using COM Methods
This document does not cover in detail all interface methods or provide a description for each of the methods. Instead, it provides a sample of the methods used to read and write data to the repository. The interface methods can be divided into the following groups based upon their action on the information in the database tables:
- Add.
- Set / Update.
- Get / Read.
- List.
- Delete.
The interface methods that fall into these groups have similar parameter requirements when they are utilised from any of the component objects. The following sections provide, for each of these groups, examples that will help a developer understand how to utilise these interface methods.
3.2.3.1 Adding Data using Interface Methods
Those methods that are used for adding data require the column data that is to be added to the relevant table, followed by three parameters:
- An identifier that, if successful, will contain the identifier for the newly created record.
- The session information for the user invoking the add method.
- Status information to report the outcome.
The session information allows the connection to the relevant repository and a check to be made that the user has the relevant permissions to carry out the requested add. The status parameter reports the outcome and will, if the method fails, report the reason.
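The calling convention above can be sketched language-neutrally. The stand-in below is purely illustrative (the real components are COM objects invoked from ASP/VBScript, and the names here are hypothetical); it shows the pattern of supplying the column data, receiving a new identifier, and checking the status output:

```python
# Hypothetical stand-in for an Add method: column data first, then the
# out-parameters (new identifier, status), with the caller's session checked.
_records = {}
_next_id = [1]

def add(name, description, session):
    """Sketch of the Add pattern; returns (newID, status) as the COM out-params."""
    if "add" not in session["permissions"]:
        return None, "failure: insufficient permissions"
    new_id = _next_id[0]
    _next_id[0] += 1
    _records[new_id] = {"name": name, "description": description}
    return new_id, "success"

session = {"user": "jsmith", "permissions": {"add"}}
cat_id, status = add("Contracts", "Contract metadata", session)
```

In the real ASP pages the identifier and status are ByRef Variant parameters, so the caller inspects them after the call rather than receiving a return tuple.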
Here is a sample of the Add methods, including the location of an example of their use:
WebWerx.Attribute:
Definition:
Add (categoryID As Long, Name As String, displayName As String, inputType As Integer, dataType As String, formatValues As Variant, defaultValue As Variant, mandatory As Boolean, Description As String, newAttributeID As Variant, session As Variant, status As Variant)
Example: In the case statement cat.addattr in Handlers\Category.asp
Note: The category identifier is included because the method will also update the CategoryAttribute relationship table.
WebWerx.Category:
Definition:
Add (categoryName As String, Description As String, newCategoryID As Variant, session As Variant, status As Variant)
Example: In the case statement cat.add in Handlers\Category.asp
WebWerx.Group:
Definition:
Add (Name As String, groupType As Integer, Description As String, newGroupID As Variant, session As Variant, status As Variant)
Example: In the case statement group.add in Handlers\Usergroups.asp
Note - The group type is set to zero (0) for normal groups and only changed for project groups.
WebWerx.Node:
Definition:
Add (objType As Long, parentID As Long, objName As String, objNewID As Variant, session As Variant, status As Variant)
Example: In the case statement folder.add in Handlers\node.asp
Note:
- The ObjectType relates to the node_type that will be allocated in the Node table. Some of the node types are for internal use only (marked with an *) but have been included in this list so that any customization avoids the predefined node types:
Root Node * = 0
Library Node * = 1
System Folder Node * = 4
Folder Node = 5
Document Node = 6
Project Node * = 7
Discussion Node = 8
Topic Node = 9
Reply Node = 10
User Node = 11
Group Node = 12
Book Node * = 14
Saved Category Search Node = 15
Vault * = 16
Text Note Node = 17
URL Node = 18
Workflow map Node = 20
Workflow instance Node * = 21
Workflow Instance Task Node * = 22
News Channel Node = 30
News Node = 31
Workflow Comment Node = 32
Action Map Node = 33
Email Node = 40
Email attachment Node = 41
Record Node = 42
Data Search Node = 43
Workflow Saved Search Node = 44
Calendar Root Node * = 100
Calendar Node = 101
Media Root Node * = 200
- The ParentID relates to the node identifier for the parent folder / location in the repository; this node identifier can be located with a read or a list / browse.
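The node type values above can be captured as a lookup table. The values below are taken directly from the list, with the internal-use flag (*) recorded alongside; a sketch:

```python
# Node type values from the list above; the boolean marks internal-use-only (*).
NODE_TYPES = {
    "Root": (0, True), "Library": (1, True), "System Folder": (4, True),
    "Folder": (5, False), "Document": (6, False), "Project": (7, True),
    "Discussion": (8, False), "Topic": (9, False), "Reply": (10, False),
    "User": (11, False), "Group": (12, False), "Book": (14, True),
    "Saved Category Search": (15, False), "Vault": (16, True),
    "Text Note": (17, False), "URL": (18, False),
    "Workflow Map": (20, False), "Workflow Instance": (21, True),
    "Workflow Instance Task": (22, True), "News Channel": (30, False),
    "News": (31, False), "Workflow Comment": (32, False),
    "Action Map": (33, False), "Email": (40, False),
    "Email Attachment": (41, False), "Record": (42, False),
    "Data Search": (43, False), "Workflow Saved Search": (44, False),
    "Calendar Root": (100, True), "Calendar": (101, False),
    "Media Root": (200, True),
}

def node_type(name):
    """The node_type value to pass as objType in WebWerx.Node.Add."""
    return NODE_TYPES[name][0]

def is_internal(name):
    """True for node types a customization should never create."""
    return NODE_TYPES[name][1]
```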
WebWerx.User:
Definition:
Add (login As String, password As String, baseGroup As Long, privileges As Integer, loginStatus As Integer, firstName As String, middleName As String, lastName As String, newUserID As Variant, session As Variant, status As Variant)
Example: In the case statement user.add in Handlers\Usergroups.asp
Note:
- The privilege values are defined but those with a * are only suitable for users who are also members of the system group:
User and Group Administration = 2
Create Users = 4
Create Groups * = 8
Create Projects = 16
Category Management = 32
Manage User Categories * = 64
Manage Privileges * = 128
- Adding the user is not sufficient: the properties also need to be set for the remaining user details, and the home page reports need to be defined for the user.
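The privilege values above are all powers of two, which suggests they are combined as a bit mask in the privileges parameter (an assumption worth confirming against the database). A sketch:

```python
# Privilege values from the list above; those marked "system group only" in
# the source are flagged in comments. Treating them as bit flags is an
# assumption based on the power-of-two values.
USER_AND_GROUP_ADMIN   = 2
CREATE_USERS           = 4
CREATE_GROUPS          = 8    # system group only
CREATE_PROJECTS        = 16
CATEGORY_MANAGEMENT    = 32
MANAGE_USER_CATEGORIES = 64   # system group only
MANAGE_PRIVILEGES      = 128  # system group only

def has_privilege(privileges, flag):
    """True when the given flag is present in the combined privileges value."""
    return privileges & flag == flag

# combine privileges for a new user with bitwise OR before calling Add
privileges = CREATE_USERS | CREATE_PROJECTS  # 4 + 16 = 20
```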
A detailed description of the WebWerx.Node Add method is below:
Add (objType As Long, parentID As Long, objName As String, objNewID, session, status)
objType | Input | Either a defined or custom object type for the new node.
parentID | Input | The parent node identifier to which the node will be added.
objName | Input | The textual title of the node object that will be displayed in the library.
objNewID | Output | A valid node identifier, if successful.
session | Input | The session token passed from the user interface.
status | Output | Either success or failure; on failure it contains an error identifier.
The method uses the session to check that the user has permission (PERM_MODIFY) to add a node to the parent node object. If the user has permission, a transaction is undertaken to add the node of the defined object type with the defined object name. Only the owner of the new node will have full permissions; there is no concept within the method of permission or category inheritance from the parent node.
3.2.3.2 Set / Update Data using Interface Methods
In most of the component objects the set method is used to update information in a single row in the relevant table, if the user has permissions and privileges. Each set method defines:
- The identifier to update.
- An array of columns in the database to update.
- An array of corresponding values to update.
- The session information for the user invoking the set method.
- Status information to report the outcome.
The array of columns that may be updated is defined by the underlying database table. In the examples given, the table matches the object; for example, the SetProps for the object WebWerx.Attribute relates to the Attribute table.
In some set methods, the functionality only allows the update of a single element, such as shown in the WebWerx.Node SetPermissions method.
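Every SetProps variant takes a pair of parallel arrays: the column names to update and the corresponding values. A small sketch of building that pair (the column names shown are hypothetical and must match the underlying table):

```python
def make_update(changes):
    """Split a dict of column -> value into the two parallel arrays that
    SetProps-style methods take (rgColNames, rgColValues)."""
    col_names = list(changes)
    col_values = [changes[c] for c in col_names]
    return col_names, col_values

# hypothetical Attribute-table columns for a WebWerx.Attribute.SetProps call
cols, vals = make_update({"display_name": "Order Ref", "mandatory": 1})
```

Keeping the two arrays index-aligned is the caller's responsibility; a mismatch updates the wrong column.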
WebWerx.Attribute:
Definition:
SetProps (attrID As Long, rgColNames As Variant, rgColValues As Variant, session As Variant, status As Variant)
Example: In the case statement cat.addattr in Handlers\Category.asp
WebWerx.Category:
Definition:
SetProps (categoryID As Long, rgColNames As Variant, rgColValues As Variant, session As Variant, status As Variant)
Example: In the case statement cat.disable in Handlers\Category.asp
WebWerx.Group:
Definition:
SetProps (groupID As Long, rgColNames As Variant, rgColValues As Variant, session As Variant, status As Variant)
Example: In the case statement group.update in Handlers\Usergroups.asp
WebWerx.Node:
Definition:
SetProps(nodeID As Long, rgColNames As Variant, rgColValues As Variant, session As Variant, status As Variant)
SetPermission(ByVal nodeID As Long, ByVal userGroupID As Long, ByVal permissions As Long, ByVal session As Variant, ByRef status As Variant)
Examples: SetProps in the case statement node.update and SetPermissions in the case statement node.uperm both in Handlers\node.asp
Note:
- The GrantPermissions and CascadePermissions methods tend to be used more than the Set Permissions method.
- Node also has a method for SetCategory but this differs slightly from the other set examples because as well as updating the node it also updates the relevant category table.
WebWerx.User:
Definition:
SetProps(theUserID As Long, colnames As Variant, colValues As Variant, session As Variant, status As Variant)
Example: In the case statement user.add in Handlers\Usergroups.asp
FlowServer.WorkflowInstance:
Definition:
SetWFInstanceProps (wfInstID As Long, rgColNames As Variant, rgColValues As Variant, session As Variant, status As Variant)
SetWFInstanceTaskAudit (wfInstID As Long, wfTaskID As Long, auditText As String, session As Variant, status As Variant)
SetWFInstanceTaskComment (wfInstID As Long, wfTaskID As Long, taskComment As String, session As Variant, status As Variant)
Example: In the case statements in Handlers\workflow2.asp
Note: Workflow Instances often require the task identifier to be provided.
3.2.3.3 Get / Read Data Using Interface Methods
In most of the component objects the Get / Read method is used to read information from a single row in the relevant table, if the user has the permissions and privileges. Each Read method defines:
- The identifier to read.
- An array of columns in the database to be returned.
- An array for the values to be returned in.
- The session information for the user invoking the read method.
- Status information to report the outcome.
In most read methods the identifier is used to define criteria for entries in the database table.
The array of column names for which information is required relates to the underlying database table. In the examples given, the table matches the object; for example, the ReadProps for the object WebWerx.Attribute relates to the Attribute table. When invoking the read method it is possible to specify the value ‘*’ in the first element of the column array to get all the columns from the underlying table, but this lazy approach can harm performance if there are a large number of columns.
Here are a sample of the Get / Read methods including the location of an example of their use:
WebWerx.Attribute:
Definition:
ReadProps(attrID As Long, rgColNames As Variant, rgColValues As Variant, session As Variant, status As Variant)
Example: In Scripts\AttributeInfo.asp
WebWerx.Category:
Definition:
ReadProps(categoryID As Long, rgColNames As Variant, rgColValues As Variant, session As Variant, status As Variant)
Example: In Scripts\CategoryInfo.asp
WebWerx.Document:
Definition:
ReadProps(documentID As Long, rgColNames As Variant, rgColValues As Variant, session As Variant, status As Variant)
Example: In Scripts\DocVerInfo.asp
WebWerx.Group:
Definition:
ReadProps(groupID As Long, rgColNames As Variant, rgColValues As Variant, session As Variant, status As Variant)
Example: In Scripts\GroupInfo.asp
WebWerx.Node:
Definition:
ReadProps(nodeID As Long, rgColNames As Variant, rgColValues As Variant, session As Variant, status As Variant)
ReadPermission(nodeID As Long, permissions As Variant, session As Variant, status As Variant)
ReadExtraInfo(nodeID As Long, attributeName As String, ExtraData As Variant, session As Variant, status As Variant)
ReadCategory(nodeID As Long, rgColNames As Variant, rgColValues As Variant, session As Variant, status As Variant)
Examples: ReadProps can be seen in scripts\NewFolder.asp. ReadPermissions and ReadExtraInfo can be seen in scripts\FolderInfo.asp. ReadCategory can be seen in functions\catForm.asp.
Note - ReadCategory is reading the relevant category table for attribute columns and not the node table, although it does read the node table to identify the category identifier for the node.
WebWerx.User:
Definition:
ReadProps(UserID As Long, colnames As Variant, colValues As Variant, session As Variant, status As Variant)
Example: In Scripts\ChangeUserInfo.asp
FlowServer.WorkflowInstance:
Definition:
ReadWFInstanceProps(wfInstID As Long, rgColNames As Variant, rgColValues As Variant, session As Variant, status As Variant)
Example: In the case statements in Handlers\workflow2.asp
3.2.3.4 List Data using Interface Methods
In most of the component objects the List method is used to read information from one or more rows in the relevant table or view, if the user has the permissions and privileges. Each List method defines:
- A clause that effectively defines the Where clause that will be applied to the database table or view.
- An order by clause that will determine the order.
- An array of columns in the database to be returned.
- An array for the that rows of data will to be returned (a multidimensional array).
- The session information for the user invoking the list method.
- Status information to report the outcome.
The session information allows the connection to the relevant repository and a check to be made that the user has the relevant permissions to carry out the requested list operation. The status parameter reports the outcome and will, if the method fails, report the reason.
Here is a sample of the List methods, including the location of an example of their use:
WebWerx.Category:
Definition:
Attributes (categoryID As Variant, browseOrder As String, clause As String, rgColNames As Variant, rgColValues As Variant, session As Variant, status As Variant)
ReadValues (categoryID As Variant, clause As String, browseOrder As String, rgColNames As Variant, rgColValues As Variant, session As Variant, status As Variant)
List (browseOrder As String, clause As String, rgColNames As Variant, rgColValues As Variant, session As Variant, status As Variant)
Example: Both ReadValues and List can be found in Include\GenFuncs.asp. Attributes can be found in Functions\catform.asp
Note: The Attributes method runs against the view AttributeDetail, so the columns, clause, and order parameters must reflect this.
WebWerx.Group:
Definition:
Find (searchFld As String, searchStr As String, clause As String, browseOrder As String, rgColNames As Variant, rgColValues As Variant, session As Variant, status As Variant)
List (browseOrder As String, clause As String, rgColNames As Variant, rgColValues As Variant, session As Variant, status As Variant)
ListGroupMembers (groupID As Long, memberTypes As Integer, browseOrder As String, rgColNames As Variant, rgColValues As Variant, session As Variant, status As Variant)
ListUserMembers (groupID As Long, membership As Long, browseOrder As String, rgColNames As Variant, rgColValues As Variant, session As Variant, status As Variant)
Example: Find, List and ListUserMembers can be found in Scripts\FindUserGroups.asp and ListGroupMembers can be found in Scripts\GroupInfo.asp.
WebWerx.Node:
Definition:
Browse (nodeID As Long, clause As String, browseOrder As String, rgColNames As Variant, rgColValues As Variant, session As Variant, status As Variant)
Search (nodeID As Long, FilterByPerms As Integer, MaxCount As Integer, TotalCount As Variant, clause As String, browseOrder As String, rgColNames As Variant, rgColValues As Variant, session As Variant, status As Variant)
Example: Browse can be found in Scripts\outofoffice.asp and Search can be found in Includes\LibraryNode.asp.
Note: Browse normally causes the columns of the array to be extended with permission columns so that the caller can permission-process the results. Search is more efficient because the caller receives only the results that match the permissions filter.
WebWerx.Project:
Definition:
List (browseOrder As String, clause As String, rgColNames As Variant, rgColValues As Variant, session As Variant, status As Variant)
Example: Can be found in Scripts\Projects.asp
WebWerx.Threads:
Definition:
ListThreads(parentID As Long, browseOrder As String, rgColNames As Variant, rgColValues As Variant, session As Variant, status As Variant)
Example: ListThreads can be found in Scripts\TopicDetail.asp
WebWerx.User:
Definition:
List(browseOrder As String, clause As String, colnames As Variant, colValues As Variant, session As Variant, status As Variant)
ListHomePageReports(browseOrder As String, clause As String, colnames As Variant, colValues As Variant, session As Variant, status As Variant)
Example: List can be found in Scripts\outofoffice.asp and ListHomePageReports can be found in Scripts\userhome.asp.
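A hedged sketch of calling one of these List methods follows. The clause and order strings are SQL fragments applied to the underlying table or view; the column names and the dimension order of the returned array are assumptions here, so the cited example pages (such as Scripts\outofoffice.asp) should be treated as the reference.

```vbscript
' Hypothetical sketch of listing users via WebWerx.User.List.
' Column names, the clause, and the array dimension order are illustrative
' assumptions; see Scripts\outofoffice.asp for real usage.
Dim oUser, colnames, colValues, status, i
Set oUser = Server.CreateObject("WebWerx.User")

colnames = Array("user_id", "user_name")
oUser.List "user_name ASC", "deleted = 0", colnames, colValues, session, status

' colValues comes back as a multidimensional array, one row per matching user.
For i = 0 To UBound(colValues, 2)
    Response.Write colValues(1, i) & "<br>"
Next
```

Because the clause is applied directly to the table or view named in each method's note, the columns referenced in the clause and order strings must exist in that table or view.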
3.2.3.5 Deleting using Interface Methods
The methods used for deleting data require the identifier of the row to be deleted from the relevant table, followed by the parameters:
- The session information for the user invoking the delete method.
- Status information to report the outcome.
The session information allows the connection to the relevant repository and a check to be made that the user has the relevant permissions to carry out the requested delete. The status parameter reports the outcome and will, if the method fails, report the reason.
Here is a sample of the delete methods, including the location of an example of their use:
WebWerx.Attribute:
Definition:
Delete (categoryID As Long, categoryName As Variant, attributeID As Long, attributeName As Variant, session As Variant, status As Variant)
Example: Can be seen in Handlers\category.asp
Note: The category identifier is included because the method also updates the CategoryAttribute relationship table.
WebWerx.Category:
Definition:
Delete (categoryID As Long, session As Variant, status As Variant)
Example: Can be seen in Handlers\category.asp
WebWerx.Group:
Definition:
Delete (groupID As Long, session As Variant, status As Variant)
DeleteMembers (groupID As Long, members As Variant, session As Variant, status As Variant)
Example: In the case statement group.delete and group.delmems in Handlers\Usergroups.asp
WebWerx.Node:
Definition:
Delete (ByVal nodeID As Long, ByRef session As Variant, ByRef status As Variant)
Example: In the case statement node.delete in Handlers\node.asp
WebWerx.User:
Definition:
Delete (UserID As Long, session As Variant, status As Variant)
Example: In the case statement user.delete in Handlers\Usergroups.asp
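The delete methods are the simplest to call, as the sketch below suggests; it mirrors the shape of the node.delete case in Handlers\node.asp, but the status handling shown is an assumption.

```vbscript
' Hypothetical sketch of deleting a node via WebWerx.Node.Delete.
' Error reporting is assumed to come back through the status parameter,
' which should be checked before any redirect or confirmation message.
Dim oNode, status
Set oNode = Server.CreateObject("WebWerx.Node")

oNode.Delete nodeID, session, status
' Inspect status here to confirm the delete succeeded.
```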
3.2.4 Database
The database for each repository consists of several tables, views, indexes, and stored procedures. Each database is created by a number of SQL scripts and populated with data by the administration pages.
When a subsequent version or build is released then an upgrade script is provided that will ensure that any database changes are applied. It is advisable in a production environment to ensure that a backup of the current database has been made before the upgrade script is run.
This table outlines the tables that can be found in the repository and their uses.
Table / View |
Description |
ACL |
Where there are more permissions on a node than Owner, Base Group and Guest / World, the ACL table holds the rest, with each permission listed against the node identifier. |
ASE |
This identifies the nodes that have been added to the Advanced Search Engine. It is used to determine those nodes that need to be synchronized. |
Associations |
This holds the relationship between nodes in terms of association links. |
Attribute |
Every attribute in the repository is held in this table, including the attribute identifier and all the data defined in the attribute screen. Attribute details are typically accessed through the view AttributeDetail, as this also includes the category identifier. |
AttributeDataType |
The available data types for attributes. |
Audits |
The audit log for the actions that take place in the repository, outlining the node and user identity. The node identity is typically used to list the audit details for a particular node. |
Calendar |
The calendar data for workflow task due date calculations |
Category |
The details for each category including the identifier, the display name for the category and the actual table name in the database |
CategoryAttribute |
This holds the relationship between categories and attributes, including the attribute position. |
Document |
The document / version information for every object added to the repository. The document table includes the node identifier, and the node table relates to the document table through the node_content_id but only where the node_type=6 and sub_node_type =1. |
ExtraNodeInfo |
This holds Name / Value pairs of additional information associated with the node identifier. It is used mainly for projects but is not the most efficient way to store related data. |
Fast_CheckOut_Lookup |
The data for fast check-out and document generation accessed using the MD5 identifier for the item that is passed in the properties of the file. |
GroupMembers |
The relationships between group identifier and user identifiers. The group identifiers are in the node table (node_type=12) and the users are in the KWUser table. |
InterestNotifications |
The table through which interest notification messages are generated. |
Interests |
The interests that are registered by a user, including the node identifier and type. |
KCSystem |
General system information held in a single row |
KWResources |
Notify text for each of the languages to provide multilingual support |
KWUser |
User information including the user identifier, all user details from the create user screen and the number of failed login attempts |
MediaFiles |
The system category for File |
MimeTypes |
This holds the Mime Types, file extensions and the file names for each size of mime_icon |
MyFavourites |
Identifies the favourites that a user defines to appear on their homepage. |
Node |
Central table for all activities in the repository |
Notes |
The system category for Notes |
outofoffice |
The Out-of-office nominations that a user makes for workflows |
outofoffice_audit |
The audit for Out-Of-Office activities. |
Provider |
Outlines, for each provider, the supporting functions for file management |
ReportType |
The sections that can be defined to appear on a user's home page. Identifiers below 99 are reserved for the product developers. |
RoutingMaps |
The system category for Workflow Maps |
Schedule |
The schedule information from calendars |
Search |
Temporary table for the content search results |
TaggedNode |
This outlines the items for notify and is also used to display these items (news, discussions, and workflow comments) on the user's homepage. |
TaskAttribute |
The example workflow category for a task template, such as task with sign-off |
TaskTemplates |
The system category for Actions / Task Templates |
URLs |
The system category for URLs |
UserHomeReport |
For each user this holds the reports / bars that will appear on their home page, including their location and status (expanded / collapsed). |
Vault |
Holds the vaults and their physical location for the repository |
WFAudit |
The Audit details for every workflow instance and task, including the user identifier for the performer. |
WFComment |
The comments that are associated with a workflow instance and task including the visit that they relate to. |
WFInstance |
The workflow instance information including the map file identifier, initiator, workflow status, started and completed dates |
WFMap |
The workflow details that have been derived from the map file, such as workflow title, category and rule for instance name generation |
WFMapAttachment |
The attachments that have been defined on the workflow map |
WFMapAttribute |
The attribute identifiers in the map, including the task identifier and the permissions |
WFMapExpression |
The condition expressions contained in the map file |
WFMapManager |
The manager / workflow community details from the map file |
WFMapPath |
The path information for routing between the elements (steps, conditions, sub-workflow and aliases) in the map file |
WFMapTask |
The task details that have been derived from the map file, such as task type, title, location, performer type, event scripts and custom form |
WFTaskExpressions |
The expressions for relationship to initiator and generate letter. |
WFTaskInstance |
The task details for workflow instances, including visit, mapfile and event details |
WFTaskInstanceEvalStatus |
The result of a condition step evaluation, held by workflow instance and task identifier |
WorkflowArchive |
The system category for the Workflow Archive details |
WorkflowEmails |
The email notification details from the workflow map for a task |
WorkflowForms |
The workflow map presentation information |
AttributeDetail |
This view combines the category identifier with the information in the attribute table and is used for all attribute details. |
AuditDetail |
This view combines the audit data with the user table information |
DocumentNode |
This view brings together the Document and the Node table information |
GroupACL |
This view provides the access control list for groups |
GroupGroup |
A view of invited group membership |
GroupUser |
A view of user group membership |
NodeAlias |
This view identifies those nodes that are shortcuts / aliases |
NodeDocMime |
This view gives the mime type information for a document / node in the repository. It is now used less often, a stored procedure being preferred. |
NodeGroupACL |
This view shows for the node the Group Access Control List |
TaggedNode_Detail |
A view of the node details for each tagged node |
UserACL |
A view that provides the users details for the Access Control List |
UserDetail |
A view of the user details and who created the user |
UserProfileReportType |
Provides a view that combines the userhomereport and the reporttype information |
WFAuditDetail |
Provides a view of the workflow audit for the instance, including user and instance detail |
WFCommentDetail |
Provides a view of the comments for a workflow instance |
WFInboxDetail |
This view provides the tasks for each user using the workflow instance data |
WFInstanceDetail |
This view gives the instance and map details together |
WFMapAttributeDetail |
This view gives the details for attributes in the workflow map |
WFTaskInstanceDetail |
This view combines instance data for the task and workflow |
The tables and views defined above are those created as part of the standard repository but there are tables that can be defined by user actions. When a user creates a category, this creates a user-defined table in the repository and the real database table name can be identified from the column category_def_name in the system table Category. Each category has its own user-defined table and the category_def_name typically includes the category identifier as part of the user defined table name.
This user defined category table begins life with the following default columns:
- Node_id.
- Sub_id.
- Node_type.
- User_id.
- Aux_no_1.
- Modified_date.
As attributes are added to the category, the number of columns grows, and each attribute name defined by the user becomes a column name in the user-defined database table. The user must ensure that the attribute name adheres to the column naming conventions for the database, considering such things as reserved words and length limitations (Oracle limits names to 30 characters).
There are no indexes created automatically on the user-defined table because the system-defined columns may not offer the best indexes for this user defined data.
Often a developer needs to understand how data is stored in the category in order either to report on its content or to utilise it for customizations. There are a number of scopes that are defined for categories (see the System Administration Guide) and these define the format of the data that you can expect to find in the user-defined category table. The scopes influence the system-defined columns as well as the number of rows you might find.
3.2.4.1 Category with a User, File or Media Scope
Examining the user-defined database table for a category with a file scope will illustrate that each node that has this category defined will have a row of data in the table, linked to the original node table entry by the node_id. The user who created the category data and the last modification time stamp are recorded automatically.
Note - The developer should be familiar with the concept that each object, such as document, folder, url, note or record, has an entry in the node table.
The same link can be observed between the node_id and the user_id when this is a user category.
3.2.4.2 Category with a Data Scope
Examining the user-defined database table for a category with a data scope will illustrate that no unique identifier is populated in the system-defined columns; it is up to the user to define their own attribute to provide the unique identifier. The user who created the category data and the last modification time stamp are recorded automatically.
A data category can have the node_id defined but this is done through a developer customization. Typically, such a customization would be used to support one to many relationships for workflow instances where the node_id is set to the workflow instance name and the sub_id to the task identifier.
3.2.4.3 Category with a Workflow Scope
Examining the user-defined database table for a category with a workflow scope will provide a very different set of data, because the category holds a row of data for each task that exists in the workflow instance and one for the current task of the instance (sub_id=-1). The node_id relates to a workflow instance in the repository and the sub_id relates to the task identifier. The performer who completes the task and the last modification time stamp are recorded automatically.
Reports should only be run against the current task (sub_id=-1) and no assumption should be made regarding the task identifiers as these may change in subsequent versions of the map file.
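Following the advice above, a report against a workflow-scoped category could restrict itself to the current-task row by passing a suitable clause to the ReadValues method described in the List section. The attribute column name below is an illustrative assumption.

```vbscript
' Hypothetical sketch: read only the current-task rows (sub_id = -1) of a
' workflow-scoped category via WebWerx.Category.ReadValues.
' "my_status_attribute" is a placeholder attribute column, not a product name.
Dim oCat, rgColNames, rgColValues, status
Set oCat = Server.CreateObject("WebWerx.Category")

rgColNames = Array("node_id", "my_status_attribute")
oCat.ReadValues categoryID, "sub_id = -1", "node_id ASC", _
                rgColNames, rgColValues, session, status
```

Filtering on sub_id = -1 in the clause, rather than on a task identifier, keeps the report valid even if task identifiers change in later versions of the map file.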
3.2.5 Vaults & Vault Manager
The repository does not store documents in the database but uses file system directories. This is because the retrieval and maintenance of documents is much faster than managing the database Binary Large Objects (BLOBs).
The file system directories are categorised into 12 different types of vault in the repository. Each vault type can have a number of vaults and each vault is a virtual directory so that it can be indexed via the Microsoft Indexing Service.
The most used vault types are Files, File Versions, Discussion Content, Workflow Content and New Content. The custom vault type allows for the introduction of other vaults; typically this is where the mime icons vault is located after the system administrator edits the Mime types that are supported by the repository. The custom vault type differs from the other vault types in that it may have more than one vault active at a time; all other vault types may have multiple vaults on-line but only one active vault to receive content.
The selection for a vault type will display the number of vaults used for the repository and each vault will relate to a file system directory. The selection of an individual vault will list the files that are present.
As each vault relates to a file system directory, there is a limit to the number of files that can be held in a directory without adversely affecting performance. This limit is defined by a number of factors but is controlled by the Microsoft operating system. To help the system administrator stay within the performance limits, an add-in tool called Vault Manager is included. Vault Manager is an executable which is set up as a scheduled task to monitor the number of items in each active vault / file system directory for each vault type. When Vault Manager detects that a vault has exceeded the maximum defined at its creation, it will endeavour to create a new vault based upon the file system path of the previously active vault and set this new vault as the active vault for the vault type. Vault Manager will not move any of the documents from the old vault that exceeded its limit or change the file system path automatically. It remains a system administration task to monitor the available storage and decide when new disk capacity is required. Indeed, it is the system administrator’s responsibility to create a new vault manually if there is a need to move future vaults to a different file system path (refer to the System Administration Guide on how to do this).
Important: Under NO circumstances should it be assumed that a file will remain in the same vault / file system directory. The application must be free to move documents between vaults, such as when a document is relegated to a version.
3.2.6 Notifications
There are times when it is essential to notify users of events that occur inside the repository, such as new tasks or changes to items in which they have registered an interest. There are three main functions, each with its own area of responsibility:
- Notify.
- Notify Interests.
- Task Escalation.
Each can be triggered using a scheduled task so that the frequency can be adjusted to suit the deployment needs of the client. The schedule is generic and impacts all users of the repository.
Notify provides email notifications for the following events:
- New Tasks.
- New Workflow Comments.
- New Project News Items.
- New Project Discussion Topics.
Each user may control the events for which notification is received using their user profile. Only those users who have a valid email address and have elected to receive email for the event will be sent a notification when the event is detected by the scheduled notify execution. Only one email notification will be generated for each of these event types, as a notification may contain multiple links. It is a deliberate design decision not to send a separate mail for each new event for the event type; this is to help reduce the email overload that is experienced by users.
Each user may control their registered interests and determine those items in the repository that they wish to keep a watch on. NotifyInterests checks for items that have changed since the last time it was run and then sends an email notification message to those users who have registered an interest in one or more items that have changed in the repository. NotifyInterests effectively provides a kind of subscription service that produces notifications of changed items on a regular basis. The user or system manager decides the level of registered interest in the item, selecting to be notified of all changes or just those to the content of the item. Interests registered on a folder will provide details of changes to existing items, new items and deleted items.
When the scheduled task invokes NotifyInterests, the executable examines the interests registered for each user and then constructs an email notification of changes, if any. The email notifications include links that directly access the relevant item in the repository. Note - if the link is selected and the user already has a current session in the repository, a new window will be launched to display the item concerned; where there is no valid session, a login prompt will be displayed before the redirection to the relevant item.
In the workflow designer, there is the ability to configure escalations and email notifications for tasks and events that are all processed by the task escalation executable. The task escalation executable examines the workflow instances and dispatches an email notification to the relevant parties using the rules that have been defined in the workflow map.
4. Rules for Customizing
Once the need to customize the application has been recognised, it is important to consider how the customization is to be implemented so that its benefits are realised without losing the standard functionality.
It is recommended that developers always consider adding to standard functionality and consider those users outside the immediate user community for whom the customization is required. For example, changing the behavior of a piece of standard functionality may satisfy the users requesting the change but may be devastating to another group of users for whom the standard functionality was perfectly adequate.
Due to the significant investment that needs to be made to purchase and deploy applications it is not inconceivable that a client over time may wish to expand its use beyond the initial pilot community, therefore developers should make customizations that leave the original application functionality intact.
Adding to functionality already provided is not difficult, for example perhaps the user requires different processing for a particular type of document that they wish to add. The developer would be well advised to copy the existing add file functionality, change the copy to support the user's requirements and offer it as a new context menu item rather than changing the functionality of the existing context menu item.
Any customizations that are made should, as far as possible, be placed in the custom directories provided. These custom directories include:
- CustomForms.
- CustomHandlers.
- CustomHome.
- CustomIncludes.
- CustomWFForms.
When the application is installed, these directories initially contain only sample content; should an update subsequently be applied to the system, customizations placed in these directories will not be overwritten.
On the rare occasions where customisations must be made outside these custom directories, the customizations must be carefully documented so that the system administrator can be sure that they are securely placed in a backup. The original files must also be retained on a backup: should an issue arise with the application, the system administrator may be asked to restore the original system to test whether the issue still exists with the standard pages delivered by the product developers.
Before application upgrades or patches are applied it is essential that the system administrator make a backup, preferably of all components, database, and vaults, as the product developers may be unaware of the customization that has been made by a third party and overwrite the customisations with standard pages.
5. Areas of Customization
There are many areas into which customisations can be made and the following sections will explore the following topics:
- Branding.
- Custom Forms.
- Custom WF Forms.
- Custom Handlers.
- Event Handlers.
- Custom Menus.
- Custom Includes.
- Custom Reports.
- Text Changes.
- COM Components.
- External Web Services.
It is important to consider the client’s future requirements when implementing any customization, but often developers only consider the immediate requirement and then find themselves constantly revisiting the customization for the client. This pleases neither the client, who is constantly paying for minor changes, nor the developers, who are constantly being asked for minor tweaks that distract them from other major client projects. Careful design and the use of data categories as stores for configuration information can relieve this frustration. For example, if a client's requirement is that each time they create a project they wish to have 10 named folders implemented within the project library, these 10 named folders could be coded into a custom form / handler for creating the project. If instead the folders are created from folder names held in a data category, the client can increase / decrease the number of folders created and change the folder names at any time without bothering the developers.
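The folder example above can be sketched along the following lines. The category identifier, the attribute column name, and the CreateProjectFolder helper are all placeholders invented for illustration, not product names.

```vbscript
' Hypothetical sketch: drive project folder creation from a data category
' rather than hard-coding folder names. FOLDER_CATEGORY_ID and
' CreateProjectFolder are placeholders, not part of the product API.
Dim oCat, rgColNames, rgColValues, status, i
Set oCat = Server.CreateObject("WebWerx.Category")

rgColNames = Array("folder_name")                  ' illustrative attribute
oCat.ReadValues FOLDER_CATEGORY_ID, "", "folder_name ASC", _
                rgColNames, rgColValues, session, status

' One folder per row in the data category: the client can add, remove or
' rename rows through the standard interface without any code change.
For i = 0 To UBound(rgColValues, 2)
    CreateProjectFolder projectLibraryID, rgColValues(0, i)
Next
```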
5.1 Multilingual Support
A general topic that should be considered as part of any customisation is the need for multilingual support. In some instances this will not initially be a customer requirement, because the customization might be for a specific department or region. The point of designing the customization is, however, the ideal time to consider its future uses and applications, such as whether it might be useful for other clients, or whether it could be deployed as a turnkey solution into other situations that have the same functional requirement but different terminology.
The interface holds all the text that is represented on the screens in an XML file or repository database; therefore, it already has the multilingual support and the ability to customize the text to suit the individual client deployment (refer to Manage Page Text). The options for multilingual support in customisations are:
- Ignore the need for multilingual support and code the text directly.
- Create a separate language file for populating the customization text.
- Modify the existing language files to include text representations for the customization.
Creating a separate language file for the customization is not as difficult as it sounds, because it is easy to copy the functionality that has already been implemented and modify it to utilise a different XML file. Most of the main components that load and extract data from an XML file are held in the file Includes\LanguageFuncs.asp, and this should be copied to the directory CustomIncludes before modification. The CustomIncludes directory has been specifically created for developer customisations and its contents will not be overwritten during subsequent upgrades or patches of the application.
Examination of the LanguageFuncs.asp will indicate how to set the location of the customised XML file. The function GetPageText is used to read content from an XML file for the text elements of the customization. For example:
set oPageText=GetPageText("new_doc")
sPageTitle=oPageText(PT_PAGE_TITLE)
The option of adding to the existing XML file seems easier because it removes the need to maintain a separate copy of LanguageFuncs.asp, but it has significant drawbacks, including:
- Changes to the XML file will need to be reapplied following subsequent updates of the core application, which may include patches.
- There is a risk that core pages might be added with the same names as those used for the customization pages.
- The client will be free to modify the text through Manage Page Text, as the page will appear as a core page and may cause support difficulties or misunderstandings.
5.2 Branding
The interface has a particular branding, but clients will often want to introduce their own branding so that the product appears to become part of the company’s own product suite. The application provides the following components that can be used to change the branding:
- Includes.
- Cascading Style Sheets.
- Left Navigation bar.
- Frame Sets.
- Context Menus.
- Images.
Before embarking on a customization, it is important to make a backup of the existing files to ensure that the system can be restored and to accept that any subsequent upgrades will require the branding to be re-implemented. It will become the clients’ responsibility to ensure that any changes made to the standard pages are applied to their branded pages.
There is a standard header and footer for each page, located in the Includes directory, which can be changed to include a company logo, text to define the company / repository name or other such information. It is important that the logo is small, as it will be reloaded each time a page is displayed.
Cascading style sheets, located in the CSS directory, impact the colours and the fonts used throughout the system, including those used in the context menus. The colours are the most obvious things to change to match the corporate image. These do not necessarily impact the left navigation bar and column heading as these are derived from image files. The images, text and colours used in the workflow designer and workflow status map are contained within the java applet.
The images, located in the images directory, include the icons used for the context menus but not the Mime Icons, as these are stored in a separate directory (timeimages/fileicons). When replacing images, it is important to use images of the same size and type.
The interface consists of two framesets, one for the left navigational bar and one for the main frame. Changes to framesets are more time consuming to implement and test but can provide the opportunity to use the application as a doorway to other web-based applications, such as Outlook Web Access, CRM and HR systems. For example, some clients have elected to add access to their own generic applications to the left navigation bar. It is important to ensure that all users have access to the functionality if it is added to a generic area such as the left navigation bar to avoid frustrating users who would like to use the functionality but do not have access.
5.3 Custom Forms
The custom pages / forms give the developer the ability to introduce new functionality and to customize versions of existing functionality for customers who require that little bit extra. Workflow custom forms / pages are the most common customisations, as this is typically where there is a requirement for additional functionality to process specific validations or to handle one-to-many relationships.
It is important that custom forms are placed in the custom directories provided so that they are not overwritten by subsequent upgrades or patches to the application. Where this cannot be done, a backup copy of the original must be maintained. Should any errors be reported to product support, you will be asked to restore the original to validate the core functionality. Custom forms are the responsibility of the organisation undertaking the customization, not the product developers.
There are two custom directories supplied for forms:
- CustomWFForms, for workflows.
- CustomForms, for other areas of the interface.
In previous versions of the application, it has been essential to create custom forms to be able to invoke a custom handling mechanism in workflows but in the current version a custom handler can be declared within the workflow designer without the need for a custom form.
It is advisable that, where a custom form replaces a standard form, the custom form retains the ability to implement the standard functionality. This reduces the need to retrain users who have previously been using the standard functionality.
Custom forms that handle the presentation of category / attribute information should make use of the processing provided by display_attribute.asp, which is a generic routine used widely in the interface. Changes to the handling of attributes should not be implemented inside display_attribute.asp, as this may adversely affect the standard pages within the interface. In most cases, the display attribute routine is called from within a for-next loop and the developer is free to add conditional processing, such as if-then or case statements, to that loop. Should a developer still wish to change the display_attribute.asp handling, it is recommended that a copy of the original is made and placed in the custom directories, so that their version does not impact all the standard functionality.
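As a hedged sketch (the attribute name tested and the custom markup are hypothetical; the variable and routine names follow the WriteTaskAttributes code shown later in this section), conditional processing is added around the call to the generic routine rather than inside it:

```vbscript
' Sketch only: wrap the generic display call in a condition rather than
' editing display_attribute.asp itself. "invoice_total" is a hypothetical
' attribute name used purely for illustration.
if AttrName = "invoice_total" then
    ' Custom presentation for this one attribute only
    Response.Write "<td class='cellElement'>" & FormatNumber(AttrDefault, 2) & "</td>"
else
    ' Everything else falls through to the standard generic routine
    displayAttribute "myForm", AttrName, AttrDispName, AttrInputType, AttrDefault, _
        AttrDDVals, true, false, vFormat, strOnChange
end if
```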
It is important when designing custom forms to consider the performance and usability of the custom form. Custom forms may be required to implement additional validation checks prior to submission, such as when attributes are interdependent. Here the developer needs to consider whether the page has sufficient information to conduct the validation without a round trip to the server or whether the check is better performed on the server. The developer should also consider error messages: whether these need to be translated into something meaningful to the user, and where the error text is held (refer to multilingual support).
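As an illustrative sketch only (the attribute names and the rule itself are hypothetical), a server-side check of two interdependent attributes in a custom handler might look like:

```vbscript
' Sketch: reject the submission when interdependent attributes conflict.
' "start_date" and "end_date" are hypothetical attribute names.
if IsDate(Request.Form("start_date")) and IsDate(Request.Form("end_date")) then
    if CDate(Request.Form("end_date")) < CDate(Request.Form("start_date")) then
        ' Abort the transaction and return the user to the form with an
        ' error key that can be looked up and translated for display
        ObjectContext.SetAbort
        Response.Redirect "../CustomWFForms/MyTaskForm.asp?err=date_range"
    end if
end if
```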
5.4 Creating a Custom Workflow Form
Custom workflow forms are the most frequent request for customization experienced by the product developers. In the current version, the requirement for a number of the customisations has been reduced as several common customisations have been built into the workflow designer and workflow engine.
There will still remain a number of areas where customization will be essential, including:
- Changes to the presentation of attribute information.
- Changes to the capture of attribute information, such as launching a separate form to search / look up information from a remote source.
- Validation of user input before submission.
- Enabling attributes to have complex relationships.
- Enabling attributes to be auto-populated, based on calculations over other attributes.
- Merging more than one letter / document to generate an output.
- Enabling the population and management of one-to-many relationships, such as time records on a timesheet or purchase order items.
The customization of workflow forms also depends upon the stage in the workflow process because there are certain activities that are only valid once an instance exists for the workflow. As an illustration of this the developer should look at the differences between the workflow initiation (scripts/wfinit.asp) and workflow task forms (scripts/wftask.asp).
All custom workflow forms defined from within the workflow designer are assumed to exist in the directory CustomWFForms.
The following sections will describe in more details the components that make up each of these forms. Typically calling a customised form will involve calling the standard form that will then implement a number of actions before redirecting to the custom form.
Sample custom forms have been provided in the CustomWFForms directory so that the developer can avoid some of these stages, but they are outlined in case the sample forms become corrupted. For both workflow forms the stages of making a custom form are the same:
- Take a copy of the existing form (WFInit.asp or WFTask.asp) from the scripts directory and place it in the CustomWFForms directory.
- Rename the form in the CustomWFForms directory to reflect the name of the workflow concerned and its function, so that it is easily identified when there are multiple customisations present.
- Make changes to the form in the CustomWFForms directory. See the following sections on the changes that need to be made for each type of form (WFInit.asp or WFTask.asp).
- Use the Workflow Designer to modify the appropriate task step in a workflow map.
- Launch the task properties page for the relevant step.
- Enter the filename of the custom form from the CustomWFForms directory into the custom items tab of the task properties dialogue.
- Save the map and add it to the repository, either as a new map or a version of an existing map.
- Test the map to see your custom form.
Note - Workflows and customisations can be developed against one repository and then moved to others for user acceptance testing (UAT) and final deployment. This activity is supported through the use of category export / import (see system administration guide) and workflow synchronisation.
When a task is launched the standard page is triggered and this completes a number of defined activities before reading the properties for the task in the workflow map. These task properties define if a custom form should be used, or the standard page should be presented.
Irrespective of the workflow stage, initiation or task step, the processing of the elements of the custom form will remain consistent and it will be the developer who swaps one or more of the processing elements for their own implementation:
- The header for the form:
WriteHeader (WFMapNodeID, obj, method, mode, oWFPageText("init_workflow"))
- The styles and JavaScript scripts for the form:
WriteTaskScripts (isInitTask, oTaskProps, oWFPageText, oWFForm)
- The title for the form:
WriteWFTitle (isInitTask, oTaskProps, oWFPageText, oWFForm)
- The opening of the form in HTML including the standard hidden fields:
OpenWFForm (isInitTask, oTaskProps, oWFPageText, oWFForm, WFMapNodeID, sFunc, isAssigned, NextURL)
- The standard task information, such as task due dates:
WriteTaskInfo (isInitTask, oTaskProps, oWFPageText, oWFForm, oUser)
- The output of the task instructions:
WriteTaskInstructions(isInitTask, oTaskProps, oWFPageText, oWFForm)
- The output of the task comments and the input field, if permitted for the addition of comments:
WriteTaskComments (isInitTask, oTaskProps, oWFPageText, oWFForm, oUser)
- The task attributes based upon the permissions defined in the workflow designer:
WriteTaskAttributes (isInitTask, oTaskProps, oWFPageText, oWFForm, oCat, strValidations)
- The existing attachments, if present and the processing to add further task attachments:
WriteTaskAttachments (isInitTask, oTaskProps, oWFPageText, oWFForm, oWF)
- The processing of any actions that have been defined to replace the send-on button:
WriteTaskActions (isInitTask, oTaskProps, oWFPageText, oWFForm)
- The close of the HTML form:
CloseWFForm ()
- The JavaScript for the standard validations:
WriteValidations (strValidations)
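Putting these elements together, the body of a custom task form is essentially the sequence of calls above, with the developer swapping out only the element that needs custom behaviour. A minimal sketch follows (MyTaskAttributes is a hypothetical developer-supplied replacement; all other calls retain the signatures shown above):

```vbscript
' Sketch of a custom form body: every element except the attribute
' handling uses the standard routine.
WriteHeader WFMapNodeID, obj, method, mode, oWFPageText("init_workflow")
WriteTaskScripts isInitTask, oTaskProps, oWFPageText, oWFForm
WriteWFTitle isInitTask, oTaskProps, oWFPageText, oWFForm
OpenWFForm isInitTask, oTaskProps, oWFPageText, oWFForm, WFMapNodeID, sFunc, isAssigned, NextURL
WriteTaskInfo isInitTask, oTaskProps, oWFPageText, oWFForm, oUser
WriteTaskInstructions isInitTask, oTaskProps, oWFPageText, oWFForm
WriteTaskComments isInitTask, oTaskProps, oWFPageText, oWFForm, oUser
' Custom replacement for WriteTaskAttributes (hypothetical routine name)
MyTaskAttributes isInitTask, oTaskProps, oWFPageText, oWFForm, oCat, strValidations
WriteTaskAttachments isInitTask, oTaskProps, oWFPageText, oWFForm, oWF
WriteTaskActions isInitTask, oTaskProps, oWFPageText, oWFForm
CloseWFForm
WriteValidations strValidations
```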
The main processing element that is likely to be customised on the form is that for handling attributes. The workflow map defines the category that is associated with the workflow, as specified by the workflow designer, and defines for each task the attributes and their permission / permitted action. The following illustration indicates the permitted actions for an attribute.
The permissions for each attribute can be defined as follows:
Not Visible to this Task – The attribute will not be visible to the user at this task step and will not accept any changes to its data.
Calculation Only – The attribute will be hidden from the user but will be available on the form for custom form developers to use for calculations. These attributes will not accept any changes to the data and are effectively read only.
Program Entry – The attribute will be hidden from the user but will be available on the form for custom form developers to use and populate.
Read Only – The attribute will be visible to the user, but will not be editable and no data changes will be accepted. This option is best suited to an attribute that has been filled in from a previous step.
Editable – The attribute will be visible to the user and will be editable.
Entry Required by Task User – The attribute is visible to the user and requires a mandatory entry.
The text that follows shows the code found in WriteTaskAttributes, located in the directory includes/WFFormFuncs.asp; the interleaved commentary lines provide additional information for the developer.
Useful constants that relate to the permission in the workflow designer
const ATTRPERM_NONE=1
const ATTRPERM_MAND=2
const ATTRPERM_EDIT=3
const ATTRPERM_READ=4
const ATTRPERM_CALC=5
const ATTRPERM_PROG=6
Read permissions for the task attributes from the workflow map held in the database tables
oWF.ReadWFTaskAttributes CLng(WFMapFileID), Clng(TaskID), _
array("wf_attr_ID", "wf_task_ID", " ' ' AS attr_name", "wf_attr_permission"), _
vAttrDisp, session("sess"), vStatus
call HandleError(vStatus,false)
Read all the active (attr_status=1) attribute details from the category (WFCatID) associated with workflow map and order the results by the display position.
oCat.Attributes CLng(WFCatID), "display_position", "attr_status=1", array("*"), vAttrVals, session("sess"), vStatus
call HandleError(vStatus,false)
When this is not the initiation task, fetch the previous values for the attributes.
if not isInitTask then
redim vAttrCols(iAttrColCount)
redim vAttrData(iAttrColCount)
for k=0 to iAttrColCount
vAttrCols(k) = vAttrVals(2,k)
next
oWF.ReadWFInstanceTaskCategory CLng(WfInstID), CLng(-1), _
vAttrCols , vAttrData , session("sess"), vStatus
call HandleError(vStatus,false)
end if
Begin processing each of the attributes in the category to determine if it is displayed and if so, in what format.
for j=0 to UBound(vAttrVals, 2)
AttrID = vAttrVals(1,j)
AttrPerm = 1
AttrDefault=""
strOnFocus=""
strOnChange=""
strOnClick=""
strCond=""
strAction=""
AttrMand=""
'Find the attribute from the attribute information for the workflow
if not isempty(vAttrDisp(0,0)) then
for z = 0 to ubound(vAttrDisp,2)
if clng(vAttrDisp(0,z)) = clng(AttrID) and clng(vAttrDisp(1,z))= clng(TaskID) then
AttrPerm=cint(vAttrDisp(3,z))
end if
next
end if
' Based upon the permission prepare to make the presentation of the attributes in the form
if AttrPerm > ATTRPERM_NONE then
if AttrsShown = 0 then Response.Write "</td></tr>"
AttrsShown = AttrsShown + 1
AttrName = vAttrVals(2,j)
AttrDispName = vAttrVals(3,j)
AttrType = vAttrVals(4,j)
AttrInputType = vAttrVals(5,j)
AttrMand = vAttrVals(6,j)
AttrDesc = vAttrVals(7,j)
GetTypeVal AttrInputType,AttrName,request,varValue,blnSet
if blnSet then
AttrDefault=varValue
elseif not isInitTask then
AttrDefault = vAttrData(j)
else
AttrDefault = vAttrVals(8,j)
end if
strOnChange=vAttrVals(13,j)
AttrDDVals = vAttrVals(12,j)
if TaskStatus=4 then 'if review task then force attribute permission to be read only
AttrPerm = ATTRPERM_READ
end if
if AttrPerm = ATTRPERM_MAND then
isMand = "*"
else
isMand = ""
end if
if AttrInputType = 23 or AttrInputType = 24 or AttrInputType = 25 then
AttrDefault = FormatDateForForm(AttrDefault,AttrInputType)
end if
' Make the attributes hidden if set to calculation or program
if AttrPerm=ATTRPERM_CALC or AttrPerm=ATTRPERM_PROG then
Response.Write "<input type=hidden name='"&AttrName&"' value='"&AttrDefault&"'>"
else
Response.Write "<tr><td class='cellTitle'>"&isMand & AttrDispName&"</td><td class='cellElement'>"
vFormat=allow_update
if cint(AttrPerm)= ATTRPERM_READ then
vFormat=plain_text
strOnChange=""
Response.Write "<input type=hidden name='"&AttrName&"' value='"&AttrDefault&"'>"
end if
strToolTip=AttrDesc
if AttrPerm <> ATTRPERM_READ then
if AttrPerm = ATTRPERM_MAND then AttrMand=true
UpdateValidations strValidations, AttrName, AttrType, AttrMand, strOnChange, _
AttrInputType, strCond, strAction
end if
displayAttribute "myForm", AttrName, AttrDispName, AttrInputType, AttrDefault, _
AttrDDVals, true, false, vFormat, strOnChange
Response.Write "</td></tr>"
end if
end if
next
Be sure to indicate if there are no attributes to display
if cint(AttrsShown) = 0 then
Response.Write oWFPageText("none")&"</td></tr>"
end if
5.4.1.1 Making a Custom Workflow Initiation Form
Having made the copy of the workflow initiation form (scripts/wfinit.asp) in the CustomWFForms directory and renamed it, it is time to edit the form.
The first activity is to remove redundant activities that are not required in the custom form because these will have been completed by the standard page, before it detects that a custom page is to be used for presentation. Most of this code is surrounded by comments that clearly indicate that it is to be removed for a custom form and this code can either be deleted or commented out using a single quote at the beginning of the line.
- Remove the reading of the workflow map file into the database tables of the repository, as the custom form does not need to repeat this activity.
Rem CreateMap entries in table
oWF.CreateWFMapFromFile CLng(WFMapFileID), 0, false, session("sess"), vStatus
call HandleError(vStatus,false)
Rem Redirect Page to task assign if required ******NOTE This can be commented in Custom Form*****
if isAssigned<>"y" then
session("taskassignees")=""
oWF.ListWFTasks CLng(WFMapFileID), _
"(wf_task_performer_type=4) AND (wf_task_performer_type=4 AND (wf_task_type=2 OR wf_task_type=3))", _
"", array("wf_task_ID, wf_task_title, wf_task_performer_type"), vVals, _
session("sess"), vStatus
call HandleError(vStatus,false)
if not isEmpty(vVals(0,0)) then
Response.Redirect "WFTaskAssign.asp?wfID=" & WFMapFileID & "&node=" & WFMapNodeID
end if
end if
- Remove from the custom initiation form the redirection to the custom form. Failure to remove this will result in the form entering an infinite loop.
if sCustomForm<>"" then
Response.Redirect "../CustomWFForms/" & sCustomForm & "?aliasnode=" & AliasNode & _
"&docID=" & WFMapFileID & "&node=" & WFMapNodeID
end if
- Modify the custom form to present your own text in the title. This will make it easy to confirm that it is your custom page that is being called and not the original.
5.4.1.2 Making a Custom Workflow Task Form
Having made the copy of the workflow task form (scripts/wftask.asp) in the CustomWFForms directory and renamed it, it is time to edit the form.
The first activity is to remove redundant activities that are not required in the custom form because they will have been completed by the standard page before it detects that a custom page is to be used for presentation. Most of this code is surrounded by comments that clearly indicate that it is to be removed for a custom form and this code can either be deleted or commented out using a single quote at the beginning of the line.
- Remove from the custom task form the handling that sets the task status to open, as the standard form will have done this already. There is no actual harm in leaving this in, although the audit log will show two entries each time the form is opened.
Rem Raise task opened event
if sFunc = "wf.taskopen" then
oWF.TaskInstanceOpened CLng(WFInstID), CLng(TaskID), session("sess"), vStatus
call HandleError(vStatus,false)
end if
- Remove from the custom task form the redirection to the custom form. Failure to remove this will result in the form entering an infinite loop.
if sCustomForm<>"" then
Response.Redirect "../CustomWFForms/" & sCustomForm & "?" & _
Request.ServerVariables("QUERY_STRING")
end if
- Modify the custom form to present your own text in the title. This will make it easy to confirm that it is your custom page that is being called and not the original.
5.5 Custom Handlers
A custom handler page is often a copy of an existing handler page from the directory handlers that has been modified to support the changes required to satisfy the user requirements. The handler page receives the data that has been submitted from the form using either POST or GET methods.
Custom handlers are typically server-based pages that can process the data before it is written to the database and include additional details, if required and permitted. For example, the custom handler page may provide additional validation on the data and reject the submission because it fails server-based validation rules.
Custom handlers do not typically communicate directly with the repository tables; instead they communicate with COM methods. The custom handler has a transaction associated with it; therefore any errors will cause the rollback of all actions previously undertaken in the page. Transactions are committed when the following code is encountered:
ObjectContext.SetComplete
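As a sketch of the overall pattern (the COM method called and the redirect target are hypothetical; the status-checking idiom follows the workflow2.asp code shown in section 5.6):

```vbscript
' Sketch: a custom handler calls COM methods inside the page transaction;
' on any failed status it aborts so that all prior actions roll back.
oWF.SomeMethod CLng(someID), session("sess"), vStatus  ' SomeMethod is hypothetical
if not isStatusTrue(vStatus) then
    ObjectContext.SetAbort      ' roll back everything done so far
    errHandler vStatus
end if
' ...further processing...
ObjectContext.SetComplete       ' commit all the work as one transaction
Response.Redirect "../scripts/somepage.asp?node=" & nodeID  ' hypothetical return page
```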
It is important, when creating a custom handler to understand that once the data processing is complete then the user will be re-directed to an interface page to continue using the interface to the repository. Careful consideration should be given to redirecting the user to an appropriate page and passing that page the relevant information to be displayed correctly.
The custom handler should be created in the directory CustomHandlers. The majority of custom handlers are produced to process information for workflows and a sample custom handler has been provided.
It is advisable that any custom workflow handler is written to avoid a dependency upon the task identifier. Task identifiers, whilst unique in the workflow map, are never reused, so if the user inadvertently deletes and recreates a task, any code based on the task identifier will no longer function. A more reliable option is to base any customised code on the information entered into the Tag field of the task properties form in the workflow designer.
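In practice this means keying any custom behaviour off the tag rather than the numeric identifier. A hedged sketch (the tag values are hypothetical, and the means of obtaining the tag will depend on the handler):

```vbscript
' Sketch: branch on the Tag field from the task properties, never on
' the task identifier, which changes if the task is deleted and recreated.
select case sTaskTag
    case "approve_invoice"   ' hypothetical tag value
        ' custom processing for the approval step
    case "raise_po"          ' hypothetical tag value
        ' custom processing for the purchase order step
    case else
        ' no custom behaviour; standard handling applies
end select
```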
5.6 Creating a Custom Workflow Handler
A sample custom handler has been provided in the CustomHandlers directory so that the developer can avoid some of these stages, but they are outlined in case the sample handler becomes corrupted. The stages of making a custom handler are:
- Copy the file (workflow2.asp) from the handlers directory to the CustomHandlers directory.
- Rename the copied handler file.
- Make changes to the handler in the CustomHandlers directory; see the following sections on the changes that need to be made.
- Use the Workflow Designer to modify the appropriate task step in a workflow map.
- Launch the task properties page for the relevant step.
- Enter the filename of the custom handler from the CustomHandlers directory into the custom items tab of the task properties dialogue.
- Save the map and add it to the repository, either as a new map or a version of an existing map.
- Test the map to exercise your custom handler.
Note - Never use task identifiers; use tag names to avoid problems if a task is deleted and recreated.
The text that follows shows the code found in workflow2.asp, located in the handlers directory; the interleaved commentary lines provide additional information for the developer.
This section handles the attributes that have been defined in the workflow and any values determined by the customisation would be processed in this section.
if (clng(catID) <> -1) and (catID <> "") then
''get attributes for this category
Set objCat = Server.CreateObject("WebWerx.Category")
redim rgColNames(0)
rgColNames(0) = "*"
objCat.Attributes CLng(catID), "attr_display_name","attr_status=1", rgColNames, _
rgColValues, session("sess"), status
if not isStatusTrue(status) then
ObjectContext.SetAbort
errHandler status
end if
numOfRecs = UBound(rgColValues, 2)
intSubCatCount=0
if not (isEmpty (rgColValues(0,0)) or isNull (rgColValues(0,0))) then
for j=0 to numOfRecs
redim preserve attrData(2,j)
attrID = rgColValues(1,j)
attrName = rgColValues(2,j)
attrInputType=rgColValues(5,j)
attrData(0,j) = attrID
attrData(1,j) = attrName
GetTypeValNew attrInputType,attrName,request,varValue,blnSet,true
if blnSet then
attrData(2,j)=cstr(varValue)
else
attrData(2,j) = Null
end if
if instr(1,attrName,"sub_cat")<>0 or instr(1,attrName,"subcat")<>0 or _
clng(rgColValues(5,j))=18 or clng(rgColValues(5,j))=500 then ' 18 Sub work flow type
redim preserve arrSubCat(intSubCatCount)
arrSubCat(intSubCatCount)=rgColValues(8,j)
intSubCatCount=intSubCatCount+1
end if
'Response.Write "<br>id=" & attrData(0,j) & " name=" & attrData(1,j) & " val=" & attrData(2,j)
next
end if
end if
The case statement is used to determine the appropriate code to implement:
- wf.init – used for workflow initiation.
- wf.taskupdate – used for the save of data in the task.
- wf.taskdone – used for the completion / send-on of a task.
Each of these cases will then permit the developer to implement custom handling that is relevant for the tag name in the workflow. This handling may include processing and updating a data category with sub-category data or performing a calculation to complete a sequence number.
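A hedged sketch of how the three cases are typically extended (the custom actions described in the comments are hypothetical examples):

```vbscript
' Sketch: extend the existing case handling with tag-specific logic.
select case sFunc
    case "wf.init"
        ' workflow initiation: e.g. calculate an initial sequence number
    case "wf.taskupdate"
        ' save of data in the task: e.g. persist one-to-many child rows
    case "wf.taskdone"
        ' completion / send-on: e.g. update a data category with
        ' sub-category data before the task completes
end select
```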
5.7 Event Handlers
To facilitate integration with external systems, the workflow engine can execute server-side compiled Visual Basic components at pre-determined points in the workflow process. These pre-determined integration points are configured through the workflow designer on the ‘Events’ tab in either the workflow or task properties dialogues.
The events that are exposed are illustrated below.
All these events are typically called to perform their action in a synchronous manner, and this is an important consideration when designing integrations.
A sample project is available that has stubs for each of the events illustrated; the sample code simply sends an email when each event is called. This sample code is typically compiled into a dynamic link library (.dll) and then added to the COM Package for the application using Component Services.
A workflow is then created, and the workflow events are defined using the name of the component that was compiled and added to the COM Package. So, for example, if the sample code were used, each of the events on the task properties Events tab would have WFEvent.EvtHandler in its entry field.
The events will then be triggered when the relevant action occurs in the workflow instance. The most used events are those involving the task done and workflow completion.
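As an illustration only (the actual method names and signatures are defined by the sample project stubs, so everything below is a hypothetical sketch), the compiled component exposes one public method per event, which the workflow engine calls synchronously when the event fires:

```vbscript
' Hypothetical sketch of a VB6-style event handler class, compiled to a
' .dll and registered in the COM Package as, e.g., WFEvent.EvtHandler.
Public Sub TaskDone(ByVal lWFInstID As Long, ByVal lTaskID As Long)
    ' e.g. notify an external system that the task has completed
End Sub

Public Sub WorkflowComplete(ByVal lWFInstID As Long)
    ' e.g. post final data to an external application
End Sub
```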
5.7.1 Example Use of Workflow Events
The following example illustrates the use of a workflow event to pass information to an external web service. This was a customization for a client who had a long running activity that could not be completed as part of a synchronous action and so a dummy user was given the task until the external action indicated that it was complete. The external application indicated that it was complete through a call to a web service that then updated the workflow instance and task.
This example illustrates several important features of customization that have been discussed in this document as well as providing an example of a workflow event in action. The important features include:
- Designing and documenting customisations.
- Using category export / import.
- Synchronising workflows created on another system.
- Using a data category to hold constants / configuration information.
- Using task tags and not task identifiers.
- Using a data category to hold log / tracking information.
- The importance of a test harness for integrations.
To install the example that implements the synchronous call to the external web service application from a workflow event and the response web service the following activities must be completed:
- Installation of the workflow event handler dll in the COM Package, that has been built using the event handler stubs.
- Loading of the categories, using category import, into the repository that will hold the configuration and tracking information.
- The creation of a web service to receive responses from the external application.
- Configuration of the workflow task to invoke the event to communicate with the external application.
- The population of the configuration data into the category to define the web services involved.
5.7.1.1 Installation of the Event Handler COM Component
Once the Event Handler COM component has been built it must be installed in the COM Package. In this example the Event Handler COM component was called OutProcessWSEvent.dll; to deploy the example it must be introduced just like any other COM component, using Component Services from the Administration tools.
It is advisable to copy the Event Handler COM component (OutProcessWSEvent.dll) to the same location as the other COM components for the application as this is likely to prevent an overzealous system administrator from accidentally deleting the file.
To be sure that the Event Handler COM component (OutProcessWSEvent.dll) has been copied to the same location as the other components in the COM Package, you can identify the location of the other components by examining the properties on a component in the COM Package whilst in the Component Services interface.
Open Component Services, expanding the options until the COM Package for the application is visible. Shut down the COM Package for the application, then right-click to expose the context menu and add a new component.
5.7.1.2 Create the Categories
The categories are typically designed and then created on a test repository. These categories can then be transferred to the production repository by exporting the categories from the test repository and importing them into the production repository (refer to the System Administration Guide). This process saves time and ensures that the consistency is maintained between the two repositories. It is important to note that the category and attribute identifiers will not remain the same between the two repositories. Therefore, the customizations must use the category and the attribute names and not identifiers.
Login to the production repository as a system user, then navigate to list categories in the Admin menu. From the upper dialogue select import and import the following categories: Workflow Pending Web Services and Workflow Web Services Lookup from the file kworkerExport.xml, found in the WebServices\External\Category Files folder.
The workflow category can also be added from the export if it has been included.
The following tables outline the category definitions and can be used as an example of document category and attribute design.
Definition: Workflow Web Services Lookup
Type / Scope: Data Category
Description: This category will hold the information for web services that can be triggered from workflow event handlers and is effectively a configuration file.
Attributes:
Display Name | Attribute Name | Data Type | Description
Workflow ID | Wf_mapfile_id | Numeric | The node identifier for the workflow map. This will remain consistent for all workflow instances triggered from this map, irrespective of the versions of the workflow map.
Tag | task_tag | varchar | This is the tag on the general tab of the task properties for the relevant task and will be used to make a unique key when there is more than one web service called for a given workflow map.
Out Process Web Service | out_webservice | varchar | This is the URL for the external third-party web service that will be provided with information by the workflow event handler.
In Process Web Service | In_webservice | varchar | This is the URL for the web service that will receive the response information from the external third-party application. The default for this attribute can be set.
Error Action | error_action | integer | The Error Action option buttons define the action that should occur when the receiving web service detects an error. Irrespective of the selected action, the error information will be attached to the workflow as a comment. The two options are: 1. Complete the current task irrespective of the error; 2. Record the error and return the task to the previous performer.
Definition: Workflow Pending Web Services
Type / Scope: Data Category
Description: This category will hold information about the tasks that are sent to external web services, effectively tracking the activity out to the external third-party web service.
Note - The information in this category will be populated by the workflow event handler automatically and will enable system administrators to monitor communications between repository and the external application.
Attributes:
Display Name | Attribute Name | Data Type | Description
Workflow Instance ID | wf_instance_id | Numeric | The identifier for the workflow instance.
Tag | task_tag | varchar | This is the tag on the general tab of the task properties and will be used to make a unique key when there is more than one web service called for a given workflow map.
Task Performer | task_performer_id | Numeric | The original performer of the task is recorded so that, on error, the task can be returned to the original performer.
Date Sent | datesent | datetime | This is the timestamp from the system indicating when the event was sent. It can be used to identify events that cannot be completed because the external application has not responded.
5.7.1.3 Create the Response Web Service
The response Web Service is used to receive the response from the external application and update the workflow task with the information provided by the external application.
This ‘In Process Web Service’ is created using the Computer Management application, from Administration Tools. A new virtual directory, based on the ‘External’ folder found in the directory WebServices\External, should be created and it must be assigned anonymous access.
5.7.1.4 Change the Workflow and Set-up the Category Data
Use the workflow designer to create a new workflow or edit an existing workflow. If the workflow is being moved from the test repository it is at this point that it can be opened as a file in the workflow designer and synchronized with the production repository.
At this stage, an appropriate task in the workflow should be defined where the workflow event is to trigger the sending of the information to the web service of the external third party. This task should have two elements defined:
- A tag name on the General tab of the task properties.
- An event handler for “Task is Done”, entered as OutProcessWSEvent.OutProcess on the Events tab of the task properties page (see the illustration that follows).
Note - The tag name plays an important part in any customization, as it is a way of identifying a task in the workflow irrespective of the task identifiers. Task identifiers are unique in the workflow, but if a task is accidentally deleted it can never be recreated with the same task identifier. Task identifiers therefore are not a safe basis upon which to build customizations, particularly if clients are to maintain their own workflows.
When the task details have been set, save the changes to the workflow map and add the workflow to the repository as either a new workflow or a version of an existing workflow. It is advisable at this point to launch the workflow so that the information in the workflow file is read into the respective database tables but there is no need to trigger an instance.
To use the event handler OutProcessWSEvent.OutProcess there must be information defined in the category Workflow Web Services Lookup. To set up this information, navigate to the category Admin / System Administration / List Categories / Workflow Web Services Lookup and select the “Data” link.
Click the icon at the top of the screen and select New Entry. This displays the screen for adding data to the category, which effectively holds the configuration information.
For the Workflow ID field enter the node identifier of the workflow that has been added to the repository and contains the task tag and event handler. To find this identifier in the browser interface, mouse over the workflow name and read the node identifier for the workflow map from the status bar.
For the Tag field enter the Tag Name that was entered into the task properties of the workflow where the event handler is called for the out process web service.
For the Out Process Web Service field enter the URL for the external third-party web service that will be provided with the information by the workflow event handler.
In the field In Process Web Service enter the URL for the ‘External’ virtual directory that was created earlier in the set-up (external/Inprocess/Inprocess.asp). This is the web service that will receive the response and information from the external source.
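Gathered together, a single entry in the Workflow Web Services Lookup category holds four values. The sketch below (plain JavaScript, with hypothetical identifiers and localhost URLs) models one such entry and a trivial completeness check; the field names mirror the category attributes described above, but the values are illustrative only.

```javascript
// One hypothetical Workflow Web Services Lookup entry.
const lookupEntry = {
  workflowId: 12345,                       // node identifier of the workflow map
  tag: "SendToExternalWS",                 // tag name from the task properties
  outProcessWebService: "http://localhost/thirdparty/service.asp",
  inProcessWebService:  "http://localhost/external/InProcess/InProcess.asp"
};

// The event handler cannot route the data unless every field is filled in.
function isComplete(entry) {
  return ["workflowId", "tag", "outProcessWebService", "inProcessWebService"]
    .every(key => entry[key] !== undefined && entry[key] !== "");
}
```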
5.7.1.5 Test Event Handler
To enable the testing of the event handling software and the In Process Web Service (the ‘External’ virtual directory), a test harness has been created to emulate a third-party Web Service. The test harness is in the directory WebService/external/test and must be published as a virtual directory.
To use the Test Harness from a workflow that has already had a task done event configured, it is only necessary to change the Workflow Web Services Lookup category so that the web services are defined as:
- Out Process Web Service as http://localhost/external/test/optest.asp.
- In Process Web Service as http://localhost/external/InProcess/InProcess.asp.
In a real deployment, http://localhost should be changed to the full address used to access the web service.
To test the workflow event and the communication of the workflow data to the third-party web service, invoke the workflow. Then step through the workflow until the task that has the task done event handler defined is completed. This should cause the event handler to send attribute information to the third-party Web Service (Out Process).
Using the test harness Out Process Web Service alone will cause the data sent from the workflow to be written into an XML file, C:\temp\inprocess.XML. It will also place a record in the category Workflow Pending Web Services to track that a request has been sent to the third-party Web Service for which no reply has yet been received.
Examining the XML file (C:\temp\inprocess.XML) and the data in the category Workflow Pending Web Services will establish that the outbound communication through the event handler is working.
To test the response path back to the workflow instance in the repository, it is necessary to simulate a reply from the third-party web service. To simulate the reply, change the page InProcess.asp to receive its input from a file. This can be done in the original page or in a copy placed in the webservices/external/test directory. To receive input from a file, edit the file InProcess.asp found in the directory external/inprocess:
- comment out the line oXMLDomRequest.Load Request by placing a single quote at the beginning of the line.
- uncomment the line oXMLDomRequest.Load "c:\temp\inprocess.XML" by removing the single quote from the beginning of the line.
If using a text editor that supports a line count, then these lines can be found at line 55 and 56 respectively. The code should then look like:
'oXMLDomRequest.Load Request
oXMLDomRequest.Load "c:\temp\inprocess.XML"
Where C:\temp\inprocess.XML is the file created by the test harness web service with the XML data supplied from the event handler for the workflow task.
Edit the XML file (C:\temp\inprocess.XML) produced by the test harness web service and then invoke the edited InProcess.asp in a browser. This will inject the content of the XML file back into the task in the workflow instance via the In Process Web Service.
Note - Only those attributes that are editable in the task will be updated. Injecting the XML back into the task removes the record from the category Workflow Pending Web Services and, if no errors are detected, progresses the workflow instance to the next task.
Once the testing is complete it is important, if the original page was used, to revert the changes made in InProcess.asp so that the service again uses the request information instead of the content of the XML file. This ensures that the In Process Web Service functions correctly.
5.8 Custom Menus
The user interface has a number of pop-up menus, and there are times when user requirements suggest that it would be best to extend the functionality through the menu options. Although it is possible to change the menu options, this is one area of customization that is likely to require revisiting after an upgrade of the application.
It is important to understand that most of the menus in the application are built using the type, permissions, or privilege model to determine the functionality that is available to the current user. The functions in the context menus are also logical for the item or area of the application in which the user is working. Both these aspects should be considered when modifying the context menus.
As a rule, the context menus should be enhanced and existing functionality retained. It is not unusual for one area of a business to request a change without considering that another area may still wish to use the standard functionality. Retaining the standard functionality in the context menu is also good practice because, when a problem occurs, it is easy to establish whether it lies in the standard functionality or in the customization.
The existing menu functionality is in the directory ClientScripts, and it is advisable, if changes are to be made to these files, to make a backup copy of the originals.
5.9 Home Page Entries
The user's home page has several sections that can be configured by the user or by the repository system administrators.
There are times when a developer may wish to add sections to the home page to provide easy access to customised functionality. The addition of a section to the home page involves the developer in the following activities:
- Defining the identifier and label for the section on the home page.
- Coding the information to present in the section for inclusion in the home page; this must take into account the various configurations (mini report, expanded, collapsed).
- The mechanisms to present the section on some, but not necessarily all, user home pages.
The definition of the identifier and the label for the section is undertaken by adding entries to the database table ReportType for the repository. All current home page sections are identified within the ReportType table, and each has a name and a unique number. A developer can see the existing entries by examining the content of the ReportType table for the repository.
Important: Report_type identifiers up to 99 are reserved for the original system developers; other developers are free to begin their own custom report type entries from 100.
Once the report type has been defined, the developer must implement the code that displays the custom report type on the home page. The system developers have prepared for this by creating the directory CustomHome. Within the CustomHome directory there is a file called CustomReports.asp; this file must have an include line for your custom report page, such as:
<!--#INCLUDE FILE="../CustomHome/rep100_0.asp"-->
There must also be an entry placed in the select case or switch, such as:
case 100 'My Custom Home Page Report Code
The CustomHome directory contains samples of custom pages to illustrate the coding required to add a custom section to the user's home page. The developer should also consider whether there is a need for multilingual support in the custom page, as this was not included in the sample pages.
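The include line and the case entry together form a simple dispatch on the report type identifier. The sketch below models that pattern in plain JavaScript rather than ASP, with a hypothetical report 100: identifiers from 100 upward map to custom render functions, while the reserved range below 100 is left to the standard pages.

```javascript
// Hypothetical registry of custom home page reports. In the real
// CustomReports.asp this role is played by the include lines plus the
// select case; identifiers below 100 are reserved for the system developers.
const customReports = {
  100: () => "My Custom Home Page Report Code"
};

function renderCustomReport(reportType) {
  if (reportType < 100) {
    return null; // reserved range: handled by the standard pages, not here
  }
  const render = customReports[reportType];
  return render ? render() : null;
}
```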
Once the custom page has been written it is time to enable the custom section on the home page of one or more users. The custom section will not display for any users until entries have been made in the UserHomeReport table. Entries can be made in the UserHomeReport table from the system administration option Manage Reports.
It is advisable to use Manage Reports because this feature has been written to manage the delivery of custom reports onto the user's home page (refer to the System Administration Guide). It is often the case that not all repository users need access to the custom report, and utilising the Manage Reports functionality makes this a local administration task rather than one the developers need to be involved in.
The Manage User Report link is available from the System Administration menu.
The Report Type field will list those report types that have been added to the ReportType table and whose identities are over 99.
The Users field lists, on the left, the users without access to the custom report type and, on the right, the users with access to the custom report type. Those users listed with access will have the custom report type displayed on their home page.
The buttons between the two user lists in the Manage Reports screen are used to provide the following functionality respectively:
- Remove all users from the list ‘Users with access’, effectively removing the custom report from the home page of all users.
- Remove individual selected user(s) from the list ‘Users with access’, effectively removing the custom report from the home page of the selected user(s).
- Add individual selected user(s) to the list ‘Users with access’, effectively adding the custom report to the home page of the selected user(s).
- Add all users to the list ‘Users with access’, effectively adding the custom report to the home page of all users.
The Submit button saves the changes, which take effect on the user's home page at the next reload.
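The four buttons can be modelled as moves between the two user lists. The sketch below (plain JavaScript, with hypothetical user names) mirrors the behaviour described above; the "all" variants are simply the same moves with every user selected.

```javascript
// Model of the Manage Reports screen: users without access on the left,
// users with access on the right. Each helper returns the new lists.
function addUsers(without, withAccess, selected) {
  return {
    without: without.filter(u => !selected.includes(u)),
    withAccess: withAccess.concat(selected.filter(u => without.includes(u)))
  };
}

function removeUsers(without, withAccess, selected) {
  return {
    without: without.concat(selected.filter(u => withAccess.includes(u))),
    withAccess: withAccess.filter(u => !selected.includes(u))
  };
}

// Granting the hypothetical user "alice" access to the custom report.
const state = addUsers(["alice", "bob"], ["carol"], ["alice"]);
```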
5.10 COM Components
The COM components of any application provide the business logic and the middle tier between the user interface and the database. The KnowledgeWorker® application is no different, and the source of the standard COM components will not be supplied to external developers.
Developers are free to make use of the standard COM components of the application and to build their own applications / pages utilising them but should consider:
- COM Components should provide the business logic.
- COM Components should be tested each time they are modified and rebuilt, or the application COM components are updated.
- Changing a COM component requires that the COM package is shutdown to ensure that the old components are removed from memory.
- The supplied COM components provide methods to other COM, VB or ASP on the Web Server.
- The supplied COM components assume that data validation is undertaken by the calling application, such as the ASP form or handler; therefore this must be done by any component that makes a call on the supplied COM.
- The supplied COM components rely upon the session information to identify the user and the repository therefore any third party using the supplied COM method must ensure that this session information is available.
- The vast majority of the supplied COM component methods utilise transactions.
- Third party applications and client components should NOT directly update the repository database tables to avoid integrity errors.
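Since the supplied COM components assume the caller has already validated the data, any custom code that calls them should follow the same validate-then-call pattern as the standard ASP forms and handlers. A minimal sketch of that pattern follows (plain JavaScript; the component, its update method, and the field rules are all hypothetical stand-ins, not the real COM interface).

```javascript
// The supplied COM components expect pre-validated input, so the caller
// validates first and only then invokes the component (mocked here).
function validateItem(item) {
  const errors = [];
  if (!item.name || item.name.trim() === "") errors.push("name is required");
  if (typeof item.nodeId !== "number") errors.push("nodeId must be numeric");
  return errors;
}

// Stand-in for a call on a supplied COM method; the real call would also
// rely on session information identifying the user and the repository.
function callComponent(item, component) {
  const errors = validateItem(item);
  if (errors.length > 0) {
    return { ok: false, errors };           // never pass bad data downstream
  }
  return { ok: true, result: component.update(item) };
}

const mockComponent = { update: item => "updated " + item.name };
```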
5.11 Debug Hints
As a developer, there are times when it is helpful to be able to debug code to establish the reason for an error. The most basic techniques include:
- When a page displays incorrectly in the user interface, click in the main page and then right-click to bring up a menu; from the menu select View Source. This method sometimes provides more detail than can be seen in the page and allows the developer to establish the values held in the hidden fields.
- When a page does not show recent changes, it may be because the browser or a caching device is delivering a previous copy of the page. In the main page, right-click to bring up a menu and select Refresh. When this does not help, consider changing to another page before clearing the browser's temporary cache, and ideally change the browser setting “check for newer versions of the stored pages” to “Every visit to the page”. If this still does not help, consider restarting IIS, as ASP caching may be enabled.
- When there is an error in the page, consider adding debug statements that will help identify the cause:
- In ASP, consider adding Response.Write statements to output values directly into the viewed page and Response.End to terminate page execution.
- In HTML, consider the addition of HTML tags and output the values of relevant fields to help diagnose the problem.
- In JavaScript, consider adding alert messages to output the relevant values or identify the line in which a JavaScript error is raised.
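The JavaScript technique above can be taken slightly further with a small helper that gathers several values into one message, which keeps a temporary alert (or a server-side Response.Write) readable. A sketch in plain JavaScript, with hypothetical variable names:

```javascript
// Collects name/value pairs into one readable message, so a single
// temporary alert (or Response.Write) shows every value at once.
function debugDump(label, values) {
  const lines = Object.keys(values).map(k => k + " = " + values[k]);
  return label + ": " + lines.join(", ");
}

// In a page under test one might call, for example:
//   alert(debugDump("submit handler", { nodeId: nodeId, tag: tag }));
const message = debugDump("submit handler", { nodeId: 42, tag: "ReviewReply" });
```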