This article discusses the various options and considerations for integrating with Workday.
The Workday application offers a number of options for integrating data with other systems. Which option to choose depends on your functional and nonfunctional requirements. All data in and out of Workday goes through web services: Workday's packaged integrations, as well as the integration tools available in Workday, ultimately leverage the web service API through the Object Transaction Server (OTS). However, the interaction with those web services can be performed in a variety of ways.
Let’s look at the options for integrating with Workday.
Workday Web Services - SOAP API
Depending on the integration scenario, vendor/client applications can call the Workday APIs to push or pull data using Workday's SOAP API. The Workday SOAP API is well documented in the Workday Community and supports a large number of transactions. To utilize the Workday Web Services SOAP API from AtomSphere, use the Dell Boomi Workday Connector.
The Workday Connector allows a developer to connect to a target tenant by:
- Using a single set of user credentials across all Workday connections. Note that you should configure your Integration Security User (ISU) in Workday to have all permissions needed across your Workday integrations.
- Including the version in the connection, which allows it to be extended and set through environment variables.
- Configuring the Workday service in the operation component. This allows a single Workday connection to be used against different service endpoints for the same tenant.
Retrieving from Workday
Below are some techniques and considerations when retrieving data using the Workday Connector.
Use the Transaction Log if available; calls process more efficiently when it is utilized. For example, when querying for modifications, instead of querying all data between two points in time and comparing for changes, Workday can use the transaction log for the same period and compare only those records that had transactions in the specified range. The former queries ALL records and compares them; the latter selects only the records that had transactions and then compares. Without the transaction log, the call runs against all data records and takes longer.
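To make the difference concrete, here is a rough Python sketch of the two request shapes. The element names are approximations modeled on the Workday Web Services schema (e.g. Get_Workers); consult the WWS WSDL in the Workday Community for the exact structure.

```python
def full_extract_request(as_of_entry):
    """No narrowing criteria: Workday returns ALL records, and the caller
    must compare every record against its last copy to find changes."""
    return {
        "Response_Filter": {"As_Of_Entry_DateTime": as_of_entry},
        "Request_Criteria": {},  # empty criteria = full extract
    }

def transaction_log_request(updated_from, updated_through):
    """Narrow the selection to records with transactions in the window,
    so only those records need to be compared downstream."""
    return {
        "Request_Criteria": {
            "Transaction_Log_Criteria_Data": {
                "Transaction_Date_Range_Data": {
                    "Updated_From": updated_from,
                    "Updated_Through": updated_through,
                }
            }
        }
    }

req = transaction_log_request("2024-01-01T00:00:00", "2024-01-02T00:00:00")
```

In practice these dictionaries correspond to XML request profiles in the Workday Connector operation; the point is that only the second shape lets Workday do the narrowing for you.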
Effective date vs Entry date
Understand the difference when selecting records.
For example: getting all users since the last run date (entry date) without considering effective date could miss records that were keyed in earlier but do not become effective until the current date range, i.e. future-dated employees not yet effective.
Alternatively, you may want all records regardless of when they become effective. In this case, the as-of date in the request can be the current date and the as-of-effective date can be set to the distant future.
You may want to store both a Last Entry Date and a Last Effective Date parameter as persisted process properties.
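As a sketch of this watermarking approach (in plain Python, with hypothetical property names; in AtomSphere these would be persisted process properties rather than a dict):

```python
from datetime import datetime

# Hypothetical persisted process properties, seeded from the last run.
props = {
    "last_entry_date": "2024-01-01T00:00:00",
    "last_effective_date": "2024-01-01",
}

def build_request_window(props, now):
    """Build the date criteria for the next poll.

    Entry-date window: everything keyed since the last run.
    As-of-effective date: far future, so future-dated records
    (e.g. employees not yet effective) are not missed.
    """
    window = {
        "updated_from": props["last_entry_date"],
        "updated_through": now.isoformat(timespec="seconds"),
        "as_of_effective_date": "9999-12-31",  # distant future
    }
    # After a successful run, advance both watermarks.
    new_props = {
        "last_entry_date": window["updated_through"],
        "last_effective_date": now.date().isoformat(),
    }
    return window, new_props

window, new_props = build_request_window(props, datetime(2024, 2, 1, 12, 0, 0))
```

Only advance the persisted properties after the run completes successfully, so a failed run re-queries the same window.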
SOAP API paging
The client application (i.e. the AtomSphere process) is responsible for handling paging. In other words, you must build this into your integration. Always build to scale.
Adjust the paging criteria in your request from 1-999 records per page and tune it to your use case. Workday defaults to 200 per page.
Subsequent calls needed to get additional pages, see Design Pattern: Handling API Pagination.
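The paging loop can be sketched as follows. This is a minimal illustration with an injected fetch function standing in for the Workday call; the real response carries the page and total-page counts in its Response_Results element.

```python
def fetch_all(fetch_page, page_size=200):
    """Page through a Workday-style response until all pages are read.

    fetch_page(page, count) is assumed to return a dict shaped like
    Response_Results: {"total_pages": N, "data": [...]}.
    """
    results = []
    page = 1
    while True:
        resp = fetch_page(page, page_size)
        results.extend(resp["data"])
        if page >= resp["total_pages"]:
            break
        page += 1  # subsequent call for the next page
    return results

# Fake endpoint with 5 records and 2-record pages, for illustration.
RECORDS = ["w1", "w2", "w3", "w4", "w5"]

def fake_fetch(page, count):
    start = (page - 1) * count
    total_pages = -(-len(RECORDS) // count)  # ceiling division
    return {"total_pages": total_pages, "data": RECORDS[start:start + count]}

all_records = fetch_all(fake_fetch, page_size=2)
```

In AtomSphere this loop is typically modeled with a process loop and a dynamic page-number property rather than code, but the control flow is the same.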
Calculated fields and custom fields can be retrieved as well, but may require additional setup in Workday to expose them through one of the delivered web services.
Visit the Workday Community for how to add additional fields to a service call through the use of Field Overrides.
Most of the considerations above refer to polling the Workday application for data, but this API can also be used for individual queries and lookups. The Workday delivered APIs are a reliable and efficient way to retrieve data ad hoc in integration processes, as there is little overhead compared to the other mechanisms for getting data out of Workday.
Sending to Workday
Below are some techniques and considerations when sending data (create or update records) using the Workday Connector.
Your reference IDs will need to match. Reference IDs are used to refer to other object instances. Although these are static strings, they may vary across tenants during an implementation. Consider setting up a mapping in Boomi or, if this should be controlled in Workday, set up a report in Workday that can be called within the integration.
If the API you are using allows for batched imports, this should be taken advantage of. However, Workday will fail the entire batch if one record in the batch fails. Be sure to handle for this in your process. For example, you can attempt to send records as a batched request and then if an error is encountered, break up the batch and send each record individually.
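The batch-then-retry-individually pattern can be sketched like this, with hypothetical send functions standing in for the connector calls:

```python
def send_with_fallback(records, send_batch, send_one):
    """Try the whole batch; if Workday rejects it, retry record by record.

    send_batch raises on failure (Workday fails the entire batch if any
    one record fails). Returns (succeeded, failed) lists.
    """
    try:
        send_batch(records)
        return list(records), []
    except Exception:
        succeeded, failed = [], []
        for r in records:
            try:
                send_one(r)
                succeeded.append(r)
            except Exception:
                failed.append(r)  # route to error handling / retry queue
        return succeeded, failed

# Illustration: the record "bad" poisons the whole batch.
def fake_batch(recs):
    if "bad" in recs:
        raise ValueError("batch failed")

def fake_one(rec):
    if rec == "bad":
        raise ValueError("record failed")

ok, bad = send_with_fallback(["a", "bad", "c"], fake_batch, fake_one)
```

The happy path costs one call; the fallback isolates exactly which records failed so only they need remediation.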
The Workday SOAP API also has an Integration Service endpoint to execute a Workday Core Connector or a custom Studio or EIB integration. This API call can be paired with a file-based or outbound-call approach to exchange data. The request is answered with a 200 OK, and the integration system spins up to execute its logic. See Recommendations below.
Custom REST-ful APIs
The Workday APIs for Reports as a Service, Message Queues, and Studio listeners can be invoked as well. These APIs are not available out-of-the-box in Workday and require some Workday custom development to be exposed.
Reports as a Service (RaaS)
Custom reports can be set up in Workday and then be web service enabled. This allows AtomSphere to make a REST-ful call to retrieve the report in XML or JSON format. Custom reports can often be set up by business users and then exposed through a toggle within Workday. Reports are an easy way to retrieve custom objects and fields, leverage calculated fields for summary data, and join multiple objects into a single data source.
To call a RaaS service within AtomSphere you will use the HTTP Client Connector, not the Workday Connector. The HTTP connection is configured with Basic Authentication and the service URL of the report.
Reports can also be setup with filters that can be passed in the GET request. This allows the same Workday reports to be parameterized to retrieve for example changes between two dates or a specific record. Filter options will vary based on the data source for the report.
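A sketch of building such a parameterized RaaS request outside Boomi, using only the Python standard library. The host, tenant, report owner, report name, and filter parameter names below are all hypothetical; the real URL is the web-service-enabled URL shown for the report in Workday.

```python
import base64
import urllib.parse
import urllib.request

# Hypothetical values -- take the real URL from the report's
# "web service enabled" URL in Workday.
BASE = "https://wd2-impl-services1.workday.com/ccx/service/customreport2"
TENANT = "acme_tenant"
OWNER = "isu_raas"
REPORT = "INT_Worker_Changes"

def build_raas_request(user, password, filters):
    """Build a GET request for the report with Basic Authentication."""
    params = dict(filters)
    params["format"] = "json"  # the report can also return XML
    url = "{}/{}/{}/{}?{}".format(
        BASE, TENANT, OWNER, REPORT, urllib.parse.urlencode(params))
    req = urllib.request.Request(url)
    token = base64.b64encode("{}:{}".format(user, password).encode()).decode()
    req.add_header("Authorization", "Basic " + token)
    return req

req = build_raas_request(
    "ISU_raas@acme_tenant", "secret",
    {"Effective_From": "2024-01-01", "Effective_Through": "2024-01-31"},
)
# urllib.request.urlopen(req) would execute the GET (not run here).
```

In AtomSphere the same shape is configured declaratively: the HTTP Client connection holds the base URL and credentials, and the filter values are passed as replacement variables in the resource path or query.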
Be mindful of the time it takes Workday to run the report. Long-running reports may need to be optimized to prevent HTTP timeouts. Consider using indexed data sources, appropriate data types, sorting, and reducing the number of calculated fields. Visit the Workday Community for more on optimizing custom reports. Workday reports are not as quick as the Workday SOAP API; there is some overhead for initializing and running the report. One helpful trick is to shorten the XML element names in the generated output, which can greatly reduce the size of large XML files.
Message Queues
Workday has message queuing available within the Workday tenant. Message queues reside within the tenant, and the API must be used to place messages on the queue. This means you will need a REST-ful client to place events on the queue. Typically this is accomplished by using Workday Studio to build the mechanism that places event transactions generated within Workday on the queue (for outbound consumption), or by using AtomSphere to place inbound data on a queue where it can be asynchronously processed later by an internal integration.
Below are some considerations for using message queuing.
There are limits on the number of queues available in the tenant (10), messages expire after a week, and there is a maximum number of messages per queue (64K).
Queues can be accessed through REST-ful API calls. Workday only needs to place events on the queue; an AtomSphere process can then retrieve and process events off the queue.
Decouples Workday from the delivery and processing of messages to the various destinations. Workday only needs to compile and expose the events.
For outbound integrations where Workday Studio places events on the queue, consider:
How often will it run?
How quickly does it run?
Leave validation logic in the client applications or downstream integration layer.
Separate queues per event or consolidate and route in subscriber? Consider designing a single AtomSphere process to receive all messages and route based on queue or payload metadata.
Note that if Workday is unavailable, no messages will be added to the queue.
If the Dell Boomi atom is unavailable, the process will resume picking up messages from the queue when it becomes available again.
Once the process has successfully executed records into destination applications, a subsequent call is made to Workday to delete the message from the queue.
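The poll-process-delete loop can be sketched as follows, with injected functions standing in for the queue API calls (an in-memory dict plays the queue here for illustration):

```python
def drain_queue(get_messages, process, delete_message):
    """Poll-process-delete loop for a Workday-style message queue.

    The delete call is only made after the record has been successfully
    processed into the destination, so a failure between the two steps
    leaves the message on the queue to be retried on the next poll.
    """
    delivered = []
    for msg in get_messages():
        try:
            process(msg)
        except Exception:
            continue  # leave the message on the queue for the next poll
        delete_message(msg["id"])
        delivered.append(msg["id"])
    return delivered

# Illustration with an in-memory queue.
queue = {1: "event-a", 2: "event-b"}

def fake_get():
    return [{"id": k, "body": v} for k, v in queue.items()]

def fake_process(msg):
    if msg["body"] == "event-b":
        raise RuntimeError("destination unavailable")

done = drain_queue(fake_get, fake_process, lambda mid: queue.pop(mid))
```

Deleting only after successful delivery is what gives this pattern its at-least-once guarantee; the downstream processing should therefore tolerate occasional duplicates.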
Outbound Subscriptions
Using Subscriptions in Workday, you can configure outbound messages for events that occur within the Workday tenant to achieve real-time processing. On the AtomSphere side, one or more web service listener processes will be developed to receive the outbound messages from Workday.
IMPORTANT: A disadvantage to this approach, however, is that the subscription is automatically disabled by Workday if the messages cannot be delivered to the endpoint after several attempts. When evaluating this approach, carefully consider the reliability of the connection, the availability of the destination, and assured-delivery design implications.
Below are some considerations for using outbound subscriptions.
The standard outbound message payload only contains the event notification details, not the full business record. Within the listening process, a subsequent query back to Workday is usually required to retrieve the full record.
Multiple endpoints can be configured for a subscription.
There is no error handling if the event cannot be delivered; this will have to be managed in the integration.
Use with environment restrictions to configure test vs. prod endpoints.
File Based Integration
Within the HCM and Financials domains, file based integration is often the only option for vendors and legacy applications that do not expose web services. In a typical outbound scenario from Workday, data is extracted from Workday using one of the aforementioned methods, transformed to an agreed-upon format (often a delimited/CSV or fixed-length text file), and then delivered to an external location mutually accessible to both Workday and the target application. Because all data in and out of Workday goes through web services, files are ultimately produced from, or broken down into, the Workday APIs discussed above.
Using this approach requires the setup of an external shared location, such as an FTP or SFTP site, that is accessible to both the source and target applications. This pattern requires coordination with the target application about when to retrieve the file. The intermediary state created, although it exists only for a short duration, can become outdated once written if both applications are updated directly through a different mechanism during the coordination window.
Also, although having an intermediate file is convenient for integration debugging, it can also be a liability. Files get misplaced, sent to the wrong place, left on local machines, and sent over insecure email during troubleshooting.
There is also the limitation of no visibility into that "last mile" into the endpoint: were the records within the file actually loaded successfully into the destination? This lack of end-to-end visibility can be challenging to manage.
Recommendations
- Reduce point-to-point interfaces by utilizing AtomSphere as your central hub for integration processing. With this you can achieve an end-to-end view of integration status while maintaining all business logic and error handling in one place.
- If your source and target endpoints have APIs available, utilize these direct connections over file based approaches.
- When sending data to Workday:
- Use the Workday Connector and leverage the Workday SOAP API.
- AtomSphere can connect directly to external applications, either on-premise or in the cloud, and process directly into Workday.
- Depending on the external application, look to write back a record status to the source application (e.g. synced = true/false).
- Although Workday may throttle API requests under high load, there are no set governance limits for using the Workday APIs, and they are robust enough to handle high-volume processing.
- When retrieving data from Workday:
- Options for real-time integration:
- Outbound Subscriptions to an AtomSphere web service listener process. However, the fragility of the connection should be carefully weighed.
- If the desired changes are associated to Workday business processes, build a Studio integration to fire and deliver the event via an HTTP callout once the event reaches a completed state in the business process.
- Options for near-real-time integration:
- Schedule an AtomSphere process to run as frequently as desired (down to every minute) that polls the Workday API transaction log for changes.
- Develop a Workday Studio integration (set up as described above) that places events on a Workday message queue, then schedule an AtomSphere process to poll the message queue every minute and process to downstream applications.
- Options for full record synchronization:
- If you need data from multiple objects in Workday, create a custom report in Workday, expose it as a RaaS service, then schedule an AtomSphere process using the HTTP Client connector to periodically extract the data in a single call.
- Otherwise, retrieve data with an AtomSphere process using the Workday connector to extract records for a single object. Additional data can be retrieved in subsequent Workday SOAP API or RaaS calls if needed.
- Options for detecting and syncing changes only:
- If a Transaction Log is available, use the Workday connector to call the transaction log to identify changes.
- If the changes can be filtered using delivered fields for the target object, consider setting up a RaaS report that uses time stamps in the filter, then have the AtomSphere process call the RaaS and transform and deliver the data.
- If there is a Workday Core Connector integration available for the object in Workday, take advantage of the packaged change detection. Have the AtomSphere process call the Workday Integrations API and fire the integration, then either have the output delivered to an AtomSphere listener endpoint or poll the integration API to retrieve the output documents for that event.
- If the above options are not available, use an AtomSphere process with two queries: first query records by the saved As Of Date, then query records by the current As Of Date, and compare and filter. Each run passes in the updated As Of Date.
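The two-query compare in that last option can be sketched as a simple keyed diff. This assumes both extracts carry a stable key (here a hypothetical "id" field) and that any field difference counts as a change:

```python
def detect_changes(previous, current, key="id"):
    """Compare two full extracts and return new or changed records.

    previous: records as of the saved As Of Date
    current:  records as of the current As Of Date
    """
    prev_by_key = {r[key]: r for r in previous}
    changed = []
    for rec in current:
        old = prev_by_key.get(rec[key])
        if old != rec:          # new record, or any field differs
            changed.append(rec)
    return changed

# Illustration: one record changed, one added.
prev = [{"id": 1, "name": "Ana"}, {"id": 2, "name": "Bo"}]
curr = [{"id": 1, "name": "Ana"},
        {"id": 2, "name": "Beau"},
        {"id": 3, "name": "Cy"}]
delta = detect_changes(prev, curr)
```

Note this variant does not detect deletions; if those matter, also report keys present in the previous extract but missing from the current one.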