We are looking to define a standard workflow around process deployments and managing environment configuration across multiple molecules. Many of our local molecules are nearly identical. If it weren't for a small handful of location-specific connections, we would be able to manage the molecules within the same Environment. Instead, when we deploy processes, we must then go and manually update Environment Extensions for multiple Environments.
Thanks to the AtomSphere API, we have built a reliable and automated process for deploying Processes (eagerly anticipating the ability to do so with API and Certificate components). This will ensure that a process is properly attached and deployed to all intended Environments.
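For anyone curious what that kind of deployment automation can look like, here is a minimal sketch using the AtomSphere API's DeployedPackage object. The payload field names, account ID, and auth header here are illustrative assumptions to check against the API reference, not a definitive implementation:

```python
import json
import urllib.request

# AtomSphere API base URL
API_BASE = "https://api.boomi.com/api/rest/v1"


def build_deploy_payload(environment_id, package_id, notes=""):
    """Build a DeployedPackage request body that attaches a packaged
    component to a single Environment (field names are assumptions)."""
    return {
        "environmentId": environment_id,
        "packageId": package_id,
        "notes": notes,
    }


def deploy_to_environments(account_id, auth_header, package_id, environment_ids):
    """POST one DeployedPackage per target Environment so the process
    ends up attached and deployed everywhere it is needed."""
    for env_id in environment_ids:
        body = json.dumps(build_deploy_payload(env_id, package_id)).encode()
        req = urllib.request.Request(
            f"{API_BASE}/{account_id}/DeployedPackage",
            data=body,
            headers={
                "Authorization": auth_header,
                "Content-Type": "application/json",
                "Accept": "application/json",
            },
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            print(env_id, resp.status)
```

Looping over a list of Environment IDs is what gives the consistency guarantee: one staged package, identical deployment everywhere.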
The next step would be to do something similar with managing Environment Extensions. We have a team of developers building new integrations and staging them for deployment. After the deployment is complete, they then need to go into Atom Management and set their own extensions. Or, for production, a separate deployer would do the same on behalf of potentially many developers. This can be very time-consuming and error-prone.
Instead, if developers could "stage" Environment Extension updates like they do with Processes, then we could integrate these together. This would provide a number of benefits:
- Ensure updates are made (and are consistent) across all necessary Environments.
- Free deployers from time-consuming tasks which could be staged by developers.
- Open the door to a potentially fully automated deploy/configure pipeline.
- Identify property values by Environment more quickly. For example, if we're told "https://some-partner.com" is being upgraded, which processes will be impacted? That question is hard to answer today without using the AtomSphere API.
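That last point can be sketched with the EnvironmentExtensions object: GET the extensions for each Environment, then search the returned JSON for the value in question. The response shape varies by extension type, so the search below just walks the whole document; `fetch_extensions` is a sketch and the sample structure in the usage note is an assumption:

```python
import json
import urllib.request

# AtomSphere API base URL
API_BASE = "https://api.boomi.com/api/rest/v1"


def fetch_extensions(account_id, auth_header, environment_id):
    """GET the EnvironmentExtensions object for one Environment."""
    req = urllib.request.Request(
        f"{API_BASE}/{account_id}/EnvironmentExtensions/{environment_id}",
        headers={"Authorization": auth_header, "Accept": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def find_value(extensions, needle):
    """Recursively walk the extensions JSON and return the paths of every
    string value containing the search term (e.g. a partner URL)."""
    hits = []

    def walk(node, path):
        if isinstance(node, dict):
            for key, child in node.items():
                walk(child, path + [key])
        elif isinstance(node, list):
            for i, child in enumerate(node):
                walk(child, path + [str(i)])
        elif isinstance(node, str) and needle in node:
            hits.append("/".join(path))

    walk(extensions, [])
    return hits
```

Running `find_value` over each Environment's extensions gives a cross-Environment report of where a given endpoint is configured, which is exactly the impact analysis described above. For instance, with a hypothetical response like `{"connections": {"connection": [{"name": "Partner SFTP", "field": [{"id": "url", "value": "https://some-partner.com/api"}]}]}}`, searching for "some-partner.com" returns the path to that connection field.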
Does anyone have experience with using AtomSphere API for this (i.e. EnvironmentExtensions Object)?