Terraform Enterprise vs Maestro Terraform as a Service - Part II.

In the first part of this article, we talked about Workspaces, Private Module Registry, and VCS Connections in Terraform Enterprise and Maestro Terraform. 

In this part we proceed to other important features: Variables, Runs and State, Policy as Code, and API Endpoints.
  • Variables
TFE: Granular variables allow easy reuse of code to scale resources, regions, etc. All variables are securely stored and retrieved as needed during the provisioning process.
M3: Maestro fully supports working with variables. A Terraform template created for one-time use is of little value; if it is stored in a GitHub repository, for example, it can be reused by other users. To reuse an existing template, you need to adjust its configuration to your environment.
When creating a template, it is therefore a best practice to move such configuration into variables. With variables we can specify the name of our instance, the resource group, the region, and so on. This is where the variables functionality shows its strength.
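As a minimal illustration, variable declarations in a plain Terraform template might look like this (the names and defaults below are examples only, not taken from any specific Maestro template):

  # variables.tf - example declarations (illustrative names and defaults)
  variable "instance_name" {
    type        = string
    description = "Name assigned to the provisioned instance"
  }

  variable "region" {
    type        = string
    description = "Region to deploy into"
    default     = "eu-central-1"
  }

  variable "resource_group" {
    type        = string
    description = "Resource group the instance belongs to"
  }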

Both Terraform Enterprise and Maestro allow you to manage variables. However, unlike Terraform Enterprise, Maestro also automatically generates a UI form - a wizard - based on your Terraform variables, letting you specify their values right before execution. Thus, when you install a Terraform template, you do not need to specify these variables in advance (as is required in Terraform Enterprise).

In the Maestro UI, when you select the Apply Terraform option, a wizard is built dynamically from the variables collected by the system, with controls of any type: tables, drop-down lists, text fields. This is a strength of Maestro: the application knows how to build these wizards dynamically. You can define the variables manually, or provide a JSON file with their values when installing a Terraform template.
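For comparison, in plain Terraform these values are usually supplied by hand in a terraform.tfvars file (or its JSON equivalent, terraform.tfvars.json); the values below are purely illustrative, and Maestro's own JSON format for installing templates is a separate mechanism not shown here:

  # terraform.tfvars - the values a user would otherwise have to fill in manually
  instance_name  = "demo-instance"
  region         = "eu-central-1"
  resource_group = "demo-group"

The Maestro wizard collects exactly this kind of data through the UI instead.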

There is one peculiarity in Terraform Enterprise: after installing a Terraform template, the user cannot run it until the credentials on behalf of which the template will be executed are specified. Terraform offers several ways to provide these values: they can be set directly in the template or passed through environment variables.
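As a sketch of those two plain-Terraform approaches for AWS (the variable names are examples, and the two provider blocks are alternatives, not meant to be combined in one configuration):

  # Option 1: credentials passed into the template (the user has to fill them in)
  provider "aws" {
    region     = var.region
    access_key = var.aws_access_key   # example variable names
    secret_key = var.aws_secret_key
  }

  # Option 2: credentials taken from the environment, so the provider block stays clean
  #   export AWS_ACCESS_KEY_ID=...
  #   export AWS_SECRET_ACCESS_KEY=...
  provider "aws" {
    region = var.region
  }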

Maestro takes a different approach. For users, the system offers a transparent scheme in which the question of credentials never comes up at all. Once Terraform is activated on a tenant, you can simply upload a Terraform template, and the credentials are substituted into it automatically. The user never sees these credentials and never has to think about how to add them. In this way Maestro simplifies working with Terraform templates.

This also lets Maestro control the level of access with which these Terraform templates are applied. To launch Terraform we use our own set of credentials, which we manage separately. On top of that we apply our policies here: we check whether a specific user is allowed to launch the specific resources, in the particular region, defined in the template. Terraform Enterprise, on the other hand, does not know the end user, so it has to solve this problem dynamically at launch time (an auto user, etc.), whereas the run really needs to happen on behalf of the user who logged in to Terraform Enterprise.

On the one hand, this allows enterprise security and IT departments to control and limit the operation of Terraform as a Service from a security perspective. On the other hand, Maestro is a tool for building marketplaces and small services, and for us the key is to make this procedure as simple as possible.

Imagine a situation where, while working in a marketplace, Maestro users are asked to provide their SSH key or enter some 45-character system parameter: no surprise that many users would simply be unable to do this.

This is the ideological difference in how the two products handle variables: Maestro does as much as possible so that a non-DevOps user without special skills can use Terraform, while Terraform Enterprise is an ideal workplace for a highly skilled DevOps specialist.
  • Runs and State
TFE: Offers two-phase provisioning automation: plan (dry run) and apply (execution). The output is stored in a state file. Supports remote runs (executed via GUI, CLI, or API) and state storage.
M3: Maestro provides the same two-phase provisioning automation. Before applying any template, we recommend doing a dry run, that is, running the Terraform template in planning mode. This shows which infrastructure would be modified, deleted, or launched if we then applied the same template with the same settings, using the Apply function this time.
Maestro supports both phases and, like Terraform Enterprise, provides access through the user interface and the API. Terraform Enterprise also has a CLI; the Maestro team is currently developing a unified CLI that works similarly.

Our team was recently asked whether Maestro saves state files, and the answer is yes. When you select the Apply Terraform option, the infrastructure described in the Terraform template is deployed, and the data about it - all IDs and resource references - is stored in the state file. There are several ways to store these state files:
  • if only one DevOps specialist is running Terraform locally and then checking the results, the state file can simply be stored locally;
  • if several developers are working with the same Terraform template, the state file becomes the single source of information about the current state of the infrastructure and the state it should be brought to. In that case, best practice is to store the state file in a database or another location that allows team collaboration.
Terraform supports several backends for storing this data out of the box, one of which is Amazon S3. The backend is specified in the configuration of the Terraform template.
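For reference, a standard S3 backend declaration in a Terraform template looks roughly like this (the bucket, key, and table names are placeholders):

  terraform {
    backend "s3" {
      bucket         = "my-terraform-state"                  # placeholder bucket name
      key            = "environments/dev/terraform.tfstate"
      region         = "eu-central-1"
      dynamodb_table = "terraform-locks"                     # optional state locking for teams
      encrypt        = true
    }
  }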

We do not rule out this option: Maestro users are free to use whichever method they prefer, for example DynamoDB together with Amazon S3. In addition, we also store these state files in Maestro storage, and based on them we run analytics, track the created resources, and power the ownership mechanism, notifications, and other functionality provided by Maestro.

In Maestro the state files are represented as stacks, or rather, stacks collect information from these state files. In a stack you can see what infrastructure was deployed. Beyond what is written in the state file, stacks contain additional information: we extend it with audit data, ownership, and so on. Stack information is stored in the database, while the state files themselves are stored in secured private file storage.
  • Policy as Code
TFE: Sentinel, a Policy as Code framework, is intended to embed automated policy controls into workflows. It checks every provisioning run to enforce security, compliance, and operational best practices.
M3: Indeed, Sentinel is one of the main advantages of Terraform Enterprise. It is a proprietary language and tool that lets you create policies with simple code. You can go to the Terraform Enterprise UI, write policies against the state data produced at planning time, and make decisions based on it. But Sentinel has a problem: it is a closed, proprietary language and tool, so we cannot use it as-is for our purposes in Maestro.

Therefore, we wrote an analog of Sentinel for our application: the Policy Engine. Currently, the Policy Engine contains several policies; for example, it can check the regions in which resources may be deployed. Suppose we have an AWS tenant with only the eu-central-1 region activated. Amazon has many regions, but only one is activated in Maestro, so it is logical to prevent all users from launching instances and other infrastructure in any region except eu-central-1.

Quota policies are also being implemented. For example, if a quota is exceeded, Maestro will, based on the policies, send a notification asking whether this infrastructure can be started by this user. The notification goes to the manager, who can then approve or reject the action.

Thus, Policy as Code functionality is already present in Maestro, though only a few policies are currently supported. Additional policies can be implemented based on users' requests.
  • API Endpoints
TFE: Exposes API endpoints for remote calls, allowing external tools and clients to interact with Terraform remotely.
M3: Maestro has a unified Java SDK, so it is possible to integrate Maestro with other systems and perform all the actions available in the UI programmatically: activating Terraform in a tenant, installing a Terraform template, applying it, scheduling it, and deleting or destroying it.

Terraform Enterprise became widespread because it gave the developer/DevOps community the ability to review methods and the way they function inside templates. The template language itself is a DSL (domain-specific language), so you can program the Terraform logic in the language of the Terraform provider.

Maestro allows you to do everything that can be done through the API of a particular provider. However, the Amazon and Azure APIs differ significantly, and to work with a hybrid cloud in a unified way we need to find common abstractions. Terraform Enterprise does not support this kind of infrastructure.

In this way, by unifying shapes, regions, and image names, and through its own Terraform provider, Maestro has created a consistent unified language. And precisely because we have a unified provider, you can write a single template and, via the unified API, apply it to Amazon, Azure, Google, or any private cloud without provider-specific changes.
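To give a feel for the idea, here is a purely conceptual sketch of such a cloud-agnostic template; the provider name, resource type, and attributes below are invented for illustration and do not reflect the real Maestro provider schema:

  # Hypothetical example: all identifiers are illustrative, not the actual Maestro provider.
  provider "maestro" {
    tenant = "demo-tenant"          # hypothetical tenant identifier
  }

  resource "maestro_instance" "web" {
    name   = var.instance_name
    shape  = "SMALL"                # unified shape instead of a cloud-specific instance type
    image  = "ubuntu-20.04"         # unified image name
    region = "eu-central-1"         # unified region identifier
    # The same template would then be applied to AWS, Azure, Google, or a private
    # cloud, with Maestro mapping these unified values to each provider's API.
  }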

This capability is very important for marketplaces. In short, Maestro offers a single tool for building a marketplace across hybrid clouds.

Continue reading this article here: Part I, Part III.
;)
