Tuesday, December 19, 2017

Azure ARM Infrastructure as code deployment using VSTS - Part 2

In Part 1 of this blog series, we created the build configuration for the CI/CD pipeline of an ARM template deployment. In this blog, we will create the release pipeline and deploy the VMs across the dev and test environments in Azure whenever a commit triggers a build.

Create the release pipeline by clicking on the release option once the build is completed.

In the Release definition, select a blank template to start with.

Provide a name for your first environment.

Now that we have our first environment ready, let's add the tasks. Select the created environment.

Start adding tasks from Tasks -> Deploy -> Azure Resource Group Deployment.

Edit the settings of the added task. You need to add your Azure subscription to VSTS as a service endpoint.

Click on Manage, and the service endpoint configuration will open in a different tab. Add your Azure subscription here: click on New Service Endpoint -> Azure Resource Manager.
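
The dialog asks for a connection name and your subscription details. If you use the full version of the dialog, for example to authenticate with an existing service principal, the fields are roughly along these lines (all values below are placeholders, not real credentials):

    Connection Name             : <a friendly name for the endpoint>
    Subscription ID             : <subscription GUID>
    Subscription Name           : <subscription name>
    Service Principal Client ID : <application ID of the service principal>
    Service Principal Key       : <key/secret of the service principal>
    Tenant ID                   : <Azure AD tenant GUID>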

Add the subscription name; when you click OK, you will be asked to log in to your Azure subscription to authenticate.

Once that is done, your subscription will be listed in the release task drop-down list and you can select it.

Select the action as "Create or update resource group". Select the target resource group and the location, which is essentially the target Azure region, from the drop-down list. Select the template location as "Linked artifact".
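
To illustrate, the first half of the task settings for a dev environment could look something like this (the resource group name and region below are just examples):

    Azure subscription : <your subscription service endpoint>
    Action             : Create or update resource group
    Resource group     : arm-demo-dev-rg
    Location           : West Europe
    Template location  : Linked artifact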

Browse and select the ARM template JSON file from the linked artifact.

Depending on the target environment, select the corresponding parameters JSON file as well. Select the deployment mode as "Incremental"; in incremental mode, resources defined in the template are created or updated, while existing resources in the resource group that are not in the template are left untouched.
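
To give an idea of how the per-environment parameter files work, here is a minimal sketch of a dev parameters file; the parameter names are hypothetical and should match whatever your template actually defines. The test environment would get its own file, for example azuredeploy.parameters.test.json, with different values:

    {
      "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
      "contentVersion": "1.0.0.0",
      "parameters": {
        "vmName": { "value": "devvm01" },
        "vmSize": { "value": "Standard_A1" },
        "adminUsername": { "value": "devadmin" }
      }
    }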

Save the settings once done.

Now you can add the next environment to the pipeline. Click on the plus sign below the existing environment.

Provide a differentiating name for this environment, for example "Test".

Add the deployment tasks the same way as for the first environment. You can change the subscription, resource group, location, etc. to differentiate it from the dev environment. The parameters file should also be selected as per the target environment.

Save the changes, and with that the release configuration is complete. This is how the continuous deployment pipeline will look:

If you click on the pre-deployment conditions for the environments, you will see that they are configured to be auto-triggered. For example, the test environment deployment will be triggered once the dev environment deployment is completed. You can also choose to change this as per your release process requirements.

Every time code is committed to the source repository, the continuous integration trigger in the build configuration will start a build. The artifacts of this build include the template JSON file and the target environment parameter files. The continuous deployment process is defined in the release pipeline, which is auto-triggered after the build is completed.
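
For example, with the build configuration from Part 1, the published artifact could contain a layout like the one below (the file names are illustrative):

    drop/
        azuredeploy.json
        azuredeploy.parameters.dev.json
        azuredeploy.parameters.test.json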

Depending on the target environment settings, the template will be deployed to multiple Azure environments.

PS: If you are interested in cloud automation, do check out my book on Azure Automation, available on Amazon: https://www.amazon.com/Azure-Automation-Using-Model-Depth/dp/1484232186/ref=tmm_pap_swatch_0?_encoding=UTF8&qid=1513676067&sr=8-2

Sunday, December 17, 2017

Azure ARM Infrastructure as code deployment using VSTS - Part 1

Infrastructure as code, at a high level, is the practice of configuring and managing your infrastructure the same way you would manage your application code. It leverages the concepts of continuous integration and deployment to update or provision your environment based on the changes made to the code. In Azure, you can leverage ARM templates, which are essentially JSON files, to implement this concept.
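
For reference, here is a minimal sketch of what an ARM template looks like. This example declares a single storage account with a hypothetical parameter name; the templates used later in this series deploy VMs, but the overall structure is the same:

    {
      "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
      "contentVersion": "1.0.0.0",
      "parameters": {
        "storageAccountName": {
          "type": "string",
          "metadata": { "description": "Name of the storage account to create" }
        }
      },
      "variables": {},
      "resources": [
        {
          "type": "Microsoft.Storage/storageAccounts",
          "apiVersion": "2016-01-01",
          "name": "[parameters('storageAccountName')]",
          "location": "[resourceGroup().location]",
          "sku": { "name": "Standard_LRS" },
          "kind": "Storage",
          "properties": {}
        }
      ],
      "outputs": {}
    }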

In this blog series, we will explore the concepts of infrastructure as code deployment for Azure environments using ARM templates. The continuous integration and deployment pipeline leverages VSTS for build and deployment, and the source code repository will be Git. The code in this case is the ARM template JSON file and the related parameter files. The following blog gives a nice explanation of how to get started with VSTS and integrate it with a Git repository: https://blog.kloud.com.au/2017/06/24/azure-build-pipeline-using-arm-templates-and-visual-studio-team-services

We will be following an approach for the build and deployment configuration that provides more control over the release process. The build process will produce an artifact whenever a change is made and committed to the ARM template. The release pipeline will pick up this artifact and deploy it to the target environments, which in effect creates or updates your Azure environment.

Let us start with the build configuration.

Build configuration

Create a new project and a build definition associated with it. Choose the "empty process" option to start with your build.


Give the build a name and choose the agent queue as "Hosted".

Add the following tasks in order from Tasks -> Utility: Copy Files, followed by Publish Build Artifacts.

Let's configure these tasks one by one.

Copy files configuration:

In this step, the files from the build directory are copied over to a staging directory. The build directory should contain all the files required for your deployment, for example the ARM template JSON and its dependent files.
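
As an indication, the Copy Files task settings could look like the following, where the contents filter is just an example pattern that picks up the template and parameter files:

    Source Folder : $(Build.SourcesDirectory)
    Contents      : **\*.json
    Target Folder : $(Build.ArtifactStagingDirectory)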

Publishing artifacts configuration:

Here we are publishing the contents of the staging directory as artifacts of the build, which will be used by the release pipeline. The next step is to enable the trigger for continuous integration. Once the trigger is enabled, a build will start as soon as a code commit is made to the repository.
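
A typical configuration for the Publish Build Artifacts task, with a sample artifact name, might be:

    Path to publish : $(Build.ArtifactStagingDirectory)
    Artifact name   : drop
    Artifact type   : Server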

Our build configuration is now complete, and the required contents for deployment, i.e. the template JSON file and the dependent files, are produced as artifacts of the build. If you edit the source files and commit the change, the build will be triggered. On successful completion of the build, you can see the artifacts in VSTS: in the build status page, select Artifacts. You can also start creating the release by clicking on the release option in the top pane.

We will explore the release process to implement continuous deployment in Part 2 of this blog...

PS: If you are interested in cloud automation, please do check out my book on Azure Automation, available on Amazon: https://www.amazon.com/Azure-Automation-Using-Model-Depth/dp/1484232186/ref=tmm_pap_swatch_0?_encoding=UTF8&qid=1513676067&sr=8-2

Monday, October 2, 2017

New services in Microsoft Azure: Data Box and DDoS Protection Standard

We saw a lot of exciting new features for Azure announced at the recently concluded MS Ignite conference. Many of the long-anticipated features and products were released in preview, and in this blog series I will try to explore some of these features and how you can get started with them.

Azure Data Box

This can be considered an extension of the storage Import/Export service already generally available in Azure. With the Import/Export service, customers copy their data to HDDs of a given specification and then ship them to an Azure datacenter, where the data is moved to the storage of their choice. The service targets migration scenarios where customers have to transfer several terabytes of data to Azure storage, and transferring it over the network would take a very long time or is practically impossible. More details about this service can be found here: https://docs.microsoft.com/en-us/azure/storage/common/storage-import-export-service

With Azure Data Box, we move one step ahead: you can order an Azure Data Box to copy and transfer your data to Azure. It is described as a 'secure, tamper-resistant' option for transferring your data to the Azure cloud. It uses a plug-and-play model supporting NAS protocols such as CIFS/SMB, and data is encrypted using 256-bit AES encryption. The service is available in preview, and you can sign up for it from the Azure portal. As shown below, you need to provide details such as the Azure subscription, the amount of data to be transferred, the targeted Azure region and service, and the data transfer time frame and frequency.

Azure DDoS Protection Standard

The Azure platform has DDoS protection built in at the platform level. This protects against platform-level DDoS attacks and is now being named Azure DDoS Protection Basic. It isolates any resource being attacked when a predefined threshold is reached; however, this service is platform managed and controlled. DDoS Protection Standard is a new offering in preview that integrates DDoS protection at the virtual network level. Services that have public IPs associated with them, such as VMs, load balancers, application gateways, and Service Fabric instances, are protected against DDoS attacks.

DDoS Protection Standard provides protection against volumetric attacks and protocol attacks (layer 3 and layer 4). When used with the Application Gateway WAF, it also provides protection against layer 7 attacks like HTTP protocol violations, SQL injection, and cross-site scripting. The service can defend against 60 types of DDoS attacks. DDoS Protection Standard is integrated with Azure Monitor, providing visibility into the relevant metrics, and alerts about attacks happening in your environment can be generated using Azure metrics, OMS integration, etc. The service offers turnkey protection and always-on traffic monitoring. More details about the service are available in this document: https://docs.microsoft.com/en-us/azure/virtual-network/ddos-protection-overview . You can sign up for the preview from this link: http://aka.ms/ddosprotection
