
Deploying D365 Finance and Operations Build Machine | F&O | Dynamics

 Hello Everyone,

We have a requirement to deploy a D365 build machine. A build machine is normally created as an additional VM, at an additional cost, to create deployable packages which are then deployed to other environments.

Prerequisites:

1) Create a Team Foundation Version Control (TFVC) repository in Azure DevOps, as this is required while creating build machines; otherwise the deployment will fail with the error below.


To create it, navigate to Repos in DevOps (you need the correct user role to see it).


Open the dropdown and you will see the "+ New repository" option.

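Before starting the LCS deployment, you can confirm the TFVC repository actually exists by calling the Azure DevOps TFVC items endpoint. The sketch below is a minimal, hypothetical helper using only the Python standard library; the organization name, project name, and personal access token are placeholders, not values from this walkthrough.

```python
# Hypothetical pre-check: build a request against the Azure DevOps TFVC
# items endpoint. If the project has no TFVC repository, this call fails,
# which is the same missing prerequisite that breaks the LCS deployment.
import base64
import json
import urllib.request

def tfvc_items_request(organization: str, project: str, pat: str) -> urllib.request.Request:
    """Build an authenticated GET request listing TFVC items for a project."""
    url = (f"https://dev.azure.com/{organization}/{project}"
           "/_apis/tfvc/items?api-version=7.1")
    # Azure DevOps PATs are sent as HTTP Basic auth with an empty username.
    token = base64.b64encode(f":{pat}".encode()).decode()
    return urllib.request.Request(url, headers={"Authorization": f"Basic {token}"})

req = tfvc_items_request("contoso", "FnO-Project", "<personal-access-token>")
print(req.full_url)
# To actually run the check (requires network access and a valid PAT):
# with urllib.request.urlopen(req) as resp:
#     items = json.load(resp)
#     print("TFVC repo found, item count:", items.get("count"))
```

If the request returns items under `$/<project>`, the prerequisite is satisfied; a 404 means the TFVC repo still needs to be created as shown above.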


2) At the LCS level, permission to manage and create VMs should be assigned.

Steps to create Build machine

  • Open LCS, enter the project, and navigate to Cloud-hosted environments.


Then press + to add a VM.


  • Select DevTest



  • Then select Build and test.



  • Based on requirement, select the size of the VM; in my case it's B8ms (Standard).





  • Name your build agent.




  • Set dataset to None.




  • Disable premium storage in case you don't want the additional storage.




  • Assign a name to the virtual network; in my case it was XYZ-vNet.




  • Once all the steps are performed, our build machine will move to the Queuing state and then to Deploying. The deployment can take up to 10 hours.

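When the deployment finishes, the VM's build agent should register itself in an Azure DevOps agent pool, which is a quick way to confirm the machine is really ready. The sketch below is a hypothetical check, again using only the standard library; the organization name, pool id, and token are placeholder assumptions.

```python
# Hypothetical post-deployment check: list the agents in an Azure DevOps
# agent pool. The newly deployed build machine should appear here with
# status "online" once LCS reports the environment as deployed.
import base64
import json
import urllib.request

def agents_request(organization: str, pool_id: int, pat: str) -> urllib.request.Request:
    """Build an authenticated GET request listing agents in an agent pool."""
    url = (f"https://dev.azure.com/{organization}"
           f"/_apis/distributedtask/pools/{pool_id}/agents?api-version=7.1")
    # Azure DevOps PATs are sent as HTTP Basic auth with an empty username.
    token = base64.b64encode(f":{pat}".encode()).decode()
    return urllib.request.Request(url, headers={"Authorization": f"Basic {token}"})

req = agents_request("contoso", 1, "<personal-access-token>")
print(req.full_url)
# To actually run the check (requires network access and a valid PAT):
# with urllib.request.urlopen(req) as resp:
#     for agent in json.load(resp)["agents"]:
#         print(agent["name"], agent["status"])
```

Look for the agent name you assigned in the "Name your build agent" step earlier; once it shows as online, the build pipeline can target it.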



Hope you find this blog helpful.

Shayan Arshi
