
Compile Model in D365 using PowerShell Script

Hello Devs!

Sometimes you need to compile an ISV solution whose model has been locked. In that case you can build (compile) that specific model using a PowerShell script. An alternative approach is to add the locked model as a reference to your existing model and then build it with "Build models with references".


Script:
Note: FlexProperty is the model being compiled in this example; replace it with your own model name and adjust the paths accordingly.

param (
    [switch] $Incremental,
    [switch] $Ref,
    [string] $Metadata         = "C:\AOSService\PackagesLocalDirectory",
    [string] $CompilerMetadata = "C:\AOSService\PackagesLocalDirectory",
    [string] $XRefSqlServer    = "localhost",
    [string] $XRefDbName       = "AxDB",
    [string] $ModelModule      = "FlexProperty",
    [string] $Output           = "C:\AOSService\PackagesLocalDirectory\FlexProperty\bin",
    [string] $XmlLog           = "C:\AOSService\PackagesLocalDirectory\FlexProperty\BuildProjectResult.xml",
    [string] $Log              = "C:\AOSService\PackagesLocalDirectory\FlexProperty\BuildProjectResult.log",
    [string] $AppBase          = "C:\AOSService\PackagesLocalDirectory\bin",
    [string] $RefPath          = "C:\AOSService\PackagesLocalDirectory\FlexProperty\bin",
    [string] $ReferenceFolder  = "C:\AOSService\PackagesLocalDirectory",
    [string] $ApplicationPath  = "C:\AOSService\PackagesLocalDirectory\bin\xppc.exe"
)

# Build the argument list for xppc.exe (the X++ compiler).
$arguments = @(
    "-metadata=`"$Metadata`""
    "-compilermetadata=`"$CompilerMetadata`""
    "-xrefSqlServer=`"$XRefSqlServer`""
    "-xrefDbName=`"$XRefDbName`""
    "-modelModule=`"$ModelModule`""
    "-output=`"$Output`""
    "-xmllog=`"$XmlLog`""
    "-log=`"$Log`""
    "-appBase=`"$AppBase`""
    "-refPath=`"$RefPath`""
    "-referenceFolder=`"$ReferenceFolder`""
)

if ($Incremental) { $arguments += '-incremental' }
if ($Ref)         { $arguments += '-xref' }

# Run the compiler and wait for it to finish in the current console window.
Start-Process -FilePath $ApplicationPath -ArgumentList $arguments -Wait -NoNewWindow
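A quick usage sketch: assuming the script above is saved as CompileModel.ps1 (a hypothetical file name) on a development VM where xppc.exe exists at the default path, it could be invoked like this. The "MyModel" name and its paths are placeholders for your own model.

```powershell
# Full build of the FlexProperty model, updating the cross-reference database
.\CompileModel.ps1 -ModelModule "FlexProperty" -Ref

# Incremental rebuild of a different model (override the paths to match)
.\CompileModel.ps1 -Incremental `
    -ModelModule "MyModel" `
    -Output  "C:\AOSService\PackagesLocalDirectory\MyModel\bin" `
    -XmlLog  "C:\AOSService\PackagesLocalDirectory\MyModel\BuildProjectResult.xml" `
    -Log     "C:\AOSService\PackagesLocalDirectory\MyModel\BuildProjectResult.log" `
    -RefPath "C:\AOSService\PackagesLocalDirectory\MyModel\bin"

# Afterwards, scan the plain-text build log for compile errors
Select-String -Path "C:\AOSService\PackagesLocalDirectory\MyModel\BuildProjectResult.log" -Pattern "Error"
```

Because the parameters all have defaults, running the script with no arguments compiles FlexProperty with the paths shown in the param block.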





