
Custom Service Integration: Sending a Response as a List in D365 F&O | Inbound Integration

 Hello Devs,

Custom services are very common when it comes to third-party integration. Here I will explain how to design a custom service and expose it for inbound integration.

Requirement:

We need to send a list of employees who are on leave today. This service will be consumed by a third-party application.

We will start by creating the response class.

1) Response class

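The screenshot here showed the response data contract. As a rough sketch (the class, member, and EDT names below are my own illustrative assumptions, not necessarily the ones from the original screenshot), it would look something like this:

```x++
// Data contract describing one employee-on-leave record.
// Class and member names are illustrative assumptions.
[DataContract]
public class CMTEmployeeLeaveResponse
{
    HcmPersonnelNumberId personnelNumber;
    Name                 employeeName;
    TransDate            fromDate;
    TransDate            toDate;

    [DataMember('PersonnelNumber')]
    public HcmPersonnelNumberId parmPersonnelNumber(HcmPersonnelNumberId _personnelNumber = personnelNumber)
    {
        personnelNumber = _personnelNumber;
        return personnelNumber;
    }

    [DataMember('EmployeeName')]
    public Name parmEmployeeName(Name _employeeName = employeeName)
    {
        employeeName = _employeeName;
        return employeeName;
    }

    [DataMember('FromDate')]
    public TransDate parmFromDate(TransDate _fromDate = fromDate)
    {
        fromDate = _fromDate;
        return fromDate;
    }

    [DataMember('ToDate')]
    public TransDate parmToDate(TransDate _toDate = toDate)
    {
        toDate = _toDate;
        return toDate;
    }
}
```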

2) A helper class to hold the response within a list

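A sketch of the wrapping contract (again with assumed names): the key detail is the DataCollection attribute, which tells the serializer what type the list members are.

```x++
// Contract that wraps the response records in a list.
// Class and member names are illustrative assumptions.
[DataContract]
public class CMTEmployeeLeaveResponseList
{
    List employees;

    [DataMember('Employees'),
        DataCollection(Types::Class, classStr(CMTEmployeeLeaveResponse))]
    public List parmEmployees(List _employees = employees)
    {
        employees = _employees;
        return employees;
    }
}
```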

3) The main service class to hold the logic

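The service class builds the response records, adds them to a list, and returns the list contract. A minimal sketch, assuming a hypothetical leave table (replace it with your own leave/absence data source and field names):

```x++
// Service class exposing one operation that returns the list contract.
public class CMTEmployeeLeaveService
{
    public CMTEmployeeLeaveResponseList getEmployeesOnLeaveToday()
    {
        CMTEmployeeLeaveResponseList responseListContract = new CMTEmployeeLeaveResponseList();
        List                         responseList = new List(Types::Class);
        CMTEmployeeLeaveResponse     response;
        HcmWorker                    hcmWorker;
        CMTEmployeeLeaveTable        leaveTable; // hypothetical table - use your own

        // Find workers whose leave period covers today's date.
        while select leaveTable
            where leaveTable.FromDate <= today()
               && leaveTable.ToDate   >= today()
        join hcmWorker
            where hcmWorker.RecId == leaveTable.Worker
        {
            response = new CMTEmployeeLeaveResponse();
            response.parmPersonnelNumber(hcmWorker.PersonnelNumber);
            response.parmEmployeeName(hcmWorker.name());
            response.parmFromDate(leaveTable.FromDate);
            response.parmToDate(leaveTable.ToDate);

            responseList.addEnd(response);
        }

        responseListContract.parmEmployees(responseList);
        return responseListContract;
    }
}
```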

4) Once all response values are populated and added to the list, simply return the list response.

In my case, that is:

return ResponseList;



5) Create a Service element for your service class


6) After the service, create a service group


7) Assign the service class to the Service element



8) Now assign our service to the service group (you can also drag and drop)


9) Test your service (in any browser)

Structure :

Environment URL/API/Services/YourServiceGroupTitle/YourServiceTitle/YourMethod


In my case it looks like this,

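The screenshot showed my actual URL. With illustrative placeholder names (the environment, group, service, and method names below are assumptions), the URL follows this shape:

```
https://<your-environment>.cloudax.dynamics.com/api/services/CMTIntegrationServiceGroup/CMTEmployeeLeaveService/getEmployeesOnLeaveToday
```

Opening this URL in a browser (while signed in to the environment) confirms the service is reachable and shows its operation metadata.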

Testing your API

Tool: Postman

Pre-requisites:

  • Register your application (refer to my blog: https://www.linkedin.com/feed/update/urn:li:activity:7172833064203628544/)

  • Generate an authorization token

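The screenshot showed the token request in Postman. It is the standard Azure AD client credentials request; everything in angle brackets is a placeholder for your own tenant and app registration values:

```http
POST https://login.microsoftonline.com/<tenant-id>/oauth2/token
Content-Type: application/x-www-form-urlencoded

grant_type=client_credentials
&client_id=<application-client-id>
&client_secret=<client-secret>
&resource=https://<your-environment>.cloudax.dynamics.com
```

The response contains an `access_token` value, which you pass as a Bearer token in the next call.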

  • Use that token to call your API


Again, use the same API structure with a POST call:

URL/API/Services/YourServiceGroupTitle/YourServiceTitle/YourMethod

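As a sketch of the Postman call (service and method names are illustrative placeholders; custom service operations are invoked with POST, and an operation that takes no parameters accepts an empty JSON body):

```http
POST https://<your-environment>.cloudax.dynamics.com/api/services/CMTIntegrationServiceGroup/CMTEmployeeLeaveService/getEmployeesOnLeaveToday
Authorization: Bearer <access-token>
Content-Type: application/json

{}
```

The service responds with the serialized list contract: a JSON object whose list member is an array of employee records.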


Hope you find this blog helpful and informative :)


Shayan Arshi
