Recently we had a customer request to bring Microsoft Lists data into SAP Datasphere.
SAP Open Connectors is the closest option, but it does not provide a Microsoft Lists capability:
The quickest option was to use Microsoft Power Automate to extract the Microsoft Lists contents and export them to a CSV file on Azure, which could then be imported into Datasphere using Data Flows.
The following is the process we went through to achieve this:
Access the Microsoft Lists location.
Locate the list location on SharePoint, so you know how to navigate to it via Power Automate.
For context, I have created a separate sample list, unrelated to the customer's data, as follows:
Open Microsoft Power Automate.
My environment is at https://make.powerautomate.com/
Create a new Flow from Blank.
In our scenario we used a schedule, which then made it easy for the subsequent Data Flow in Datasphere to also run on a schedule and allow frequent updates.
Once you have a new flow created and the recurrence is set up, search for "get items" and select the SharePoint option.
Once you have selected Get items, use the drop-down on the Site Address to select the address of the list location used above. Then also select your list name:
Next, add a Select action.
This is where Power Automate has some limitations and needs some trial and error around what you have as your list fields. Simple text is easy, but when it comes to lookups and multiple values, some tweaking needs to be done.
This is an initial view of the mapping. Note that you need to add the columns manually, so I hope your list is not too wide. I have just picked the columns I am interested in, including some of the complexities.
Also, the For each loop that Power Automate adds automatically is really slow; one comes in for each multi-select field (Choice in my example). So we change this to an expression:
```
join(xpath(xml(json(concat('{"body":{"value":', item()?['Choice'], '}}'))), '/body/value/Choice/text()'), ', ')
```
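For clarity, this expression wraps the multi-select array in a small JSON envelope, converts it to XML, pulls out the text nodes with XPath, and joins them with a comma. Outside of Power Automate the same flattening is trivial; the sketch below uses a hypothetical item shape (the property names inside the multi-select array depend on your list schema):

```python
# Minimal sketch (not Power Automate code) of what the expression achieves,
# using a hypothetical item where the multi-select column "Choice" comes back
# as an array of objects. Property names will vary with your list schema.
item = {
    "Title": "Sample row",
    "Choice": [{"Value": "Red"}, {"Value": "Blue"}],
}

# Flatten the multi-select values into one comma-separated string,
# which is what the join(xpath(xml(json(...)))) expression produces.
flattened = ", ".join(choice["Value"] for choice in item["Choice"])
print(flattened)  # Red, Blue
```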
Next, we create a CSV table to write to the file store:
The CSV table generation just uses the output from the Select:
Next, we write directly to the Azure blob store. The connection setting is simply how you connect to your Azure Blob store.
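The Azure Blob Storage connector handles the upload inside the flow, but if you want to script or verify the same step outside of Power Automate, a minimal sketch with the azure-storage-blob SDK looks like this. The connection string, container, and blob names below are placeholders for your own environment, not values from the flow:

```python
# Minimal sketch, outside the Power Automate flow: upload a CSV body to the
# same blob container the connector writes to, then read it back to verify.
# The connection string, container, and blob names are hypothetical.
import io
import pandas as pd
from azure.storage.blob import BlobServiceClient

conn_str = "<your-storage-account-connection-string>"
container = "lists-export"          # hypothetical container name
blob_name = "microsoft-lists.csv"   # hypothetical blob name

service = BlobServiceClient.from_connection_string(conn_str)
blob = service.get_blob_client(container=container, blob=blob_name)

# Power Automate's "Create CSV table" output is plain text, so a string upload
# mirrors what the connector writes.
csv_body = 'Title,Choice\nSample row,"Red, Blue"\n'
blob.upload_blob(csv_body, overwrite=True)

# Read the blob back and inspect the columns before building the Data Flow.
downloaded = blob.download_blob().readall().decode("utf-8")
df = pd.read_csv(io.StringIO(downloaded))
print(df.head())
```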
Finally, save the flow and test it, then go to the flow runs page.
If you have access, you can view the file in the Azure Portal:
Now in Datasphere, assuming you have created a Space, use a connection to Azure Data Lake Storage Gen2:
Finally, in the Data Builder, create a new Data Flow, connect to the ADLS storage, simply bring the file into the Data Flow, and then create a target table to be populated.
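If the CSV needs light cleanup on the way in (for example, trimming the comma-separated multi-select values or fixing data types), Data Flows also offer an optional Python script operator that works on a Pandas DataFrame. A minimal sketch, assuming the hypothetical column names from my sample list and the operator's default transform(data) template:

```python
# Minimal sketch of a Data Flow script operator body, assuming the hypothetical
# columns "Title" and "Choice" from the sample list. The operator receives and
# returns a Pandas DataFrame.
def transform(data):
    # Trim stray whitespace from the flattened multi-select values.
    data["Choice"] = data["Choice"].str.strip()
    return data
```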
Then you can deploy it and schedule it as needed, based on the Power Automate schedule.
From here, you can use the Microsoft Lists data in your Datasphere instance and blend it as needed.