Reputation: 1108
I am working on a project where we use the Azure stack for data engineering and analysis. The main component for computation is Azure Databricks, where most of the code is written in Python.
Recently I got a requirement to work on a project where we have to process MF4 (ASAM Measurement Data Format) files. To process the MF4 files, we settled on the asammdf library.
The next phase is to migrate a few formulas written in IDL (Interactive Data Language). These formulas are stored in an Oracle database, and we are able to connect to the database to retrieve them.
The question that arises here is: how do we run these IDL formulas on the data available in Azure Storage? Can we run these formulas from a Python notebook and use the results for analysis?
I have gone through some documentation but have not found a way to implement a solution for this.
Any leads appreciated! Thanks in advance.
Upvotes: 0
Views: 151
Reputation: 8030
First of all, IDL is a licensed product. After obtaining a license, follow the steps below.
Install IDL following the steps provided in this link. Before doing that, you should have downloaded the IDL .tar.gz archive; download it by creating an account and fetching it from the product download page.
Next, upload it to DBFS in Databricks and start the installation.
After extracting, run the command below, since Databricks doesn't support interactive prompt input:
echo -e "y\n/usr/local/harris\ny\nUnix\ny\n" | ./install.sh -s
Your installation directory will be /usr/local/harris.
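The echo -e pipe works because each \n-separated token answers one installer prompt, in order. A minimal stand-in demonstrates the mechanism (fake_install.sh below is a hypothetical substitute for IDL's install.sh, and the answers are shortened to three prompts):

```shell
# Create a stand-in installer that reads three prompts from stdin.
cat > fake_install.sh <<'EOF'
#!/bin/sh
read accept_license
read install_dir
read confirm
echo "installing to $install_dir"
EOF
chmod +x fake_install.sh

# Each \n-separated token answers one prompt, in order.
echo -e "y\n/usr/local/harris\ny\n" | ./fake_install.sh
```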
Next, you need to activate IDL with the activation code you received earlier.
Then, according to this documentation, set up the path and import the bridge:
import sys
sys.path.append('/usr/local/harris/idl/lib/bridges')
from idlpy import *
Here, /usr/local/harris is your installation directory; adjust the path if you installed somewhere else.
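The sys.path.append call is what makes the idlpy import resolvable: it adds the bridge directory to Python's module search path. The same mechanism can be shown with a stand-in module (idlpy_demo and its contents are hypothetical, used only to illustrate the pattern):

```python
import os
import sys
import tempfile

# Create a directory containing a stand-in module, analogous to
# /usr/local/harris/idl/lib/bridges containing idlpy.
bridge_dir = tempfile.mkdtemp()
with open(os.path.join(bridge_dir, "idlpy_demo.py"), "w") as f:
    f.write("def idl_version():\n    return 'demo'\n")

# Appending the directory to sys.path makes the module importable.
sys.path.append(bridge_dir)
import idlpy_demo

print(idlpy_demo.idl_version())
```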
If you want to use the IDL command line from a notebook cell, run:
%sh
idl
Since I do not have an activation code, it prompts me to activate.
Upvotes: 0