Reputation: 376
Is there a way to import data into Neo4j from Azure Blob Storage?
Upvotes: 0
Views: 756
Reputation: 4563
There are a couple of options:
https://learn.microsoft.com/en-us/azure/storage/common/storage-sas-overview - creates a URL with a shared access signature (SAS) that allows you to access files directly over https. You can then LOAD CSV WITH HEADERS FROM "<url>" AS row CREATE ..., etc. This has the benefit of not requiring any additional software or custom code (see the Cypher sketch after these two options).
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-how-to-mount-container-linux - can be used to mount an Azure storage container to a folder on your Neo4j instance (e.g. /var/lib/neo4j/import/myazurecontainer). This folder can then be used to access files in blob storage as if they were local.
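For example, here is a minimal sketch of both options. The people.csv file name, the name column, the Person label, and the account/container/SAS placeholders are assumptions for illustration, not details from the question:

// Option 1: read directly over https using a SAS URL
// (account, container, file name and SAS token are placeholders)
LOAD CSV WITH HEADERS FROM "https://<account>.blob.core.windows.net/<container>/people.csv?<sas-token>" AS row
CREATE (:Person {name: row.name});

// Option 2: read from the mounted container inside the import directory
// (file:/// URLs are resolved relative to the Neo4j import directory)
LOAD CSV WITH HEADERS FROM "file:///myazurecontainer/people.csv" AS row
CREATE (:Person {name: row.name});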
I'd be hesitant to install an orchestration framework (e.g. GraphAware's Hume Orchestra) or ETL tool if you only want to load some data from Azure Storage.
Upvotes: 0
Reputation: 376
I got it done using the Python azure-storage-blob and py2neo libraries. It worked like a charm.
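A rough sketch of that approach, assuming the v12 azure-storage-blob SDK, a people.csv blob with a name header, and placeholder credentials, names and labels (none of these details are from the answer):

import csv
import io

from azure.storage.blob import BlobServiceClient
from py2neo import Graph

# Download the CSV from blob storage
# (connection string, container and blob name are placeholders)
service = BlobServiceClient.from_connection_string("<connection-string>")
blob = service.get_blob_client(container="mycontainer", blob="people.csv")
text = blob.download_blob().readall().decode("utf-8")

# Parse the CSV into a list of dicts keyed by the header row
rows = list(csv.DictReader(io.StringIO(text)))

# Push the rows into Neo4j in one parameterised query via py2neo
graph = Graph("bolt://localhost:7687", auth=("neo4j", "<password>"))
graph.run("UNWIND $rows AS row CREATE (:Person {name: row.name})", rows=rows)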
Upvotes: 0
Reputation: 20185
I don't think there are any free tools.
On the commercial side, GraphAware Hume Orchestra has Azure Blob Storage connectors.
There is also the possibility to create your own protocol for Neo4j LOAD CSV (e.g. s3, azure, etc.).
I have written an example here: https://github.com/ikwattro/neo4j-load-csv-s3-protocol
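Once such a custom protocol handler is registered with the database, Cypher can address remote objects through the new scheme directly. The bucket, container, file, column and label names below are illustrative only, and the azb:// scheme is hypothetical, not something provided by the linked project:

// With an S3 handler registered (as in the linked example project):
LOAD CSV WITH HEADERS FROM "s3://my-bucket/people.csv" AS row
CREATE (:Person {name: row.name});

// A hypothetical Azure equivalent would follow the same pattern,
// e.g. an "azb://" scheme pointing at <container>/<blob>:
LOAD CSV WITH HEADERS FROM "azb://mycontainer/people.csv" AS row
CREATE (:Person {name: row.name});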
Upvotes: 1