Reputation: 551
Is there a way to connect Spark SQL with SQLAlchemy? I have legacy code that uses SQLAlchemy; how could I make it use Spark SQL? Can I use SQLAlchemy as a translation layer to Spark SQL?
Upvotes: 14
Views: 13835
Reputation: 569
While PyHive is a popular option, consider exploring SparkORM. This library, available on PyPI and GitHub, simplifies schema maintenance and table creation in Spark SQL and is easy to extend. It's worth checking out for a straightforward integration with your existing SQLAlchemy-based code.
Upvotes: 1
Reputation: 551
Yes, look at this project: https://github.com/dropbox/PyHive
There are some adjustments you will need to make; fortunately, SQLAlchemy is built for that.
Upvotes: 9
Reputation: 2392
Short answer: no! This would be the same as trying to use PostgreSQL's dialect against Spark SQL. Spark SQL has its own SQL dialect, which follows Hive's style more closely. You should convert your SQLAlchemy code to conform to Spark SQL.
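To illustrate the dialect gap this answer describes (a hedged sketch; the table and column names are invented for the example), a cast that is idiomatic in PostgreSQL must be rewritten in Spark SQL's Hive-style syntax:

```python
# PostgreSQL-style query: "::" cast shorthand, double-quoted identifiers.
pg_sql = 'SELECT "id"::text FROM events'

# Spark SQL equivalent: explicit CAST to STRING, backtick-quoted identifiers.
spark_sql = "SELECT CAST(`id` AS STRING) FROM events"

print(spark_sql)
```

Differences like this are why a generic SQLAlchemy-to-Spark translation layer doesn't exist out of the box: each query that relies on another database's extensions needs a dialect-aware rewrite.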
Upvotes: -6