Reputation: 13
I have a project A which has all the Spark and Hadoop dependencies. Another project B has been created which requires the code of project A as well as its dependencies in order to execute. Project B can be imagined as an extension of project A with some additional functionality, so project A is a dependency both for that reason and for code reusability. The Scala version is the same for both projects.
Basically: Project B dependencies = Project A + Project A's dependencies.
Any help on how to achieve this would be appreciated.
Edit: Project A is published to an Artifactory repository, and project B is supposed to consume it and its dependencies as libraries. The unmanaged-jars approach works fine, since the jars are present on disk along with their dependencies, but pulling project A as a managed library from the Artifactory brings in only the project's own code, not its dependencies.
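For context, this is roughly how project B consumes project A; a minimal sketch assuming sbt, where the organization, artifact name, version, and repository URL are placeholders rather than the real coordinates:

    // build.sbt of project B (hypothetical coordinates)
    resolvers += "company-artifactory" at "https://artifactory.example.com/artifactory/libs-release"

    libraryDependencies += "com.example" %% "project-a" % "1.0.0"

With a managed dependency like this, sbt only pulls in project A's dependencies transitively if they appear with compile scope in the POM that was published alongside project A's jar.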
Upvotes: 1
Views: 141
Reputation: 5696
Make sure you don't have % Provided at the end of your dependencies in project A. That scope specifies that you will provide those dependencies manually at runtime, so they are published with provided scope and are not put on project B's compile classpath.
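To illustrate the difference, a minimal sketch of project A's build.sbt, with a placeholder Spark version:

    // Published to the POM with provided scope: project B will NOT
    // receive this dependency transitively.
    libraryDependencies += "org.apache.spark" %% "spark-core" % "3.3.0" % Provided

    // Published with compile scope: project B resolves it transitively.
    libraryDependencies += "org.apache.spark" %% "spark-core" % "3.3.0"

After removing % Provided, republish project A so the regenerated POM carries the compile-scoped dependencies.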
Upvotes: 1