Reputation: 43
Instead of transferring from my local directory, I want to transfer files from a GCP storage bucket to a remote server through SFTP in Java, using JSch or a script.
try {
    JSch jsch = new JSch();
    jsch.addIdentity(privateSftpKey);
    Session session = jsch.getSession(SFTPUSER, SFTPHOST, SFTPPORT);
    java.util.Properties config = new java.util.Properties();
    config.put("StrictHostKeyChecking", "no");
    session.setConfig(config);
    session.connect();
    log.info("Host connected.");
    Channel channel = session.openChannel("sftp");
    channel.connect();
    log.info("sftp channel opened and connected.");
    ChannelSftp channelSftp = (ChannelSftp) channel;
    String sftpDirectory = "/home/share";
    File directory = new File("C:\\Users\\XYZ\\Desktop\\Learning\\Projects\\TransferStorageBucketToRemoteServer");
    File[] fList = directory.listFiles();
    for (File file : fList) {
        if (file.isFile()) {
            String filename = file.getAbsolutePath();
            channelSftp.put(filename, sftpDirectory, ChannelSftp.OVERWRITE);
            System.out.println(filename + " transferred to " + sftpDirectory);
        }
    }
    log.info("Files transferred successfully to host.");
} catch (Exception ex) {
    log.info("Exception found while transferring the files.");
    log.info("Exception Message...: {}", ex.getMessage());
}
When I search on the internet, I only find Google Cloud Storage API code for reading files from a storage bucket:
Credentials credentials = GoogleCredentials
        .fromStream(new FileInputStream(jsonKey));
Storage storage = StorageOptions.newBuilder().setCredentials(credentials)
        .setProjectId("projectId").build().getService();
Blob blob = storage.get("bucket-name", "file.txt");
ReadChannel readChannel = blob.reader();
But I am not sure how to upload it over the SFTP channel directly from the GCS ReadChannel, without first writing it to a local output file and then transferring that file:
channelSftp.put(file_from_readChannel, sftpDirectory, ChannelSftp.OVERWRITE);
I can't find anything about transferring from a storage bucket directly to a remote server. Can anyone assist with a possible way of doing this, in either Java or commands?
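For illustration, this is roughly what I am hoping is possible (an untested sketch, assuming the GCS ReadChannel can be wrapped as an InputStream via java.nio.channels.Channels and passed to JSch's streaming put() overload; it reuses the storage, channelSftp and sftpDirectory variables from the snippets above):

import java.io.InputStream;
import java.nio.channels.Channels;
import com.google.cloud.ReadChannel;
import com.google.cloud.storage.Blob;

Blob blob = storage.get("bucket-name", "file.txt");
try (ReadChannel readChannel = blob.reader();
     InputStream in = Channels.newInputStream(readChannel)) {
    // Stream the blob's bytes straight from the bucket to the remote file,
    // without writing a temporary copy to local disk.
    channelSftp.put(in, sftpDirectory + "/" + blob.getName(), ChannelSftp.OVERWRITE);
}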
Upvotes: 0
Views: 1454
Reputation: 2612
As it is running on Kubernetes, a possible way to do this is to mount the bucket as a drive on the pod and upload the files to the SFTP server as if they were local files.
To mount the bucket as a drive on the pod you can use GCS Fuse; here is an example.
This can be achieved by adding this to the Dockerfile:
# GCSFUSE_REPO must be defined earlier in the Dockerfile (for example
# ENV GCSFUSE_REPO=gcsfuse-buster, matching the base image's release).
RUN apt-get update && apt-get install --yes --no-install-recommends \
        ca-certificates \
        curl \
        gnupg \
    && echo "deb http://packages.cloud.google.com/apt $GCSFUSE_REPO main" \
        | tee /etc/apt/sources.list.d/gcsfuse.list \
    && curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | apt-key add - \
    && apt-get update \
    && apt-get install --yes gcsfuse \
    && apt-get clean && rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/*

RUN mkdir <MOUNTING_POINT>
RUN gcsfuse -o nonempty <BUCKET_NAME> <MOUNTING_POINT>
Once the bucket is mounted, you will be able to upload the files to the SFTP server as if they were local files.
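For example (a sketch assuming the bucket is mounted at a hypothetical /mnt/bucket as the <MOUNTING_POINT>, and reusing the channelSftp and sftpDirectory variables from the question), the original JSch loop can simply read from the mount point:

// The bucket objects appear as regular files under the gcsfuse mount point.
File directory = new File("/mnt/bucket");
File[] fList = directory.listFiles();
if (fList != null) {
    for (File file : fList) {
        if (file.isFile()) {
            // Upload each object from the mounted bucket to the remote SFTP directory.
            channelSftp.put(file.getAbsolutePath(), sftpDirectory, ChannelSftp.OVERWRITE);
        }
    }
}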
Upvotes: 0