Reputation: 1
Previously I worked with AWS and I am new to Google Cloud. In AWS there was a way to upload directories/folders to a bucket. I have done a bit of research on uploading a directory/folder to a Google Cloud Storage bucket, but couldn't find anything. Can someone help me achieve this in Google Cloud using Java?
Upvotes: 0
Views: 3115
Reputation: 4650
The Google Cloud Storage client library for Java has no built-in functionality to upload folders, but I crafted this Java code to upload folders to GCS. I used OpenJDK 8 and Debian.
App.java
package com.example.app;

import com.google.cloud.storage.BlobId;
import com.google.cloud.storage.BlobInfo;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;

import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

public class App {
    public static void main(String[] args) throws IOException {
        // Directory that you want to upload
        String dir = "/home/user/repo";
        // Use the directory's own name as the top-level "folder" in the bucket
        String folder = new File(dir).getName();
        // Files in the top-level directory (null if dir is not a readable directory)
        File[] files = new File(dir).listFiles();
        if (files == null) {
            throw new IOException("Not a readable directory: " + dir);
        }
        // Define your project ID & bucket name
        String bucket = "myawesomefolder";
        String projectId = "myawesomeprojectID";
        Storage storage = StorageOptions.newBuilder().setProjectId(projectId).build().getService();
        System.out.println("Uploading folder: " + folder);
        uploadFolder(files, folder, bucket, storage);
    }

    static void uploadFolder(File[] files, String folder, String bucket, Storage storage) throws IOException {
        for (File file : files) {
            if (!file.isHidden()) {
                if (file.isDirectory()) {
                    // Recurse into the subdirectory, extending the object-name prefix
                    File[] children = file.listFiles();
                    if (children != null) {
                        uploadFolder(children, folder + "/" + file.getName(), bucket, storage);
                    }
                } else {
                    // Prefix the file name with the directory path to mirror the local structure
                    BlobId blobId = BlobId.of(bucket, folder + "/" + file.getName());
                    // Prepare the object
                    BlobInfo blobInfo = BlobInfo.newBuilder(blobId).build();
                    // Upload the object
                    storage.create(blobInfo, Files.readAllBytes(file.toPath()));
                    System.out.println("Uploaded: gs://" + bucket + "/" + folder + "/" + file.getName());
                }
            }
        }
    }
}
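On Java 8+, the same recursive traversal can also be sketched with java.nio.file.Files.walk, deriving each object name from the file's path relative to the root. This is a hedged sketch, not the answer's original code: the actual storage.create call is left as a comment so the snippet runs without GCS credentials, and the directory tree it builds is a throwaway example.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class WalkUpload {
    // GCS object name for a file: the root directory's own name, plus the
    // file's path relative to the root, always using '/' separators.
    static String objectName(Path root, Path file) {
        return root.getFileName() + "/" + root.relativize(file).toString().replace('\\', '/');
    }

    public static void main(String[] args) throws IOException {
        // Build a small throwaway tree so the walk has something to visit.
        Path root = Files.createTempDirectory("repo");
        Files.createDirectories(root.resolve("src"));
        Files.write(root.resolve("README.md"), "hello".getBytes());
        Files.write(root.resolve("src").resolve("App.java"), "class App {}".getBytes());

        try (Stream<Path> paths = Files.walk(root)) {
            List<String> names = paths.filter(Files::isRegularFile)
                    .map(f -> objectName(root, f))
                    .sorted()
                    .collect(Collectors.toList());
            for (String name : names) {
                // A real upload would go here, e.g. (bucket/storage as in App.java):
                // storage.create(BlobInfo.newBuilder(BlobId.of(bucket, name)).build(),
                //                Files.readAllBytes(file));
                System.out.println(name);
            }
        }
    }
}
```

Unlike the listFiles version, Files.walk never returns null for unreadable entries (it throws instead), and relativize keeps the object names correct at any nesting depth.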
pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>org.example</groupId>
    <artifactId>testGcs</artifactId>
    <version>1.0-SNAPSHOT</version>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <configuration>
                    <!-- Match the OpenJDK 8 runtime mentioned above -->
                    <source>8</source>
                    <target>8</target>
                </configuration>
            </plugin>
        </plugins>
    </build>
    <properties>
        <java.version>1.8</java.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>com.google.cloud</groupId>
            <artifactId>google-cloud-storage</artifactId>
            <version>1.111.2</version>
        </dependency>
        <dependency>
            <groupId>com.google.cloud</groupId>
            <artifactId>google-cloud-nio</artifactId>
            <version>0.121.2</version>
        </dependency>
    </dependencies>
</project>
Upvotes: 1
Reputation: 76093
There are no built-in functions in the Google Cloud Storage client library (or even in the API) to do this automatically. You have to upload all your files recursively, managing the folder-tree traversal yourself.
With the gsutil command-line tool, you can use the command gsutil cp -r ...
. The -r
flag stands for "recursive", and it performs exactly this operation.
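For example (the local path and bucket name below are placeholders, not values from the question):

```shell
# Recursively copy the local directory "repo" and all its subdirectories
# into the bucket, preserving the structure under gs://my-bucket/repo/
gsutil cp -r /home/user/repo gs://my-bucket
```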
Upvotes: 3