Amin Shah Gilani

Reputation: 9866

Is there a way to batch archive GitHub repositories based on a search?

From the answer to a related question, I know it's possible to batch clone repositories based on a GitHub search result:

# cheating knowing we currently have 9 pages
for i in {1..9}
do
    curl "https://api.github.com/search/repositories?q=blazor+language:C%23&per_page=100&page=$i" \
     | jq -r '.items[].ssh_url' >> urls.txt
done

cat urls.txt | xargs -P8 -L1 git clone

I also know that the Hub client allows me to make API calls.

hub api [-it] [-X METHOD] [-H HEADER] [--cache TTL] ENDPOINT [-F FIELD|--input FILE]

I guess the last step is, how do I archive a repository with Hub?

Upvotes: 0

Views: 2495

Answers (2)

Austen

Reputation: 326

hub seems to have been superseded by the gh CLI for API usage.

The gh CLI also includes a built-in repo archive command.

Here's an updated script to bulk archive using gh instead:

read -r -d '' TO_ARCHIVE <<EOF
org/repo1
org/repo2
EOF

echo "$TO_ARCHIVE" | xargs -n 1 -I {} gh repo archive {} -y
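If you want to check what will run before archiving anything, the same here-doc/xargs pattern can be dry-run by prefixing the command with echo (a sketch; org/repo1 and org/repo2 are placeholders, not real repositories):

```shell
# dry run: print each gh invocation instead of executing it
read -r -d '' TO_ARCHIVE <<'EOF'
org/repo1
org/repo2
EOF

echo "$TO_ARCHIVE" | xargs -n 1 -I {} echo gh repo archive {} -y
```

Once the printed commands look right, drop the inner echo to actually archive.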

Upvotes: 1

Amin Shah Gilani

Reputation: 9866

You can update a repository using the Update a Repository API call.

I put all my repository endpoints in a TMP variable, one per line, and ran:

echo $TMP | xargs -P8 -L1 hub api -X PATCH -F archived=true

Here is a sample of what the $TMP variable looked like:

echo $TMP
/repos/amingilani/9bot
/repos/amingilani/advent-of-code-2019
/repos/amingilani/alan
/repos/amingilani/annotate_models
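The $TMP list itself can be built the same way as the clone list in the question: pipe an API listing through jq. Here is a hedged sketch that runs the jq filter over an inline sample response instead of the live API (the repository names are taken from the sample above; against the real API you would pipe something like hub api --paginate /user/repos into the same filter):

```shell
# inline sample of a /user/repos response, trimmed to the one field we need
sample='[{"full_name":"amingilani/9bot"},{"full_name":"amingilani/alan"}]'

# turn each full_name into a /repos/... endpoint, one per line
TMP="$(printf '%s' "$sample" | jq -r '.[] | "/repos/" + .full_name')"
echo "$TMP"
```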

Upvotes: 1
