user2073048

Reputation: 31

Create a hash for every file on a file system

I am trying to write a program that will create a hash of every file on the file system (Windows/Linux/Unix). The script would then check each hash against a file of known bad hash values, and print a message whenever a hash from the file system matches one in that file. Any ideas on how to go about this?

I'm new to python and curious about the most efficient way to do this.

Upvotes: 3

Views: 3546

Answers (3)

1cedsoda

Reputation: 642

I have written a module that can hash the files inside a directory and its subdirectories. The output is a JSON-formatted tree view.

pip3 install py_essentials

Use the module like this:

from py_essentials import hashing as hs
hashtree = hs.createHashtree("path/to/the/directory/", 'sha1')

For further information about this function read the documentation.

Here is an example of what the output looks like when printed with json.loads() and json.dumps().

{
    "LICENSE": "1e69fdee9a5b6a177b20178014ffd56e0f64c417",
    ".gitignore": "182ba25cd720d7a2be1314927f9fa72604a9fda7",
    "MANIFEST": "b7efe1149685a1e858b4a83242a5eefbe263e00e",
    "README.md": "63b6fda48fd270e620a3defa9f960a1081d15132",
    "setup.py": "81b0b4bcb7e9076986b7135c01dd1969aedfb256",
    "setup.cfg": "7bc6f4c388d1e0409aee1f79c557b6ce3a3e9511",
    "examples": {
        "advancedList-example1.py": "615082e89704bce44b2b8a78b2738c590efcbf83"
    },
    "py_essentials": {
        "fileHandler.py": "03611c13ab37a15d8d04e79711475114fe818a08",
        "hashing.py": "5a7c77ecfa670309fb685a50ba75b1c9b2fbf161",
        "checkup.py": "95743567715213d57f0e5e6cfbb42521c5a2b661",
        "xcptns.py": "e053b7e4923ba270a7971624802efb32fd480ebb",
        "simpleRandom.py": "a1ded44a7f707c2b0530e18a4f144665e9ad759a",
        "__pycache__": {
            "exceptions.cpython-36.pyc": "ec8c20db40523c09ee51616c4e1b250bb2a0825b",
            "advancedList.cpython-36.pyc": "eeed573a2ec8187315f414e61cd5e6e083a5151d",
            "hashing.cpython-36.pyc": "ea966ec926ed9516df7d332e45a6a411f88a3b7d",
            "prettyPrinting.cpython-36.pyc": "c623260362e08ebfebf229681c357923eee85289",
            "xcptns.cpython-36.pyc": "a2538519fd5fd37f67b73272a7077b93861589ca",
            "__init__.cpython-36.pyc": "e25ee2a75c14026a2dc20170b1b3f8fe3f77069e",
            "simpleRandom.cpython-36.pyc": "1968cf15b29472c855679c71dd5b862d5bec1d26",
            "checkup.cpython-36.pyc": "982fb978040d168e888f6a389a259cc06d32d815",
            "fileHandler.cpython-36.pyc": "0995fea9dfeae4740a77284a8eae19d9231c083f"
        },
        "advancedList.py": "41e9275662562da582326757cf3c9eb7bcc031dd",
        "prettyPrinting.py": "b5244c76e5c33203b2d843705e62d1552b241db0",
        "__init__.py": "5736a4b7e59efc4191665edb471a9282d2fd642e",
        "test.py": "088d1b9786d5f4a3bf02f207a95b3b173cb636e2"
    },
    "dist": {
        "py_essentials-1.4.11.tar.gz": "a8a7f93f1fe3aa263369e6b26fbafae6ed7dbd37",
        "py_essentials-1.4.7.tar.gz": "8c4788c388d64bae1c740a21b200a9d95cf014d6",
        "py_essentials-1.4.10.tar.gz": "e632e6093a1ceb602726e3c0848692b0d4205dd9",
        "py_essentials-1.4.5.tar.gz": "a576e66b2f41a629d55ad585993b819bf3960d40",
        "py_essentials-1.4.9.tar.gz": "bc6786fbeb5be254e5c05215779a3c37a427f398",
        "py_essentials-1.4.6.tar.gz": "08f8dffdb57e7ee4aa99b0494f2b5ae529aae85a",
        "py_essentials-1.4.8.tar.gz": "16c4fff88823c7900d3bb7ecfb489916315ca492"
    }
}

After this you can walk recursively through this tree view and compare the hash values to a list of bad hash values.
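The recursive comparison could look something like the sketch below. It assumes the tree is shaped like the example output above: a dict mapping file names to hex digests, with nested dicts for subdirectories. The function name `find_bad_hashes` is just an illustration, not part of py_essentials.

```python
def find_bad_hashes(tree, bad_hashes, path=""):
    """Walk a nested {name: digest-or-subtree} dict and collect paths
    whose digest appears in bad_hashes (a set of hex strings)."""
    matches = []
    for name, value in tree.items():
        full_path = f"{path}/{name}" if path else name
        if isinstance(value, dict):
            # Subdirectory: recurse into the subtree
            matches.extend(find_bad_hashes(value, bad_hashes, full_path))
        elif value in bad_hashes:
            # Leaf: the value is a digest, compare it directly
            matches.append(full_path)
    return matches
```

Called as `find_bad_hashes(hashtree, bad_set)`, it returns the relative paths of all matching files.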

Upvotes: 1

Eli Stevens

Reputation: 1447

Start by making a set of the hashes you want to detect:

badHash_set = set(['1234', 'abcd'])

Then use os.walk on the root of the directory tree that you want to check:

http://docs.python.org/2/library/os.html#os.walk

import os, hashlib

for root, dirs, files in os.walk(base_path):
    for file_str in files:
        # Open in binary mode so the digest is not affected by newline translation
        with open(os.path.join(root, file_str), 'rb') as file_obj:
            file_md5 = hashlib.md5(file_obj.read()).hexdigest()

        if file_md5 in badHash_set:
            # ...complain

This is probably going to be painfully slow, however. It's not clear if that's going to be an issue or not.
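One thing that can make it worse is that `read()` loads each whole file into memory before hashing. A common workaround, assuming you only care about the digest, is to feed the hash object in fixed-size chunks; a minimal sketch (the helper name `md5_of_file` is my own):

```python
import hashlib

def md5_of_file(path, chunk_size=65536):
    """Hash a file in fixed-size chunks so large files never sit
    entirely in memory."""
    h = hashlib.md5()
    with open(path, 'rb') as f:
        # iter() with a sentinel yields chunks until read() returns b''
        for chunk in iter(lambda: f.read(chunk_size), b''):
            h.update(chunk)
    return h.hexdigest()
```

This produces the same digest as hashing the whole file at once, since `update()` is cumulative.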

Upvotes: 0

Hooked

Reputation: 88118

To walk through the files in a filesystem, use os.walk. For each file you can create a hash using the built-in library hashlib.

Minimal working example:

import os, hashlib

current_dir = os.getcwd()
for root, dirs, files in os.walk(current_dir):
    for f in files:
        current_file = os.path.join(root, f)
        H = hashlib.md5()

        # Read in binary mode; hashing needs bytes, not decoded text
        with open(current_file, 'rb') as FIN:
            H.update(FIN.read())

        print(current_file, H.hexdigest())

Upvotes: 4
