Automating Backup Based on Lightroom Rating

Hi,
I currently have multiple TBs of photos backed up on a NAS. I'm looking to add an off-premises backup too (possibly in the cloud), but I want to save some coin and only back up the RAWs of images I've given a 4-5 star rating in Lightroom.

Does anyone know of an existing free/cheap solution for this, or have any advice on writing a script to identify the images with a high rating? I know I can export all images with a specific rating to a folder and have that folder backed up, but I was hoping to avoid a manual export each time.

Thanks

Comments

  • Free? No. If you value your data, pay for it.

    Either back up all of it:
    https://www.backblaze.com/cloud-backup.html

    Or here is my slightly more complicated solution:

    Export the metadata from Lightroom:
    a. Open Lightroom and apply a filter to show only 4-5 star rated images.
    b. Select all the filtered images.
    c. Use a Lightroom plugin like "Listview" or "Jeffrey's Metadata Viewer" to export the metadata of the selected images as a CSV file.
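
    If the export gives you a CSV with a rating and a file path column, filtering it down to the 4-5 star RAWs is only a few lines of pandas. The column names below ('File Path', 'Rating') are guesses; check what your plugin actually writes out:

```python
import pandas as pd

# Common RAW extensions; extend for your camera if needed
RAW_EXTENSIONS = ('.CR2', '.NEF', '.DNG', '.ARW')

def high_rated_raws(csv_file):
    """Return file paths of 4-5 star RAW images from an exported CSV.

    Assumes 'File Path' and 'Rating' columns; rename these to match
    whatever your metadata plugin actually exports.
    """
    df = pd.read_csv(csv_file)
    df = df[df['Rating'].between(4, 5)]  # 4-5 stars only
    # Case-insensitive extension match, so .nef and .NEF both count
    df = df[df['File Path'].str.upper().str.endswith(RAW_EXTENSIONS)]
    return df['File Path'].tolist()
```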

    Install Python and the required packages:

    pip install pandas borgbackup

    1. Create a Python script called lightroom_borgbackup.py and paste the following code:

    https://pastebin.com/hK8KkuWA

    ```
    import os
    import sys
    import pandas as pd
    from subprocess import call

    RAW_EXTENSIONS = ('.CR2', '.NEF', '.DNG', '.ARW')

    def main():
        if len(sys.argv) != 4:
            print("Usage: python lightroom_borgbackup.py <csv_file> <repo_path> <backup_name>")
            return

        csv_file = sys.argv[1]
        repo_path = sys.argv[2]
        backup_name = sys.argv[3]

        # Read the CSV file exported from Lightroom
        df = pd.read_csv(csv_file)

        # Keep only RAW files (case-insensitive extension match)
        df = df[df['File Name'].str.upper().str.endswith(RAW_EXTENSIONS)]

        # List of files to back up
        files_to_backup = df['File Path'].tolist()

        # Invoke borg; borg create takes the paths to archive as
        # arguments (it has no --files-from option, unlike tar)
        call(['borg', 'create', f'{repo_path}::{backup_name}'] + files_to_backup)


    if __name__ == "__main__":
        main()
    ```

    2. Run the script by providing the CSV file you exported from Lightroom, the path to your Borg repository, and the name you want for this backup:

    python lightroom_borgbackup.py path/to/your_metadata.csv path/to/your_borg_repo my_backup_name

    This script will create a Borg backup containing only the RAW files of images you've given a 4-5 star rating to in Lightroom.
    https://www.rsync.net/products/borg.html
    https://www.borgbackup.org/

    The major issue with your question: I think all the ratings live in some Lightroom DB rather than per file, so reading them is the hardest part.
    I don't have lightroom to test for you.

  • A Python or PowerShell script to recurse the folders, read the metadata, and copy.

    Or maybe there is a sync tool that will do that.
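
    Roughly like this in Python, assuming Lightroom is set to write XMP sidecar files (the rating then lands in the sidecar as xmp:Rating); the sidecar parsing below is a crude regex sketch, not a proper XMP parser:

```python
import os
import re
import shutil

RAW_EXTENSIONS = {'.cr2', '.nef', '.dng', '.arw'}

def sidecar_rating(xmp_path):
    """Pull xmp:Rating out of an XMP sidecar; 0 if absent.

    Matches both the attribute form (xmp:Rating="5") and the
    element form (<xmp:Rating>5</xmp:Rating>).
    """
    with open(xmp_path, encoding='utf-8', errors='ignore') as f:
        match = re.search(r'xmp:Rating[=>"\s]+(\d+)', f.read())
    return int(match.group(1)) if match else 0

def copy_high_rated(src_root, dest_root, min_rating=4):
    """Recurse src_root; copy RAWs whose sidecar rating >= min_rating,
    preserving the folder structure under dest_root."""
    copied = []
    for dirpath, _dirnames, filenames in os.walk(src_root):
        for name in filenames:
            base, ext = os.path.splitext(name)
            if ext.lower() not in RAW_EXTENSIONS:
                continue
            # Assumes the sidecar sits next to the RAW as <base>.xmp
            xmp = os.path.join(dirpath, base + '.xmp')
            if os.path.exists(xmp) and sidecar_rating(xmp) >= min_rating:
                rel = os.path.relpath(dirpath, src_root)
                target = os.path.join(dest_root, rel)
                os.makedirs(target, exist_ok=True)
                shutil.copy2(os.path.join(dirpath, name), target)
                copied.append(os.path.join(rel, name))
    return copied
```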

    What OS are you running?

    • Thanks. Lightroom is running on a Windows box, but the photos and catalogs are on a Synology.

  • Oh, apparently it's SQLite: https://www.lightroomqueen.com/community/tags/sqlite/

    I don't have Lightroom, so I've guessed a few things.
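
    One way to check the guesses against a real catalog: a .lrcat file is plain SQLite, so you can list its tables first and see what's actually in there (run this against a copy, not the live catalog):

```python
import sqlite3

def list_tables(catalog_file):
    """List table names in a Lightroom catalog (.lrcat is just SQLite)."""
    conn = sqlite3.connect(catalog_file)
    try:
        rows = conn.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name"
        ).fetchall()
    finally:
        conn.close()
    return [name for (name,) in rows]
```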

    So try this one:

    pip install pandas borgbackup

    (sqlite3 is part of Python's standard library, so there is nothing to install for it.)

    ```
    import os
    import sys
    import sqlite3
    from subprocess import call

    def get_rated_image_paths(catalog_file):
        """Return absolute paths of 4-5 star RAW files from a Lightroom catalog.

        Table and column names follow the commonly documented .lrcat
        schema; verify them against your own catalog version.
        """
        conn = sqlite3.connect(catalog_file)
        query = '''
            SELECT
                rf.absolutePath || fo.pathFromRoot || fi.idx_filename AS file_path
            FROM Adobe_images img
            JOIN AgLibraryFile fi ON img.rootFile = fi.id_local
            JOIN AgLibraryFolder fo ON fi.folder = fo.id_local
            JOIN AgLibraryRootFolder rf ON fo.rootFolder = rf.id_local
            WHERE img.rating IN (4, 5)
              AND UPPER(fi.extension) IN ('CR2', 'NEF', 'DNG', 'ARW')
        '''
        try:
            rows = conn.execute(query).fetchall()
        finally:
            conn.close()
        return [path for (path,) in rows]


    def main():
        if len(sys.argv) != 4:
            print("Usage: python lightroom_borgbackup.py <lightroom_catalog_path> <repo_path> <backup_name>")
            return

        catalog_path = sys.argv[1]
        repo_path = sys.argv[2]
        backup_name = sys.argv[3]

        catalog_file = os.path.join(catalog_path, 'Lightroom Catalog.lrcat')

        if not os.path.exists(catalog_file):
            print("Lightroom Catalog not found. Please check the path.")
            return

        files_to_backup = get_rated_image_paths(catalog_file)

        # Invoke borg; borg create takes the paths to archive as
        # arguments (it has no --files-from option, unlike tar)
        call(['borg', 'create', f'{repo_path}::{backup_name}'] + files_to_backup)


    if __name__ == "__main__":
        main()
    ```

    Here is the whole post: https://pastebin.com/qsdy8RKS

    • Thanks very much. Will have a look.

    • I presume that if I'm reading from a SQLite database, I can skip the export step and the script can just read the ratings and sync based on those?

      • Yep, I'd take a copy of it and work from that rather than your real Lightroom DB.
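
        Something like this makes the copy-then-read pattern automatic (close Lightroom first, since it locks the catalog while running):

```python
import os
import shutil
import sqlite3
import tempfile

def open_catalog_copy(catalog_file):
    """Copy the .lrcat to a temp location and open the copy, so the
    live catalog is never touched. Returns (connection, copy_path)."""
    tmp = os.path.join(tempfile.mkdtemp(), os.path.basename(catalog_file))
    shutil.copy2(catalog_file, tmp)
    return sqlite3.connect(tmp), tmp
```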
