Mass-modifying (read: deleting) my graphs from an API?

Hi all,

I’m trying to find a way to mass-delete some old graphs based on some condition (if graph.title contains "phrase" delete graph), preferably through an API. (There are enough of them that I really don’t want to do it by hand.) My preferred client is Python, but I’m not afraid to interact with the REST API directly, either. Can anyone point me towards some documentation on tools that would help?


Thanks for posting on the forum!

This functionality is not in the Python client yet, though it's being discussed here and should be added within the next 30 days:

In the meantime, you can use the raw REST API. Here are the docs:
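For example, a single-file delete against the raw API looks roughly like this. (Sketch only: the endpoint root and the `fid` URL shape here are assumptions on my part; double-check both against the docs before running anything destructive.)

```python
import requests

API_ROOT = "https://api.plot.ly/v2"  # assumed endpoint root; verify in the docs

def file_url(username, file_id):
    # Files are addressed by "fid", i.e. "<username>:<numeric id>"
    return "{}/files/{}:{}".format(API_ROOT, username, file_id)

def delete_file(username, file_id, auth):
    # auth is a (username, api_key) tuple; DELETE moves the file to the trash
    resp = requests.delete(file_url(username, file_id), auth=auth)
    resp.raise_for_status()  # surface 4xx/5xx as an HTTPError
    return resp.status_code
```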


OK, that looks great! I can see how to delete files through the API now. However, is there any way to get a listing of all of my files? (Initial idea: just iterate from 0 up to whatever ID gets generated when I make a new graph and try each one, but that seems very ugly/silly.)

I went ahead and did my ugly implementation (note that this uses a copy of the code in the branch you linked to):

    for loop_file_id in range(first_idx, last_idx):
        try:
            # try hard-deleting anything already deleted
            m_file = rsc.FileResource(fid="shawkinsl:" + str(loop_file_id))
            file_info = m_file.retrieve()
            if "_build_" in str(file_info['filename']):
                print("trashing", loop_file_id, file_info['filename'])
                m_file.trash()  # method names taken from the branch; adjust to match
                print("perma delete", loop_file_id, file_info['filename'])
                m_file.permanent_delete()
            else:
                print("skipping", loop_file_id, file_info['filename'])
        except requests.exceptions.HTTPError as e:
            print(e)

Works reasonably well. New problem, though: I'm getting rate-limited! Is there anything I can do about this besides wait? I really don't want this task to take days to complete…

Ah, right. Sorry about that. There is an issue open to remove rate limiting for file deletion; if you can wait a week, try again then. Otherwise, you can upgrade to a Pro account, where there is no rate limiting on API actions.
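If waiting a week isn't an option, one generic way to cope is to wrap each delete call in a retry-with-exponential-backoff helper, so the script sleeps through rate-limit errors instead of crashing. This is just a sketch; the retry counts and delays are arbitrary, not anything our API requires:

```python
import time

def with_backoff(call, max_tries=5, base_delay=1.0):
    """Run call(), retrying with exponential backoff if it raises.

    Useful when the API answers with a rate-limit error (HTTP 429):
    sleep, then try again, doubling the wait each attempt.
    """
    for attempt in range(max_tries):
        try:
            return call()
        except Exception:
            if attempt == max_tries - 1:
                raise  # out of retries, give up
            time.sleep(base_delay * (2 ** attempt))
```

You would then call something like `with_backoff(lambda: m_file.retrieve())` inside the loop instead of calling `retrieve()` directly.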