So I have a table called Investors. Say there are 100 records here. And I have a table called Companies. And I have a table called Individual Investments. Say there are 300 records here.

Installation

Install this tool using pip:

$ pip install airtable-export

Export Airtable data to files on disk

You will need to know the following information:

- Your Airtable base ID - this is a string starting with `app`
- Your Airtable API key - this is a string starting with `key`
- The names of each of the tables that you wish to export

You can export all of your data to a folder called export/ by running the following:

airtable-export export base_id table1 table2 --key=key

This example would create two files: export/table1.yml and export/table2.yml.

Rather than passing the API key using the --key option you can set it as an environment variable called AIRTABLE_KEY.

Export options

By default the tool exports your data as YAML. You can also export as JSON or as newline-delimited JSON using the --json or --ndjson options:

airtable-export export base_id table1 table2 --key=key --ndjson

You can pass multiple format options at once. This will create a .yml, .json and .ndjson file for each exported table:

airtable-export export base_id table1 table2 --key=key --yaml --json --ndjson

You can export tables to a SQLite database file using the --sqlite database.db option:

airtable-export export base_id table1 table2 --key=key --sqlite database.db

This can be combined with other format options. If you only specify --sqlite, the export directory argument will be ignored.

The SQLite database will have a table created for each table you export. Those tables will have a primary key column called airtable_id. If you run this command against an existing SQLite database, records with matching primary keys will be overwritten by new records from the export.

Request options

By default the tool uses python-httpx's default configurations. You can override the user-agent using the --user-agent option:

airtable-export export base_id table1 table2 --key=key --user-agent "Airtable Export Robot"

You can override the timeout during a network read operation using the --http-read-timeout option:

airtable-export export base_id table1 table2 --key=key --http-read-timeout 60

GitHub Actions is GitHub's workflow automation product. You can use it to run airtable-export in order to back up your Airtable data to a GitHub repository. Doing this gives you a visible commit history of the changes you make to your Airtable data - like this one.

To run this for your own Airtable database you'll first need to add the following secrets to your GitHub repository:

- AIRTABLE_BASE_ID - the base ID, a string beginning `app`
- AIRTABLE_KEY - your Airtable API key
- AIRTABLE_TABLES - a space separated list of the Airtable tables that you want to back up. If any of these contain spaces you will need to enclose them in single quotes, e.g. 'My table with spaces in the name' OtherTableWithNoSpaces

Once you have set those secrets, add a workflow as a file called .github/workflows/backup-airtable.yml.
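A sketch of such a workflow follows. It assumes the standard actions/checkout, actions/setup-python and actions/cache actions, a backups/ output directory for the exported files, and a final commit-and-push step; the exact action versions, cache key and commit message are illustrative rather than the definitive file.

```yaml
name: Backup Airtable

on:
  workflow_dispatch:
  schedule:
  - cron: '32 0 * * *'

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
    - name: Check out repo
      uses: actions/checkout@v2
    - name: Set up Python
      uses: actions/setup-python@v2
      with:
        python-version: 3.8
    - name: Configure pip caching
      uses: actions/cache@v2
      with:
        path: ~/.cache/pip
        # Cache key is illustrative; any stable key works
        key: ${{ runner.os }}-pip-${{ hashFiles('**/setup.py') }}
        restore-keys: |
          ${{ runner.os }}-pip-
    - name: Install airtable-export
      run: pip install airtable-export
    - name: Back up Airtable to backups/
      env:
        AIRTABLE_BASE_ID: ${{ secrets.AIRTABLE_BASE_ID }}
        AIRTABLE_KEY: ${{ secrets.AIRTABLE_KEY }}
        AIRTABLE_TABLES: ${{ secrets.AIRTABLE_TABLES }}
      # AIRTABLE_KEY is picked up from the environment, so no --key option is needed
      run: airtable-export backups $AIRTABLE_BASE_ID $AIRTABLE_TABLES
    - name: Commit and push if anything changed
      run: |-
        git config user.name "Automated"
        git config user.email "actions@users.noreply.github.com"
        git add -A
        git commit -m "Latest Airtable data" || exit 0
        git push
```

Because the export step reads the three secrets as environment variables, the command line stays free of credentials and the `git commit ... || exit 0` pattern lets the job succeed even when nothing has changed.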
This will run once a day (at 32 minutes past midnight UTC) and will also run if you manually click the "Run workflow" button - see GitHub Actions: Manual triggers with workflow_dispatch.

To contribute to this tool, first check out the code. Then change into the checkout and create a new virtual environment:

cd airtable-export

Now install the dependencies and tests:

pip install -e '.[test]'
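Spelled out end to end, the development setup looks something like the sketch below. It assumes the standard library venv module and pytest as the test runner, and the repository URL is an assumption rather than something stated above.

```bash
# Check out the code and move into it (repository URL assumed)
git clone https://github.com/simonw/airtable-export
cd airtable-export

# Create and activate a new virtual environment
python -m venv venv
source venv/bin/activate

# Install the dependencies and test dependencies
pip install -e '.[test]'

# Run the tests
pytest
```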